
Make your Apache Kafka operations and development faster by typing less.

A second command-line tool has been added to ktools. The kafka-console-consumer-filter command filters JSON, JSON Schema, and Avro serialized messages with JSONPath expressions. In addition to filtering messages, JSONPath expressions can also be used to highlight elements within the JSON.
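The snippet below is not the tool itself; it is a minimal sketch of the underlying idea, filtering consumed JSON message values with a JSONPath expression, using a plain KafkaConsumer and the Jayway json-path library. The broker address, topic name, JSONPath expression, and matched value are placeholders.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import com.jayway.jsonpath.JsonPath;
import com.jayway.jsonpath.PathNotFoundException;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class JsonPathFilterConsumer {

  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");               // placeholder broker
    props.put("group.id", "jsonpath-filter-demo");
    props.put("key.deserializer", StringDeserializer.class.getName());
    props.put("value.deserializer", StringDeserializer.class.getName());

    String path = "$.customer.state";                               // placeholder JSONPath

    try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
      consumer.subscribe(List.of("orders"));                        // placeholder topic
      while (true) {
        for (ConsumerRecord<String, String> record : consumer.poll(Duration.ofSeconds(1))) {
          try {
            // Evaluate the JSONPath against the message value and keep only matches.
            String state = JsonPath.read(record.value(), path);
            if ("MN".equals(state)) {
              System.out.println(record.value());
            }
          } catch (PathNotFoundException e) {
            // Path not present in this message; filter it out.
          }
        }
      }
    }
  }
}
```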

Truncating an Apache Kafka topic is not an easy task. Typically, I have done this by deleting and recreating the topic, which isn’t always easy or possible. While truncating a topic is something that should never be done in production, it is quite useful in development. With this in mind, the kafka-topic-truncate CLI was created.
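For context, here is a sketch of one way to truncate a topic in place with the Kafka AdminClient: look up each partition's end offset and delete every record before it. This is only an illustration of the approach, not the kafka-topic-truncate implementation; the broker address and default topic name are placeholders.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.ListOffsetsResult.ListOffsetsResultInfo;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.admin.RecordsToDelete;
import org.apache.kafka.clients.admin.TopicDescription;
import org.apache.kafka.common.TopicPartition;

public class TruncateTopic {

  public static void main(String[] args) throws Exception {
    String topic = args.length > 0 ? args[0] : "my-topic";   // placeholder topic

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");        // placeholder broker

    try (Admin admin = Admin.create(props)) {
      // Discover every partition of the topic.
      TopicDescription description =
          admin.describeTopics(List.of(topic)).all().get().get(topic);

      Map<TopicPartition, OffsetSpec> latest = description.partitions().stream()
          .collect(Collectors.toMap(
              info -> new TopicPartition(topic, info.partition()),
              info -> OffsetSpec.latest()));

      // Find each partition's current end offset...
      Map<TopicPartition, ListOffsetsResultInfo> endOffsets =
          admin.listOffsets(latest).all().get();

      // ...and delete every record before it, leaving the topic in place but empty.
      Map<TopicPartition, RecordsToDelete> toDelete = new HashMap<>();
      endOffsets.forEach((partition, info) ->
          toDelete.put(partition, RecordsToDelete.beforeOffset(info.offset())));

      admin.deleteRecords(toDelete).all().get();
    }
  }
}
```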

Join Kris Jenkins and Neil Buesing as they discuss Apache Kafka on Kris’s podcast, Developer Voices.

Many applications use environment variables for configuration, especially when they are deployed within a container. With just a little bit of code, you can bring the same behavior to your Java Kafka clients.
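A minimal sketch of the idea, assuming a KAFKA_ prefix convention: an environment variable such as KAFKA_BOOTSTRAP_SERVERS maps to the corresponding client property (bootstrap.servers). The prefix and mapping rule here are assumptions for illustration, not necessarily the exact convention described in the post.

```java
import java.util.Map;
import java.util.Properties;

public final class EnvKafkaConfig {

  /**
   * Builds Kafka client properties from environment variables, e.g.
   * KAFKA_BOOTSTRAP_SERVERS=broker:9092 becomes bootstrap.servers=broker:9092.
   */
  public static Properties fromEnv() {
    Properties props = new Properties();
    for (Map.Entry<String, String> entry : System.getenv().entrySet()) {
      String key = entry.getKey();
      if (key.startsWith("KAFKA_")) {
        // KAFKA_SECURITY_PROTOCOL -> security.protocol
        String property = key.substring("KAFKA_".length())
            .toLowerCase()
            .replace('_', '.');
        props.put(property, entry.getValue());
      }
    }
    return props;
  }
}
```

The resulting Properties can be handed directly to a KafkaProducer or KafkaConsumer constructor.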

If you are an Apache Kafka developer looking to write stream-processing applications in Flink, the initial setup isn’t so obvious. Apache Flink has its own opinions on consuming from and producing to Kafka, as well as on integrating with Confluent’s Schema Registry. Here are the steps, and a working example, to get an Apache Kafka and Apache Flink streaming platform up in no time.
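As a rough sketch of what that setup looks like, the snippet below builds a Flink KafkaSource that reads Avro records from Kafka through Confluent’s Schema Registry (using the flink-connector-kafka and flink-avro-confluent-registry artifacts). The topic name, reader schema, broker address, and Schema Registry URL are placeholders, and the exact API depends on your Flink version.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.formats.avro.registry.confluent.ConfluentRegistryAvroDeserializationSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KafkaToFlink {

  // Placeholder reader schema for the example topic.
  private static final String ORDER_SCHEMA =
      "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
          + "{\"name\":\"id\",\"type\":\"string\"},"
          + "{\"name\":\"amount\",\"type\":\"double\"}]}";

  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

    KafkaSource<GenericRecord> source = KafkaSource.<GenericRecord>builder()
        .setBootstrapServers("localhost:9092")                // placeholder broker
        .setTopics("orders")                                  // placeholder topic
        .setGroupId("flink-orders")
        .setStartingOffsets(OffsetsInitializer.earliest())
        .setValueOnlyDeserializer(
            ConfluentRegistryAvroDeserializationSchema.forGeneric(
                new Schema.Parser().parse(ORDER_SCHEMA),
                "http://localhost:8081"))                     // placeholder Schema Registry URL
        .build();

    // Print each decoded GenericRecord; a real job would apply its stream processing here.
    env.fromSource(source, WatermarkStrategy.noWatermarks(), "orders-source")
        .print();

    env.execute("kafka-to-flink-example");
  }
}
```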