A second command-line tool has been added to ktools. The kafka-console-consumer-filter command
allows filtering of JSON-, JSON Schema-, and Avro-serialized messages with JSON Path expressions.
In addition to filtering messages, JSON Path expressions
can also be used to highlight elements within the JSON.
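As a sketch of the idea (the message, field names, and expression below are hypothetical illustrations, not the tool's actual syntax): given a message value of

```json
{"orderId": 42, "customer": {"id": "c-17", "tier": "gold"}}
```

a JSON Path filter along the lines of `$[?(@.customer.tier == 'gold')]` would keep only gold-tier orders, while a path such as `$.customer.tier` could be used to highlight that element in the rendered JSON.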
Insights
Truncating an Apache Kafka topic is not an easy task. Typically, I have done it by deleting and recreating the topic,
which isn’t always practical or even possible. While truncating a topic is something that should never be done in production,
it is quite useful in development. With this in mind, the kafka-topic-truncate
CLI was created.
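For context, stock Apache Kafka does expose a way to drop records without deleting the topic: the delete-records API (KIP-107). Whether or not kafka-topic-truncate builds on it, a similar effect can be approximated with the kafka-delete-records.sh tool that ships with Kafka, where an offset of -1 requests deletion up to the high watermark. The topic name, partition list, and bootstrap address below are placeholders; one entry is needed per partition.

```shell
# offsets.json (placeholder topic/partitions):
# {"version": 1,
#  "partitions": [{"topic": "my-topic", "partition": 0, "offset": -1}]}

kafka-delete-records.sh \
  --bootstrap-server localhost:9092 \
  --offset-json-file offsets.json
```

Note that this removes records up to the requested offset but does not reset the topic's offsets to zero, which is one reason a dedicated tool can be handy.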
Join Kris Jenkins and Neil Buesing as they discuss Apache Kafka on Kris’ podcast, Developer Voices.
Many applications use environment variables for configuration, especially when they are deployed within a container. With just a little bit of code, you can get the same behavior for your Java Kafka clients.
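A minimal sketch of the idea (the KAFKA_ prefix and mapping convention are assumptions for illustration, not necessarily the ones in the post): translate prefixed environment variables into Kafka client properties by stripping the prefix, lower-casing, and replacing underscores with dots.

```java
import java.util.Map;
import java.util.Properties;

/**
 * Sketch: derive Kafka client configuration from environment variables.
 * Assumed convention (not a Kafka standard): variables prefixed with
 * KAFKA_ map to client properties, e.g.
 *   KAFKA_BOOTSTRAP_SERVERS -> bootstrap.servers
 */
public class EnvConfig {

    static final String PREFIX = "KAFKA_";

    /** Translate matching environment entries into client Properties. */
    public static Properties kafkaPropsFromEnv(Map<String, String> env) {
        Properties props = new Properties();
        env.forEach((key, value) -> {
            if (key.startsWith(PREFIX)) {
                String property = key.substring(PREFIX.length())
                        .toLowerCase()
                        .replace('_', '.');
                props.put(property, value);
            }
        });
        return props;
    }

    public static void main(String[] args) {
        // In a real client you would pass System.getenv() here and hand
        // the resulting Properties to a KafkaProducer or KafkaConsumer.
        Properties props = kafkaPropsFromEnv(System.getenv());
        props.forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

With this in place, `KAFKA_BOOTSTRAP_SERVERS=broker:9092` in a container spec configures the client without any code or config-file changes.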
If you are an Apache Kafka developer looking to write stream-processing applications in Flink, the initial setup isn’t obvious. Apache Flink has its own opinions on consuming from and producing to Kafka, as well as on integrating with Confluent’s Schema Registry. Here are the steps, along with a working example, to get an Apache Kafka and Apache Flink streaming platform up in no time.
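The post has its own full walkthrough; as a rough sketch of what Flink's opinionated Kafka integration looks like in recent releases (topic, group id, and broker address below are placeholders, and the flink-streaming-java and flink-connector-kafka dependencies are assumed), reading a topic goes through the KafkaSource builder rather than a plain consumer:

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KafkaToFlink {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Flink's entry point for Kafka: a KafkaSource, not a KafkaConsumer.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("localhost:9092")   // placeholder address
                .setTopics("input-topic")                // placeholder topic
                .setGroupId("flink-example")             // placeholder group id
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> stream =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "kafka-source");
        stream.print();

        env.execute("kafka-to-flink");
    }
}
```

Avro messages with Confluent's Schema Registry swap the deserializer for one from the flink-avro-confluent-registry module, which is where most of the non-obvious setup lives.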
Not all Kafka integration tools are the same. Some integration systems only produce JSON data without a schema, while the JDBC Sink Connector requires one. Here are the steps, showcasing a low-code option for pushing events into a relational database when the source data is schema-less JSON.
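One widely used low-code approach (named here as an assumption, not necessarily the one in the post) is ksqlDB: declare a schema over the schema-less JSON topic, then re-serialize it as Avro so the JDBC Sink Connector receives the schema it requires. Stream, topic, and column names below are hypothetical.

```sql
-- Declare a schema over the schema-less JSON topic.
CREATE STREAM orders_json (id VARCHAR, amount DOUBLE)
  WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');

-- Re-serialize as Avro; the schema is registered in Schema Registry,
-- which is what the JDBC Sink Connector needs.
CREATE STREAM orders_avro
  WITH (KAFKA_TOPIC='orders_avro', VALUE_FORMAT='AVRO') AS
  SELECT * FROM orders_json;
```

The connector is then pointed at the Avro topic instead of the original JSON one.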