Triggering a notification usually relies on a single explicit request from the client application. But how do you handle a data stream that requires filtering, where notifications are sent only once specific conditions are met?
At ING, we use Flink SQL to convert heavy Java logic into streamlined queries using SQL-like syntax. This allows us to perform complex data transformations.
During the live-coding session, you will see how easy it is to filter Kafka events with the help of Apache Flink. We will transition from a Java-based application to SQL-like queries to demonstrate the savings in both development time and performance. Using Flink in our demo, we will consume events from an input topic, filter them based on specific rules, and publish the resulting events to an output Kafka topic.
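A minimal sketch of what such a pipeline can look like in Flink SQL, assuming the Kafka connector is on the classpath. The topic names, schema, broker address, and the filtering rule below are illustrative placeholders, not ING's actual logic:

```sql
-- Source table backed by the input Kafka topic (names and broker are assumptions)
CREATE TABLE input_events (
  event_id STRING,
  account_id STRING,
  amount DECIMAL(10, 2),
  event_time TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'input-events',
  'properties.bootstrap.servers' = 'localhost:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json'
);

-- Sink table backed by the output Kafka topic
CREATE TABLE filtered_events (
  event_id STRING,
  account_id STRING,
  amount DECIMAL(10, 2),
  event_time TIMESTAMP(3)
) WITH (
  'connector' = 'kafka',
  'topic' = 'filtered-events',
  'properties.bootstrap.servers' = 'localhost:9092',
  'format' = 'json'
);

-- The filtering rule: only events matching the condition reach the output topic
INSERT INTO filtered_events
SELECT event_id, account_id, amount, event_time
FROM input_events
WHERE amount > 1000;
```

The equivalent Java DataStream application would need serializers, a filter operator, and sink wiring written by hand; here the entire pipeline is three statements that Flink translates into a continuously running streaming job.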