Kafka jobs
Location: Chicago
Duration: Long term
This individual will serve as the technical lead for streaming projects and must bring deep,
practical expertise in Confluent Flink as well as the broader Kafka ecosystem, with the
following expectations:
• Expert-level experience architecting and implementing Flink applications on
Confluent Platform, specifically for high-volume, low-latency stream processing.
• Extensive experience architecting, implementing, and administering the Confluent
Cloud Kafka and Flink platform in production environments.
• Advanced proficiency in core Flink concepts including state management
(Keyed/Operator State, RocksDB), Exactly-Once semantics, and configuring
checkpointing and savepoints for fault tolerance.
• Deep knowledge of Event Time processing, Watermarks (Bounded Out-of-Orderness),
and complex Windowing (Tumbling, Sliding, Session) for accurate stream analytics.
• Advanced knowledge of ksqlDB and Kafka Streams for rapid development of real-time
stream processing/analytics alongside Flink.
• Proven proficiency with Kafka Connect connectors (including Change Data
Capture/CDC), from configuration to end-to-end integration in cloud environments.
• Demonstrated experience applying Flink and Kafka in the Retail Industry for use
cases such as real-time inventory management, dynamic pricing, fraud detection,
and personalized customer experience (e.g., clickstream analysis).
• Strong background in platform governance: schema registry, RBAC, audit logging,
retention, and compliance.
• Deep expertise with Terraform and the Confluent Terraform provider; adherence to
Infrastructure-as-Code (IaC) methodology and automation.
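To illustrate the watermark and windowing expectations above, here is a minimal plain-Java sketch (deliberately not the Flink API, and with no Flink dependency) of how bounded out-of-orderness watermarks and tumbling event-time windows interact; the class and method names are illustrative only.

```java
import java.util.HashMap;
import java.util.Map;

// Conceptual sketch: a bounded out-of-orderness watermark tracks the maximum
// event time seen minus an allowed lateness bound; a tumbling window
// [start, start + size) may fire once the watermark passes its end.
class TumblingWindowSketch {
    final long windowSizeMs;
    final long maxOutOfOrdernessMs;
    long maxTimestampSeen = Long.MIN_VALUE;
    // window start -> count of events assigned to that window
    final Map<Long, Long> windowCounts = new HashMap<>();

    TumblingWindowSketch(long windowSizeMs, long maxOutOfOrdernessMs) {
        this.windowSizeMs = windowSizeMs;
        this.maxOutOfOrdernessMs = maxOutOfOrdernessMs;
    }

    // Tumbling windows partition time into fixed, non-overlapping buckets.
    long windowStart(long eventTimeMs) {
        return eventTimeMs - Math.floorMod(eventTimeMs, windowSizeMs);
    }

    // Watermark = highest event time observed minus the lateness bound.
    long currentWatermark() {
        return maxTimestampSeen - maxOutOfOrdernessMs;
    }

    void onEvent(long eventTimeMs) {
        maxTimestampSeen = Math.max(maxTimestampSeen, eventTimeMs);
        windowCounts.merge(windowStart(eventTimeMs), 1L, Long::sum);
    }

    // A window closes once the watermark reaches its end boundary.
    boolean windowClosed(long windowStartMs) {
        return currentWatermark() >= windowStartMs + windowSizeMs;
    }

    public static void main(String[] args) {
        // 10s tumbling windows, 2s of allowed out-of-orderness.
        TumblingWindowSketch s = new TumblingWindowSketch(10_000, 2_000);
        s.onEvent(3_000);   // falls in window [0, 10_000)
        s.onEvent(11_500);  // falls in window [10_000, 20_000)
        System.out.println("watermark=" + s.currentWatermark()
                + " windowClosed(0)=" + s.windowClosed(0));
        s.onEvent(12_500);  // advances watermark past 10_000
        System.out.println("windowClosed(0)=" + s.windowClosed(0));
    }
}
```

In a real Flink job the same ideas appear as `WatermarkStrategy.forBoundedOutOfOrderness` and `TumblingEventTimeWindows`, with window state kept in managed (e.g. RocksDB-backed) keyed state rather than a local map.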
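As one illustration of the CDC bullet, a Kafka Connect source connector for change data capture is typically registered with a JSON configuration like the hypothetical Debezium Postgres example below; the hostname, database, table, and credential values are placeholders, not part of this posting.

```json
{
  "name": "inventory-cdc",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "db.example.com",
    "database.port": "5432",
    "database.user": "cdc_user",
    "database.password": "<secret>",
    "database.dbname": "retail",
    "topic.prefix": "retail",
    "table.include.list": "public.inventory",
    "plugin.name": "pgoutput"
  }
}
```

Each captured table becomes a Kafka topic of change events, which downstream Flink or ksqlDB jobs can consume for use cases such as real-time inventory management.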
To apply for this job email your details to ananda@aditi-llc.com