
krgtech jobs
Job Role: Kafka Developer
Location: Remote
Experience: 13 years
Job Description:
We are seeking a highly skilled Onsite Kafka Developer with expertise in Confluent Kafka and hands-on experience in implementing Change Data Capture (CDC) solutions, specifically from Facets to Datahub. The ideal candidate will have a strong background in Java 8+, Spring Framework, Microservices, and Kafka-based architectures, along with experience in designing and developing scalable, resilient, and high-performance systems.
As a Kafka Developer, you will play a pivotal role in designing, developing, and maintaining Kafka-based solutions, ensuring seamless data integration and real-time data processing. You will also collaborate with cross-functional teams to troubleshoot issues, optimize performance, and guide the team in adopting best practices.
Key Responsibilities:
• Design and implement Change Data Capture (CDC) solutions using Confluent Kafka, specifically integrating Facets with Datahub (an illustrative consumer sketch follows this list).
• Develop and maintain Microservices using Spring Boot, ensuring REST APIs are secure, scalable, and well-documented.
• Implement authentication and authorization mechanisms for REST APIs using Spring Security (OAuth2/JWT).
• Build and optimize Kafka-based architectures for real-time data streaming and processing.
• Develop reusable microservice libraries, utilities, and archetypes to accelerate development and ensure consistency across the team.
• Collaborate with DevOps teams to deploy and manage applications in Docker and Kubernetes environments.
• Monitor and troubleshoot Kafka clusters and microservices using the ELK/EFK stack (Elasticsearch, Logstash/Fluent Bit, Kibana).
• Work with cloud platforms such as AWS and Azure to deploy and manage Kafka and microservice-based solutions.
• Lead troubleshooting efforts in the microservice landscape, coordinating with team members to resolve issues efficiently.
• Mentor and guide junior developers, ensuring adherence to best practices and technical standards.
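To give candidates a concrete feel for the day-to-day work, below is a minimal, illustrative Spring Boot sketch of a Kafka consumer that relays CDC change events toward a Datahub-bound topic. The topic names, the String payload, and the relay step are assumptions chosen purely for illustration; the actual Facets-to-Datahub integration, schemas, and connector setup are project-specific.

    // Minimal sketch, assuming plain String payloads and illustrative topic names.
    package com.example.cdc;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.stereotype.Component;

    @SpringBootApplication
    public class CdcRelayApplication {
        public static void main(String[] args) {
            SpringApplication.run(CdcRelayApplication.class, args);
        }
    }

    @Component
    class FacetsChangeEventListener {

        private final KafkaTemplate<String, String> kafkaTemplate;

        FacetsChangeEventListener(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }

        // Consume raw CDC change events (e.g. emitted by a Confluent/Debezium
        // source connector) and forward them to a topic read by the Datahub
        // ingestion pipeline. Both topic names below are hypothetical.
        @KafkaListener(topics = "facets.member.changes", groupId = "facets-cdc-relay")
        public void onChangeEvent(String changeEventJson) {
            // Validation and transformation would happen here before forwarding.
            kafkaTemplate.send("datahub.ingest.member", changeEventJson);
        }
    }

Broker addresses, serializers, and consumer settings would be supplied through standard Spring Kafka configuration (for example, application.yml), not hard-coded as above.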
Preferred Qualifications:
• Certification in Confluent Kafka or related technologies.
• Experience with event-driven architectures and real-time data processing (see the Kafka Streams sketch after this list).
• Knowledge of data governance tools like Datahub.
• Familiarity with CI/CD pipelines (Jenkins, GitLab CI, etc.).
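As an illustration of the event-driven, real-time processing referenced above, here is a minimal Kafka Streams sketch. The topic names, the bootstrap server address, and the filtering step are assumptions used only to show the style of work, not a specification of the role's actual pipelines.

    // Minimal Kafka Streams sketch, assuming String key/value records and
    // hypothetical topic names.
    package com.example.streams;

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class ChangeEventStream {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "change-event-stream");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read raw change events, drop empty payloads, and publish the rest
            // to a topic that downstream services subscribe to.
            KStream<String, String> changes = builder.stream("facets.member.changes");
            changes.filter((key, value) -> value != null && !value.isEmpty())
                   .to("member.changes.curated");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }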
To apply for this job, email your details to Ashwin@krgtech.com