Job Location : Bangalore/Pune/Mumbai/Hyderabad/Chennai/Coimbatore
Mode : Full Time
Duration : 6 months (extendable to 1 year)
Availability : Immediate
Key Responsibilities:
1. Design, develop, and implement real-time streaming applications using Kafka Streams API.
2. Integrate Kafka Streams applications with upstream and downstream systems (databases, APIs, microservices, etc.).
3. Build, test, and deploy scalable, fault-tolerant streaming pipelines.
4. Optimize Kafka Streams applications for low latency, high throughput, and resilience.
5. Collaborate with data engineering teams to ensure data quality, schema consistency, and governance.
6. Troubleshoot production issues, monitor Kafka clusters, and ensure high availability.
7. Work with microservices architecture to integrate Kafka Streams with REST APIs or event-driven systems.
8. Implement best practices for logging, monitoring, and alerting of streaming jobs.
9. Stay updated on the latest Kafka ecosystem tools (e.g., ksqlDB, Kafka Connect, Schema Registry, Confluent Platform).
Required Skills & Qualifications:
1. Bachelor’s degree in Computer Science, Engineering, or a related field.
2. 5–7 years of experience in backend or data engineering roles, with at least 4 years in Kafka/Kafka Streams.
3. Strong hands-on experience with Apache Kafka (brokers, topics, partitions, producers/consumers, offsets, etc.).
4. Proficiency in Java/Scala (Kafka Streams API) or Python (Faust/streaming frameworks).
5. Solid understanding of event-driven architecture and real-time data processing.
6. Experience with distributed systems, fault tolerance, and scalability.
7. Familiarity with schema management tools (e.g., Avro, Protobuf, Confluent Schema Registry).
8. Experience with Docker, Kubernetes, and CI/CD pipelines is a plus.
9. Knowledge of ksqlDB, Kafka Connect, Flink, or Spark Streaming is a strong advantage.