𝗖𝗼𝗿𝗲 𝗦𝗸𝗶𝗹𝗹𝘀 𝗪𝗲’𝗿𝗲 𝗟𝗼𝗼𝗸𝗶𝗻𝗴 𝗙𝗼𝗿 (𝗠𝘂𝘀𝘁-𝗛𝗮𝘃𝗲)
✔️ 3+ years with 𝗔𝗽𝗮𝗰𝗵𝗲 𝗞𝗮𝗳𝗸𝗮 / 𝗖𝗼𝗻𝗳𝗹𝘂𝗲𝗻𝘁 𝗞𝗮𝗳𝗸𝗮 𝗶𝗻 𝗗𝗲𝘃𝗲𝗹𝗼𝗽𝗺𝗲𝗻𝘁, 𝗨𝗔𝗧, 𝗮𝗻𝗱 𝗣𝗿𝗼𝗱𝘂𝗰𝘁𝗶𝗼𝗻 𝗦𝘂𝗽𝗽𝗼𝗿𝘁
✔️ Proven expertise in 𝗞𝗮𝗳𝗸𝗮 𝗮𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗲 including brokers, topics, partitions, offsets, and cluster design
✔️ Hands-on experience with 𝗖𝗼𝗻𝗳𝗹𝘂𝗲𝗻𝘁 𝗖𝗼𝗻𝘁𝗿𝗼𝗹 𝗖𝗲𝗻𝘁𝗲𝗿 for monitoring, managing, and optimizing Kafka clusters
✔️ Strong development experience building 𝗞𝗮𝗳𝗸𝗮 𝗣𝗿𝗼𝗱𝘂𝗰𝗲𝗿𝘀, 𝗖𝗼𝗻𝘀𝘂𝗺𝗲𝗿𝘀, 𝗮𝗻𝗱 𝗦𝘁𝗿𝗲𝗮𝗺 𝗣𝗿𝗼𝗰𝗲𝘀𝘀𝗶𝗻𝗴 𝗮𝗽𝗽𝗹𝗶𝗰𝗮𝘁𝗶𝗼𝗻𝘀
✔️ Expertise in 𝗞𝗮𝗳𝗸𝗮 𝗦𝘁𝗿𝗲𝗮𝗺𝘀 for real-time, fault-tolerant stream processing
✔️ Hands-on experience using 𝗸𝘀𝗾𝗹𝗗𝗕 for real-time analytics, transformations, and stream processing
✔️ Strong understanding and implementation experience with 𝗞𝗮𝗳𝗸𝗮 𝗖𝗼𝗻𝗻𝗲𝗰𝘁 for integrating external data sources and sinks
✔️ Ability to design and deliver 𝘀𝗰𝗮𝗹𝗮𝗯𝗹𝗲, 𝘀𝗲𝗰𝘂𝗿𝗲, 𝗮𝗻𝗱 𝗵𝗶𝗴𝗵-𝗽𝗲𝗿𝗳𝗼𝗿𝗺𝗮𝗻𝗰𝗲 𝗲𝘃𝗲𝗻𝘁-𝗱𝗿𝗶𝘃𝗲𝗻 𝘀𝗼𝗹𝘂𝘁𝗶𝗼𝗻𝘀
✔️ Experience executing and supporting 𝗨𝗔𝗧 𝗮𝗰𝘁𝗶𝘃𝗶𝘁𝗶𝗲𝘀, including test validation, defect resolution, and deployment readiness
✔️ Strong production support experience including 𝗺𝗼𝗻𝗶𝘁𝗼𝗿𝗶𝗻𝗴, 𝘁𝗿𝗼𝘂𝗯𝗹𝗲𝘀𝗵𝗼𝗼𝘁𝗶𝗻𝗴, 𝗽𝗲𝗿𝗳𝗼𝗿𝗺𝗮𝗻𝗰𝗲 𝘁𝘂𝗻𝗶𝗻𝗴, 𝗮𝗻𝗱 𝗶𝘀𝘀𝘂𝗲 𝗿𝗲𝘀𝗼𝗹𝘂𝘁𝗶𝗼𝗻
✔️ Experience handling 𝗞𝗮𝗳𝗸𝗮 𝗹𝗮𝘁𝗲𝗻𝗰𝘆, 𝘁𝗵𝗿𝗼𝘂𝗴𝗵𝗽𝘂𝘁, 𝗱𝗮𝘁𝗮 𝗶𝗻𝘁𝗲𝗴𝗿𝗶𝘁𝘆, 𝗮𝗻𝗱 𝗮𝘃𝗮𝗶𝗹𝗮𝗯𝗶𝗹𝗶𝘁𝘆 𝗶𝘀𝘀𝘂𝗲𝘀
✔️ Hands-on exposure to 𝗰𝗹𝘂𝘀𝘁𝗲𝗿 𝘂𝗽𝗴𝗿𝗮𝗱𝗲𝘀, 𝗽𝗮𝘁𝗰𝗵𝗶𝗻𝗴, 𝗯𝗮𝗰𝗸𝘂𝗽𝘀, 𝗮𝗻𝗱 𝗿𝗼𝘂𝘁𝗶𝗻𝗲 𝗞𝗮𝗳𝗸𝗮 𝗺𝗮𝗶𝗻𝘁𝗲𝗻𝗮𝗻𝗰𝗲
✔️ Proficiency in programming languages such as 𝗝𝗮𝘃𝗮, 𝗣𝘆𝘁𝗵𝗼𝗻, 𝗼𝗿 𝗦𝗰𝗮𝗹𝗮
✔️ Experience with 𝗱𝗶𝘀𝘁𝗿𝗶𝗯𝘂𝘁𝗲𝗱 𝘀𝘆𝘀𝘁𝗲𝗺𝘀, 𝗺𝗶𝗰𝗿𝗼𝘀𝗲𝗿𝘃𝗶𝗰𝗲𝘀, 𝗮𝗻𝗱 𝗲𝘃𝗲𝗻𝘁-𝗱𝗿𝗶𝘃𝗲𝗻 𝗮𝗿𝗰𝗵𝗶𝘁𝗲𝗰𝘁𝘂𝗿𝗲𝘀
✔️ Experience building 𝗖𝗜/𝗖𝗗 𝗽𝗶𝗽𝗲𝗹𝗶𝗻𝗲𝘀 (Jenkins, GitLab CI, etc.)
✔️ Exposure to 𝗰𝗹𝗼𝘂𝗱 𝗽𝗹𝗮𝘁𝗳𝗼𝗿𝗺𝘀 (AWS / Azure) and 𝗰𝗼𝗻𝘁𝗮𝗶𝗻𝗲𝗿𝘀 (Docker / Kubernetes)
✔️ Understanding of 𝘀𝗲𝗰𝘂𝗿𝗶𝘁𝘆, 𝗻𝗲𝘁𝘄𝗼𝗿𝗸𝗶𝗻𝗴, 𝗮𝗻𝗱 𝗰𝗼𝗺𝗽𝗹𝗶𝗮𝗻𝗰𝗲 in Kafka environments
✔️ Ability to collaborate with 𝗱𝗮𝘁𝗮 𝗲𝗻𝗴𝗶𝗻𝗲𝗲𝗿𝘀, 𝗱𝗲𝘃𝗲𝗹𝗼𝗽𝗲𝗿𝘀, 𝗤𝗔, 𝗮𝗻𝗱 𝗗𝗲𝘃𝗢𝗽𝘀 𝘁𝗲𝗮𝗺𝘀
✔️ Willingness to participate in 𝗽𝗿𝗼𝗱𝘂𝗰𝘁𝗶𝗼𝗻 𝗼𝗻-𝗰𝗮𝗹𝗹 𝗿𝗼𝘁𝗮𝘁𝗶𝗼𝗻𝘀 and critical incident management
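For candidates gauging the architecture bullets above (brokers, topics, partitions, offsets, keyed producers, consumer positions), here is a deliberately simplified in-memory sketch of those concepts — for intuition only, not the real Kafka client API. The class and method names are invented for illustration, and the toy partitioner is a stand-in (real Kafka hashes keys with murmur2):

```python
from collections import defaultdict

class ToyTopic:
    """Simplified model of a Kafka topic: one append-only log per partition."""
    def __init__(self, num_partitions=3):
        self.partitions = [[] for _ in range(num_partitions)]

    def partition_for(self, key: bytes) -> int:
        # Stand-in hash; real Kafka uses murmur2 on the key bytes.
        return sum(key) % len(self.partitions)

    def produce(self, key: bytes, value: str):
        """Append a record; return (partition, offset), like RecordMetadata."""
        p = self.partition_for(key)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1

class ToyConsumer:
    """One consumer tracking its position (next offset) per partition."""
    def __init__(self, topic: ToyTopic):
        self.topic = topic
        self.offsets = defaultdict(int)  # partition -> next offset to read

    def poll(self, partition: int, max_records: int = 10):
        start = self.offsets[partition]
        records = self.topic.partitions[partition][start:start + max_records]
        self.offsets[partition] += len(records)  # advance position
        return records

topic = ToyTopic(num_partitions=3)
# Same key -> same partition, so per-key ordering is preserved.
p1, o1 = topic.produce(b"order-42", "created")
p2, o2 = topic.produce(b"order-42", "paid")

consumer = ToyConsumer(topic)
batch = consumer.poll(p1)
```

Both records share the key `order-42`, so they land in the same partition at increasing offsets — the mechanism behind Kafka's per-key ordering guarantee — and a second `poll` resumes from the advanced offset rather than re-reading.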