Bachelor’s degree in Computer Science, Computer Engineering, IT, or a related field
§ 12+ years of overall experience, with at least 5 years leading teams delivering data lakes, data warehouses, and data lakehouses
Minimum Experience
§ Data Engineering skills:
o Experience in deploying Lakehouse on Azure/AWS
o Proficiency in at least one programming language, e.g., Python or Scala
o Proficiency in a data integration tool – Informatica Data Engineering Integration (DEI) or Big Data Management (BDM)
o Minimum 2 years’ experience in developing and deploying workloads on Apache Spark
o Strong SQL skills are a must
o Must have cloud DevOps skills to build and maintain infrastructure as code
o At least 5 years of experience in Cloud, preferably Azure
§ Communication skills to explain technical scenarios to senior IT and business stakeholders
§ Experience in agile ways of working