Availability: Immediate to 30 days
Experience: 7+ Years in Banking / Financial Services
We're looking for a results-driven Data Project Manager to lead large-scale data initiatives for our banking client. This role requires hands-on experience with Databricks and Confluent Kafka, along with strong project governance in a regulated environment.
Key Responsibilities:
▪️ Lead end-to-end delivery of data platform projects using Databricks and Confluent Kafka
▪️ Manage project planning, roadmaps, work breakdown, budgeting, and resource allocation
▪️ Oversee real-time streaming pipelines and analytics workloads
▪️ Align delivery with GDPR, CBUAE, BCBS 239 and other regulatory frameworks
▪️ Collaborate with cross-functional teams including engineers, architects, analysts & vendors
▪️ Track risks, issues, and compliance milestones using structured governance models
Must-Have Experience:
▪️ 7+ years in Project Management within banking/financial services
▪️ Hands-on project delivery using Databricks and Confluent Kafka
▪️ Deep understanding of data architecture, pipelines, and streaming technologies
▪️ Experience leading onshore/offshore teams
▪️ Working knowledge of Agile/Scrum and Waterfall methodologies
Technical Exposure To:
▪️ Databricks (Delta Lake, MLflow, Spark)
▪️ Confluent Kafka (Kafka Connect, ksqlDB, Schema Registry)
▪️ Azure or AWS (Azure preferred)
▪️ Informatica, Azure Data Factory, and CI/CD pipelines
▪️ Oracle ERP Implementation
🎯 Preferred Qualifications:
▪️ PMP / PRINCE2 / Scrum Master certification
▪️ Familiarity with BCBS 239, GDPR, and CBUAE regulations
▪️ Knowledge of DAMA-DMBOK or similar data governance frameworks