· Design, build, and maintain robust, scalable, and secure ETL/ELT pipelines to ingest structured and unstructured data from internal, external, and third‑party sources.
· Support the integration, population, and optimization of the DHA Data Lakehouse and data platforms in alignment with the approved enterprise data architecture.
· Ensure data pipelines support both batch and real‑time processing requirements, and are optimized for performance, reliability, and high availability.
· Collaborate closely with the Data Architect to ensure pipeline designs comply with architectural standards, integration patterns, APIs, and interoperability blueprints.
· Implement and embed data quality checks, validation rules, metadata capture, lineage tracking, and data classification workflows to support governance and trusted analytics.
· Monitor the performance, progress, and health of data pipelines and platforms, proactively identifying bottlenecks, failures, and optimization opportunities to improve efficiency.
· Enable reliable data provisioning for dashboards, AI/ML models, reporting solutions, and self‑service analytics tools used across DHA.
· Coordinate with business units, analysts, and data scientists to understand requirements and translate them into actionable data engineering tasks and delivery plans.
· Act as a liaison between DHA teams and vendors for data platform–related activities, ensuring alignment on priorities, timelines, and technical deliverables.
· Document technical designs, configurations, workflows, and build procedures, and support agile project delivery, the onboarding of new data sources, and continuous platform improvement.
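To illustrate the data quality and validation duties listed above, here is a minimal, hypothetical sketch of a quality gate inside an ETL step: records that fail validation are set aside with a reason, supporting the governance and lineage-tracking responsibilities described. All field names (`patient_id`, `age`) and function names are illustrative assumptions, not DHA's actual schema or stack.

```python
# Hypothetical quality-gate sketch for an ETL step: split incoming records
# into clean rows and rejected rows (with reasons kept for audit/lineage).
from dataclasses import dataclass
from typing import Optional

@dataclass
class QualityResult:
    clean: list            # records that passed all checks
    rejected: list         # (record, reason) pairs retained for governance review

def validate(record: dict) -> Optional[str]:
    """Return a rejection reason, or None if the record passes all checks."""
    if not record.get("patient_id"):           # required-field check
        return "missing patient_id"
    age = record.get("age")
    if age is not None and not (0 <= age <= 130):   # range/plausibility check
        return "age out of range"
    return None

def run_quality_gate(records: list) -> QualityResult:
    """Apply validation to each record and partition the batch."""
    clean, rejected = [], []
    for r in records:
        reason = validate(r)
        if reason:
            rejected.append((r, reason))
        else:
            clean.append(r)
    return QualityResult(clean=clean, rejected=rejected)
```

In practice, checks like these would typically be expressed in a data quality framework and wired into the pipeline orchestrator, with the rejected-record log feeding metadata capture and lineage tooling rather than an in-memory list.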