Data Engineer (Azure) – Contract
Role Overview
We are seeking an experienced contract Azure Data Engineer to design, build, and optimise scalable, enterprise-grade data platforms. The successful candidate will have deep expertise in Azure data services, advanced SQL skills, and strong experience implementing modern data architectures, including the Medallion pattern (Bronze/Silver/Gold).
The role requires hands-on development experience across Azure Data Factory, Databricks, Synapse, and SQL-based data platforms, alongside strong DevOps and CI/CD capabilities.
Key Responsibilities
- Design and implement end-to-end data pipelines using Azure Data Factory, Azure Databricks, Synapse Analytics, and SQL.
- Architect and implement Medallion data architecture (Bronze, Silver, Gold layers) for scalable and maintainable data platforms.
- Develop robust data transformation workflows using SQL, PySpark, and Python.
- Build and optimise enterprise-grade data models for analytics and reporting.
- Optimise SQL performance across Azure SQL, Synapse, and cloud-based data warehouses.
- Deliver production-grade Databricks solutions including Delta Lake implementations.
- Lead data migration and modernisation projects to Azure Data Lake Storage and Blob Storage.
- Develop CI/CD pipelines using Azure DevOps and GitHub Actions for data workloads.
- Ensure data governance, quality, monitoring, and performance optimisation across platforms.
- Support Power BI semantic models and reporting datasets where required.
- Collaborate with stakeholders to translate business requirements into scalable data solutions.
Required Technical Skills
- Azure Data Factory (ADF), Azure Databricks (PySpark, Delta Lake)
- Azure Synapse Analytics, Azure Data Lake Storage (ADLS Gen2)
- Azure SQL Database / Managed Instance, Azure Blob Storage, Azure DevOps (CI/CD for data)
Certifications (Highly Desirable)
- AZ-305 – Designing Microsoft Azure Infrastructure Solutions
- AZ-400 – Designing and Implementing Microsoft DevOps Solutions
- DP-203 – Data Engineering on Microsoft Azure
- DP-300 – Administering Microsoft Azure SQL Solutions
Data Engineering & Architecture
- Deep expertise in SQL (MS SQL, Synapse, performance tuning, indexing, query optimisation)
- Strong experience with Azure Databricks & Spark
- Hands-on implementation of Medallion architecture
- Data modelling (star schema, dimensional modelling, data vault desirable)
- ETL/ELT design patterns
- Delta Lake architecture & optimisation
Programming & Tools
- Python
- PySpark
- dbt (desirable)
- GitHub
- PowerShell (automation)
- REST APIs / GraphQL (integration)
Cloud Platforms
- Microsoft Azure (expert level)
Experience Required
- Proven experience delivering end-to-end Azure data platform implementations
- Strong background in data migration and modernisation projects
- Experience building scalable ML data pipelines (MLOps exposure desirable)
- Demonstrated experience integrating enterprise systems (e.g., ERP, CRM) into cloud data platforms
- Experience working in Agile delivery environments
- Strong stakeholder engagement and requirement translation capability
Desirable Experience
- Customer analytics / retention modelling
- Financial data modelling
- Power BI semantic modelling (DAX)
- Experience designing cost-optimised Azure data solutions
Contract Details
- Contract role: 3–12 month engagement
- Hybrid or remote, based in Cape Town or Johannesburg (JHB)
- Competitive day rate, depending on experience
- Immediate or short notice availability preferred