
Senior Data Engineer - Databricks (6-Month Contract)
- London
- Contract
- Full-time
Responsibilities:
- Design and implement scalable data architectures using Databricks Unity Catalog, Delta Lake, and Apache Spark
- Develop and maintain complex ETL/ELT pipelines processing terabytes of data daily
- Architect medallion (Bronze, Silver, Gold) data lakehouses with optimal performance patterns
- Implement real-time and batch data processing solutions using Structured Streaming and Delta Live Tables (a streaming sketch follows this list)
- Design data mesh architectures with proper data governance and lineage tracking
- Optimize Spark jobs for cost efficiency and performance, including cluster auto-scaling and resource management
- Implement advanced Delta Lake features including time travel, vacuum operations, and Z-ordering (a maintenance sketch follows this list)
- Build robust data quality frameworks with automated testing and monitoring (a quality-check sketch follows this list)
- Design and implement CI/CD pipelines for data workflows using Databricks Asset Bundles or similar tools
- Develop custom solutions using Databricks APIs and integrate with external systems
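
To make the medallion and streaming responsibilities concrete, here is a minimal sketch of a Bronze-to-Silver Structured Streaming step on Delta Lake. The source path, schema, checkpoint location, and target table name (`silver.events`) are illustrative assumptions, not details of this role.

```python
# Minimal Bronze -> Silver step of a medallion pipeline using Structured Streaming
# and Delta Lake. All paths, the schema, and the table name are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is already provided

# Bronze: incrementally ingest raw JSON files (Auto Loader is a common alternative on Databricks)
bronze = (
    spark.readStream
    .format("json")
    .schema("event_id STRING, event_ts TIMESTAMP, payload STRING")
    .load("/mnt/raw/events/")
)

# Silver: de-duplicate by event_id (add a watermark in production to bound state)
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("ingest_date", F.to_date("event_ts"))
)

(
    silver.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/silver_events")
    .outputMode("append")
    .trigger(availableNow=True)  # process all available data, then stop (Spark 3.3+)
    .toTable("silver.events")
)
```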
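The Delta Lake maintenance work (time travel, vacuum operations, Z-ordering) typically comes down to routine operations like the sketch below. The table name carries over from the previous example, and the Z-order column and retention window are assumed defaults.

```python
# Routine Delta Lake maintenance: Z-ordering, vacuuming, and time travel.
# `silver.events` is the same illustrative table as above.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Co-locate data on a frequently filtered column to speed up reads
spark.sql("OPTIMIZE silver.events ZORDER BY (event_id)")

# Remove files no longer referenced by the table, keeping 7 days (168 hours) of history
spark.sql("VACUUM silver.events RETAIN 168 HOURS")

# Time travel: query an earlier version of the table for audits or rollback checks
previous = spark.sql("SELECT * FROM silver.events VERSION AS OF 0")
previous.show(5)
```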
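For the data quality responsibility, a lightweight expectation-style check might look like the following sketch; the column names and rules are assumptions, and on Databricks the same intent can be expressed declaratively with Delta Live Tables expectations.

```python
# A lightweight data-quality gate on the Silver layer: raise (or alert) when
# expectations are violated. Table and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.table("silver.events")

checks = {
    "event_id is never null": df.filter(F.col("event_id").isNull()).count() == 0,
    "event_ts is not in the future": df.filter(F.col("event_ts") > F.current_timestamp()).count() == 0,
}

failed = [name for name, passed in checks.items() if not passed]
if failed:
    raise ValueError(f"Data quality checks failed: {failed}")
```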
Requirements:
- A Bachelor's degree in Computer Science or a related field.
- 10+ years of experience in data engineering, including designing and deploying production-quality data pipelines and ETL/ELT solutions.
- A Databricks Certified Data Engineer Professional certification.
- Proficiency in Python, PySpark and SQL.
- Strong hands-on experience with cloud services such as AWS (Glue, Lambda, Redshift, S3).
- Experience with data governance and management platforms (e.g., AWS SageMaker Unified Studio, Unity Catalog).
- Experience with data migration projects and legacy system modernization.
- Familiarity with containerization and orchestration technologies (Docker, Kubernetes) is a plus.