Principal Data Engineer
Xiatech
- Fitzrovia, Central London
- Permanent
- Full-time
- Real-time, single view of data
- Real-time, actionable insights, including predictive analytics
- Provide technical leadership to a team of data engineers, fostering a culture of collaboration, innovation, and excellence.
- Define the data engineering strategy & roadmap, breaking it down into epics and stories for the squads you work with.
- Design and implement scalable and efficient data architectures that support the business objectives.
- Work with Product Owners to prioritise and orchestrate work between squads and engineers, including setting non-functional requirements.
- Drive change and continuously improve the company’s data engineering practices & maturity.
- Provide governance through the technical design authorities.
- Evaluate and select tools and technologies that align with the organisation's strategy.
- Set and track technical metrics to assess the performance of the team and the solutions they build.
- Help the organisation scale by improving our onboarding, training, and SDLC processes.
- Hands-on building of our next-generation XFuze platform using agile methodologies.
- Data quality engineering, implementing patterns that clean and enrich data reliably.
- Mentor and improve the technical skills of other team members through peer reviews.
- Work with clients to analyse requirements and implement solutions.
- 7+ years' experience in Data Engineering/Architecture (or similar).
- Broad knowledge across technical areas (DevOps, engineering, architecture, various datastores, etc.).
- Strong proficiency in SQL & Python.
- Experience working with Spark (PySpark), data streaming, and big data solutions; experience with Databricks is a bonus.
- Designing and building data warehouses & data lakes, including data modelling and mapping using techniques such as the medallion architecture and star schemas (experience with BigQuery is a bonus).
- Data pipeline/ETL development, building ingestion pipelines that produce scalable data products for analytics.
- Orchestration frameworks such as Dataform, Airflow, and GCP Workflows.
- Cloud platforms, especially GCP services such as GCS, Cloud Functions, and Dataproc.
- Familiarity with containerization and IaC tools (e.g. Docker, Kubernetes, Terraform).
- Experience working within an agile environment (Scrum, Kanban, etc.).
- Previous experience in the retail domain is highly desirable.
- Excellent communication and presentation skills.
- Innovative
- Driven
- Resilient
- Insightful
- Empathetic
- Analytical
- Self-starter
- Creative
- Problem-solver