Manager, Databricks Technical Architect, Strategy, Governance & Architecture, SAMA, Consulting
Deloitte
- Manchester / London
- Permanent
- Full-time
- Architect Databricks Enterprise Data Platform solutions, including ETL/ELT and Lakehouse architecture and design, and lead delivery across a variety of use cases (e.g. legacy migrations, enterprise analytics, data engineering, single customer view, data science & AI).
- Work directly with client stakeholders to assess as-is EDW and analytics workloads and propose target solution architectures, leveraging your expertise in Data & Analytics tools and Cloud technologies.
- Develop product roadmaps alongside clients, planning the delivery of features and user stories.
- Collaborate with Enterprise, Application, Data & DevOps Architects, data engineers, data scientists, and business teams to pilot use cases and discuss architectural design.
- Select appropriate technologies from a pool of open-source and commercial offerings, considering deployment models and integration with existing tools.
- Assist in the development of our market-leading AI & Data offerings, in particular Data Mesh architectures powered by Databricks solutions, and produce proposal documents to showcase our capabilities.
- Develop thought leadership and industry eminence through points of view and insights gained from working with our clients and leading-edge technology.
- Contribute to client project delivery by owning one or more workstreams and ensuring the commercially successful conclusion of client engagements.
- Manage diverse teams within an inclusive team culture where people are recognised for their contribution.
- Develop the capability of junior team members through 'on the job' training and formal development programmes.
- Experience in Data architecture and/or data engineering.
- Hands-on experience with other data warehouse platforms and experience migrating legacy workloads to Databricks.
- Relevant experience designing and delivering Databricks-based solutions, i.e. Databricks Lakehouse, data science workflows (MLflow, MLOps on Databricks), governance (Unity Catalog), and security and access design on Cloud platforms (AWS, GCP, Azure).
- Experience with design and implementation in big data technologies such as Hadoop, NoSQL, MPP, OLTP, and OLAP or full lifecycle data science solutions.
- Experience with Data Acquisition, Integration & Transformation solutions leveraging batch, micro-batch, CDC, and event-driven patterns.
- Broad knowledge across Cloud architecture, DevOps, Networking, Machine Learning, Security.
- Experience and knowledge of delivering innovative enterprise data management frameworks e.g. Data Mesh.
- Experience in working within an architecture function and presenting architectural designs to a variety of stakeholders (incl. Technical Design Authorities or Architecture Boards).
- Demonstrable stakeholder management skills and the ability to collaborate effectively with multidisciplinary teams.
- Experience leading multi-disciplinary teams in fast-paced project environments, demonstrating personal resilience.
- Ability to conduct discovery and design workshops, create architectural designs and implementation roadmaps, and manage workstream delivery.
- Ability to support go-to-market activities such as responding to RFIs/RFPs and developing high-quality proposal materials.
- Programming knowledge in SQL, Python, R or Scala.
- Professional certifications in Databricks and Cloud platforms.