
Cloud Data Analytics Platform Engineer - VP
- London
- Permanent
- Full-time
- Architect and Build: Design and implement a robust, cloud-native data analytics platform spanning AWS, GCP, and other emerging cloud environments. You'll leverage services like S3/GCS, Glue, BigQuery, Pub/Sub, SQS/SNS, MWAA/Composer, and more to create a seamless data experience. (Required)
- Data Lake, Data Zone, and Data Governance: Design, build, and manage data lakes and data zones within our cloud environment, ensuring data quality, discoverability, and accessibility for downstream consumers. Implement and maintain enterprise-grade data governance capabilities, integrating with data catalogs and lineage-tracking tools to ensure security and compliance. (Required)
- Infrastructure as Code (IaC): Champion IaC using Terraform, and preferably other tools like Harness, Tekton, or Lightspeed, developing modular patterns and establishing CI/CD pipelines to automate infrastructure management and ensure consistency across our environments. (Required)
- Collaboration and Best Practices: Work closely with data engineering, information security, and platform teams to define and enforce best practices for data infrastructure, fostering a culture of collaboration and knowledge sharing. (Required)
- Kubernetes and Orchestration: Manage and optimize Kubernetes clusters, specifically for running critical data processing workloads using Spark and Airflow. (Required)
- Cloud Security: Implement and maintain robust security measures, including cloud networking, IAM, encryption, data isolation, and secure service communication (VPC peering, PrivateLink, PSC/PSA). (Required) Your knowledge of compliance frameworks relevant to cloud data will be invaluable in maintaining a secure and compliant data environment. (Optional)
- Snowflake and Databricks (Optional, but highly desired): Leverage your experience with Snowflake and Databricks to enhance our data platform's capabilities and performance. Experience with these technologies is a significant advantage.
- Event-Driven Architectures, FinOps, and Cost Optimization (Optional): Contribute to the development of event-driven data pipelines using Kafka and schema registries, enabling real-time data insights and responsiveness. Apply FinOps principles and multi-cloud cost optimization techniques to ensure efficient resource utilization and cost control.
- Hands-on Engineering Expertise: You're a builder who enjoys diving into the technical details and getting your hands dirty. You thrive in a fast-paced environment and are eager to make a direct impact.
- Experience: 8-13 years of relevant experience in data engineering and infrastructure automation.
- Cloud Expertise: Proven hands-on experience with AWS and/or GCP, including a deep understanding of their data analytics service offerings.
- Data Lake/Zone/Governance Experience: Demonstrable experience designing, building, and managing data lakes and data zones. Familiarity with data governance tools and frameworks.
- IaC Proficiency: Solid experience with Terraform and preferably Harness, Tekton, or Lightspeed for CI/CD pipeline management.
- Kubernetes Mastery: Strong command of Kubernetes, especially in the context of data processing workloads.
- Security Focus: A firm grasp of cloud security principles and best practices.
- Financial Services Experience: Experience working in financial services, banking, or on data-related cloud transformation projects within the financial industry. (Highly Desired)