
AWS Data Architect
- Northwich, Cheshire
- Permanent
- Full-time
- Deliver individual consultancy engagements or contribute to larger projects by gathering requirements, analysing data, and proposing cloud-native solutions.
- Provide technical leadership for the assessment, design, and implementation of solutions.
- Ensure all consultancy work is delivered with high quality, consistency, and in line with best practices.
- Highlight technical risks so that Ingram Micro's exposure to commercial loss is minimised.
- Design and implement secure, high-performance AWS data platforms aligned to business and technical needs.
- Build data models, ETL/ELT pipelines, and integration solutions using modern tools and frameworks.
- Align data architectures with customer objectives, AWS well-architected principles, and industry standards.
- Lead end-to-end data engineering initiatives, from discovery through to production deployment.
- Collaborate with cross-functional teams to ensure successful project delivery.
- Create and maintain documentation covering data architectures, technical processes, and standards.
- Ensure smooth knowledge transfer and operational handover to support and delivery teams.
- Provide expert-level support to internal and external stakeholders as required.
- Lead technical workshops and presentations with customers to shape solutions and influence outcomes.
- AWS Data Engineer Associate Certification
- AWS Machine Learning Specialty Certification
- AWS Solutions Architect Professional Certification
- Experience in data architecture and engineering, including the design of cloud-native data platforms.
- Experience in delivering complex AWS solutions, including data lakes and pipelines.
- Expert in AWS services such as S3, Glue, Redshift, EMR, Lambda, and RDS.
- Strong grasp of data modelling, including OLAP/OLTP and schema design.
- Experience building ETL/ELT workflows with Glue, dbt, and orchestration tools (e.g., Airflow).
- Proficient in Python, SQL, PySpark, and distributed processing with Spark.
- Familiar with streaming technologies such as Kafka or Kinesis.
- Ability to optimise data solutions for speed, scale, security, and cost.
- Knowledge of MLOps pipelines for training and inference (e.g., SageMaker).
- Familiar with ML architectures and developments in foundation models.