
Senior Data Architect
- United Kingdom
- Permanent
- Full-time
Key Responsibilities:
- Develop and maintain the data architecture roadmap, balancing legacy and modern data solutions.
- Evaluate emerging technologies (e.g., Apache Spark, Kafka) to future-proof our data landscape.
- Define and enforce data integration standards, ensuring consistency across systems.
- Oversee data flow, pipeline design, transformations, and warehouse architectures.
- Lead the technical implementation of SQL Server, Azure SQL, Databricks, and Airflow pipelines.
- Champion data mesh principles and federated analytics using Starburst (Trino).
- Enable real-time data streaming and analytics through Kafka.
- Design and implement data flows across Dynamics 365, Power Platform, and Dataverse.
- Utilize Python for complex transformations and integrations.
- Ensure seamless, secure, and scalable data exchange across platforms.
- Establish frameworks for data lineage, quality, and security.
- Define KPIs for data reliability, availability, and performance.
- Conduct code reviews, data modelling sessions, and performance tuning.
- Work with product managers, stakeholders, and technical teams to align data initiatives with business goals.
- Communicate strategies and trade-offs to both technical and non-technical audiences.
- Mentor and guide data engineers, ETL developers, and solution architects.
- Lead the transition from legacy ETL and data warehouses to modern architectures.
- Drive proof-of-concepts (PoCs) for new technologies in data engineering, streaming, and analytics.
- Identify opportunities for automation and cost optimization in data operations.
Key Competencies:
- Strategic Thinking: Balances innovation with practical risk management.
- Execution Focus: Delivers high-quality outcomes with persistence and efficiency.
- Change Leadership: Effectively drives adoption of new data technologies.
- Collaboration: Builds trust and alignment across teams.
- Decision-Making: Uses data-driven insights to inform technical and business decisions.
Qualifications & Certifications:
- Degree in Computer Science, Data Science, Engineering, or a related field.
- Preferred: TOGAF or equivalent enterprise architecture certification.
- Relevant Azure Cloud or Databricks certifications are a plus.
Technical Skills & Experience:
- ETL & Data Warehousing: Experience with legacy ETL tools and modern data transformation strategies.
- Microsoft Stack: Proficiency in SQL Server, Azure SQL, and Azure-based data processing.
- Apache Spark & Databricks: Strong background in large-scale data processing and analytics.
- Kafka & Streaming: Experience with real-time data ingestion and event-driven architectures.
- Python & Data Engineering: Hands-on experience building data pipelines and integrations.
- Data Governance & Security: Understanding of data privacy regulations (e.g., GDPR) and best practices.
- DevOps & Agile: Familiarity with CI/CD, Infrastructure as Code, and Agile methodologies.
Additional Requirements:
- Occasional travel as required, particularly to our office in Stapeley.
- Demonstrated experience in project management, including resource planning and risk management, within Agile frameworks.
Our Values:
- Collaborate as One Team
- Create Value for Customers
- Innovate with Purpose
- Never Stop Improving
Benefits:
- Performance-related bonus
- 25 days' holiday + Bank Holidays
- NFU Pension (7.5% employer contribution)
- Private Health cover
- Health cashback scheme
- Employee Assistance Program
- Employee share plan
- Remote working opportunity
- Flexible Working Policy (Where appropriate/practicable)
- Enhanced maternity leave: 12 weeks at full pay, followed by 4 weeks at 50%, then Statutory Maternity Pay (SMP)
- Comprehensive L&D programme, including career development programmes, access to Genus University, and Mango (languages)