Principal DataOps Architect

🔒 Confidential Employer
Posted 7 May 2026
LOCATION: London
TYPE: Full-time
LEVEL: Director
CATEGORY: Data & Analytics
This employer holds a UK Home Office sponsor licence; sponsorship for this specific role is at the employer's discretion.

SKILLS

DataOps CI/CD Databricks Python SQL Terraform Infrastructure as Code Data quality

FULL DESCRIPTION

Principal DataOps Architect

Company: [Employer hidden — sign up to reveal]

Location: London, GB

Employment Type: Full-Time

Category: Information Technology

About [Employer hidden — sign up to reveal]

[Employer hidden — sign up to reveal] Group plc (NASDAQ: MRX) is a diversified global financial services platform providing essential liquidity, market access and infrastructure services to clients across energy, commodities and financial markets. The group provides comprehensive breadth and depth of coverage across four core services: clearing, agency and execution, market making, and hedging and investment solutions. It has a leading franchise in many major metals, energy and agricultural products, with access to 60 exchanges. The group provides access to the world’s major commodity markets, covering a broad range of clients that include some of the largest commodity producers, consumers and traders, banks, hedge funds and asset managers. With more than 40 offices worldwide, the group has over 3,000 employees across Europe, Asia and the Americas.

Department Description

[Employer hidden — sign up to reveal] has unique access across markets with significant share globally, both on and off exchange. The depth of knowledge amongst its teams and divisions provides its customers with a clear advantage, and its technology-led service provides access to all major exchanges, order-flow management via screen, voice and DMA, plus award-winning data, insights and analytics. The Technology Department delivers secure, scalable digital tools, software services, and infrastructure across all business units. The Data & AI team drives productivity, compliance, and insights by leveraging a robust Data Lakehouse.

Role Summary

We are seeking a Principal DataOps Architect to own and embed a DataOps way of working across the enterprise data organisation. This is a greenfield role with a clear organisational mandate and significant scope to define how DataOps frameworks, tooling, and engineering standards are applied across our data platforms. The role is accountable for enabling the safe, reliable, and frequent delivery of production-grade data capabilities, supporting real-time management information, critical systems integration, advanced analytical and ML model refreshes, and accurate, timely data for customer-facing AI solutions.

Responsibilities

  • Design, implement, and operationalise the DataOps target state through consistent frameworks, tooling, and standards.
  • Build and maintain CI/CD pipelines for data and ML workloads.
  • Enforce separation of duties between engineering and production administration.
  • Define and implement test‑driven development practices for data pipelines.
  • Embed data reliability patterns aligned to the Medallion Architecture.
  • Implement data quality enforcement, monitoring, and observability frameworks.
  • Apply software engineering and SDLC best practices consistently across data and ML workloads.
  • Provide hands‑on technical leadership and design advisory services.
  • Build and maintain reusable DataOps toolkits, templates, and reference architectures.
  • Partner with Governance, Risk, and Control stakeholders.
  • Support production readiness, release governance, and post‑incident learning.
  • Champion DataOps, analytics, and engineering best practices.

Skills and Experience

Essential:

  • Strong experience implementing DataOps or DevOps practices in complex data environments.
  • Proven expertise designing and operating CI/CD pipelines for data and analytics workloads.
  • Hands‑on experience with Databricks (AWS and/or Azure) in production environments.
  • Strong proficiency in Python and SQL.
  • Experience applying Infrastructure as Code (IaC) using tools such as Terraform.
  • Deep understanding of data platform reliability, observability, and quality controls.
  • Strong knowledge of software engineering best practices and SDLC.
  • Ability to operate as both hands‑on engineer and strategic enabler.
  • Experience working in a regulated environment.

Desirable:

  • Experience in financial services, with exposure to ETD and OTC derivative markets.
  • Hands‑on experience with: Orchestration Platforms (Azure Data Factory, Apache Airflow), Databricks Asset Bundles, Power BI, Data Transformation Platforms (dbt, Databricks Lakeflow).
  • Experience with Azure DevOps (ADO), Bitbucket, and Git‑based workflows.
  • Advanced use of VS Code and developer productivity tooling (e.g. GitHub Copilot).
  • Experience supporting ML pipelines and model lifecycle operations.
  • Prior involvement in building or scaling enterprise data platforms.
  • Databricks Certified Data Engineer Professional.
  • DAMA Certified Data Management Professional (CDMP).

About Working at [Employer hidden — sign up to reveal]

[Employer hidden — sign up to reveal] is fully committed to being an inclusive employer and providing an inclusive and accessible recruitment process for all. We value the differences that a diverse workforce brings. We welcome applications from candidates returning to the workforce. [Employer hidden — sign up to reveal] is also committed to avoiding circumstances in which the appearance or possibility of conflicts of interest may exist within the hiring process.

Apply to this position
