Data Engineer

🔒 Confidential Employer
Posted 22 March 2026
LOCATION: London
TYPE: Full-time
LEVEL: Entry-level
CATEGORY: Technology
This employer holds a UK Home Office sponsor licence; sponsorship for this specific role is at the employer’s discretion.

SKILLS

Python, ETL Pipelines, S3, Kafka, Airflow, Data Engineering

FULL DESCRIPTION

[Employer hidden — view at passion-project.co.uk] leverages quantitative analysis and cutting-edge technology to identify and capitalize on opportunities across global financial markets. We foster a collaborative and intellectually stimulating environment, bringing together individuals with Mathematics, Physics and Computer Science backgrounds who are passionate about applying rigorous scientific methods to financial challenges. As a fundamentally data-driven business, our success is heavily linked to the acquisition, processing, and analysis of vast datasets. High-quality, well-managed data forms the critical foundation for our quantitative research, strategy development, and automated trading systems.

As a Data Engineer within our Quantitative Platform team, you will play a pivotal role in building and maintaining the data infrastructure that fuels our research and trading strategies. You will be responsible for the end-to-end lifecycle of diverse datasets – including market, fundamental, and alternative sources – ensuring their timely acquisition, rigorous cleaning and validation, efficient storage, and reliable delivery through robust data pipelines. Working closely with quantitative researchers and technologists, you will tackle complex challenges in data quality, normalization, and accessibility, ultimately providing the high-fidelity, readily available data essential for developing and executing sophisticated investment models in a fast-paced environment.

Your responsibilities will include:

  • Evaluating, onboarding, and integrating complex data products from diverse vendors, serving as a key technical liaison to ensure data feeds meet our stringent requirements for research and live trading.
  • Designing, implementing, and optimizing robust, production-grade data pipelines to transform raw vendor data into analysis-ready datasets, adhering to software engineering best practices and ensuring seamless consumption by our automated trading systems.
  • Engineering and maintaining sophisticated automated validation frameworks to guarantee the accuracy, timeliness, and integrity of all datasets, directly upholding the quality standards essential for the efficacy of our quantitative strategies.
  • Providing expert operational support for our data pipelines, rapidly diagnosing and resolving critical issues to ensure the uninterrupted flow of high-availability data powering our daily trading activities.
  • Participating actively in team rotations, including on-call schedules, to provide essential coverage and maintain the resilience of our data systems outside of standard business hours.
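To give candidates a concrete sense of the validation work described above, here is a minimal, illustrative Python sketch of an automated batch check — field completeness, price sanity, and timestamp freshness. All names, fields, and thresholds are hypothetical, not taken from the employer's actual systems.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical example: validate a batch of vendor records before it is
# published to downstream consumers. Field names and thresholds are
# illustrative only.
REQUIRED_FIELDS = {"symbol", "price", "timestamp"}
MAX_STALENESS = timedelta(minutes=15)


def validate_batch(records, now=None):
    """Return a list of human-readable issues; an empty list means the batch passes."""
    now = now or datetime.now(timezone.utc)
    issues = []
    if not records:
        issues.append("batch is empty")
    for i, rec in enumerate(records):
        # Completeness: every record must carry the required fields.
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            issues.append(f"record {i}: missing fields {sorted(missing)}")
            continue
        # Sanity: prices must be present and positive.
        if rec["price"] is None or rec["price"] <= 0:
            issues.append(f"record {i}: non-positive price {rec['price']}")
        # Timeliness: reject data older than the staleness window.
        if now - rec["timestamp"] > MAX_STALENESS:
            issues.append(f"record {i}: stale timestamp {rec['timestamp']}")
    return issues
```

In production such checks would typically run as a gating task inside an orchestrator like Airflow, failing the pipeline run (and alerting the on-call engineer) when any issue is reported.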

What we are looking for:

  • 1+ years’ experience building ETL/ELT pipelines using Python.
  • Familiarity with technologies such as S3, Kafka, Airflow, and Iceberg.
  • A commitment to engineering excellence and pragmatic technology solutions.
  • A desire to work in an operational role at the heart of a dynamic, data-centric enterprise.
  • Excellent communication and collaboration skills, and the ability to work well in a team.

What would be advantageous:

  • Strong understanding of financial markets.
  • Proficiency working with large financial datasets from various vendors.
  • Experience working with hierarchical reference data models.
  • Proven expertise in handling high-throughput, real-time market data streams.
  • Familiarity with distributed computing frameworks such as Apache Spark.
  • Operational experience supporting real-time systems.