Senior Data Engineer

🔒 Confidential Employer
Posted 23 June 2025
LOCATION
Reading
TYPE
Full-time
LEVEL
Mid-Senior level
CATEGORY
Technology
This employer holds a UK Home Office sponsor licence. Sponsorship for this specific role is at the employer's discretion.

SKILLS

Python, SQL, Snowflake, ETL, Data Pipelines, Cloud Technologies, Data Architecture, Data Management

FULL DESCRIPTION

Summary

We are currently looking for a Senior Data Engineer to develop cloud-based data management solutions. Key responsibilities include designing data models, managing data pipelines, and working with various cloud technologies and BI tools. Required skills include Python, SQL, Snowflake, ETL, and data architecture. This is a full-time position in Reading, UK.

  • Develop architectural models for Cloud-based data management solutions leveraging Microsoft Azure / AWS / GCP / Snowflake technologies.
  • Design conceptual, logical, physical data models.
  • Manage the data engineering roadmap and help to bring the organization towards an automated, scalable and fault-tolerant infrastructure.
  • Build data management platforms using Cloud Technologies.
  • Design, build, and maintain efficient, reusable, and reliable Python code.
  • Apply advanced analytics, data mining and statistical techniques.

Location: Reading, Berkshire, RG2 6UB, United Kingdom

Duties and Responsibilities:

  • Responsible for the functional design requirements of a cloud-based data management solution; design conceptual, logical, and physical data models that meet current and future business needs.

  • Provide Cloud and Data Management environments; able to deep dive and identify the root cause of issues.

  • Evaluate and plan DWH migrations to the Cloud.

  • Manage the data engineering roadmap and help move the organization towards an automated, scalable, and fault-tolerant infrastructure.

  • Write and build data pipelines and data lakes, manage ETL processes, and perform a range of transformations.

  • Understand relational and big data models for both storing and accessing data from data visualization and other query tools.

  • Build data management platforms using cloud technologies such as Data Factory, Cosmos DB, Blob Storage, Redshift, Lambda, RDS, S3, EC2, Kinesis, AWS/Azure/Snowflake data warehouses, and other services.

  • Build integrations between applications using REST APIs.

  • Design, build, and maintain efficient, reusable, and reliable Python code.

  • Ensure the best possible performance, quality, and responsiveness of applications.

  • Apply advanced analytics, data mining, and statistical techniques, using a diverse set of tools to draw insights from complex data.

  • Collaborate with developers to test and verify that solutions meet the business requirements.

Required Skills and Experience

• Experience in Data Architecture, Data Management, and Analytical Technologies of modern cloud data platforms.

• Experience leveraging Snowflake for building data lakes and data warehouses is highly preferred.

• Excellent SQL skills, along with ETL/data-processing tools such as Informatica, Talend, Pentaho, Databricks, Spark, and Alteryx.

• Experience with Snowflake utilities and features such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, the Optimizer, Metadata Manager, data sharing, and stored procedures.

• Experience in Programming Languages such as Python and Java.

• Experience with Python development, especially cx_Oracle, NumPy, and Psycopg2.

• Strong in Python frameworks such as Django, Flask, or Pyramid.

• Experience using Python libraries for data mining, data modeling, processing, and data visualization.

• Experience in SQL/NoSQL databases such as Oracle, MS SQL Server, Cassandra, MongoDB, HBase, Zen, Elastic Stack, CouchDB, DynamoDB, and others. Some familiarity with front-end development (AngularJS, ReactJS, JavaScript/jQuery, Ajax, CSS).

• Experience with data pipeline tools such as Kafka or Apache Airflow.

• Experience in the Big Data ecosystem (Apache Hadoop, Spark, Kafka) and/or the IaaS/PaaS ecosystem (Microsoft Azure, Google Cloud, AWS).

• Strong in Data Factory, Cosmos DB, Blob Storage, Redshift, Lambda, RDS, S3, EC2, Kinesis, AWS/Azure/Snowflake data warehouses, and other services.

• Experience in BI tools such as Tableau, Qlik and Looker.

• Experience in Algo Workflows and MiFID II regulatory reporting.

• Experience in Continuous Integration and Delivery.

• Experience with JIRA/Confluence and source-control environments such as Git and GitHub.

• Knowledge of the FX Connect foreign exchange platform, Power BI, and Google Data Studio.

• Experience with software development in a Windows, Linux/Unix environment.

• Excellent written and verbal communication skills; proven ability to interact, evangelize, present, and influence at all levels, from C-suite to engineers.

Salary: Competitive salary offered.

Hours: 40 Hours per Week, Monday to Friday

No agencies, please.
