
Data Engineer

London, UK
£65,000 - £70,000
Permanent
Apply Now

Data Engineer

Location: UK, with occasional travel to London required

The Vacancy

We are seeking a skilled Data Engineer to join the Data team at our client in the insurance industry. This role is crucial to the success of their Data Strategy, ensuring that their data infrastructure is robust, reliable, and scalable enough to support the company's rapid growth and commitment to excellence. 

How you will make an impact 

As a Data Engineer, you will maintain and update the data pipelines and data warehouse, supporting the Insight and Analytics team with table creation, business logic, and stakeholder management. Working closely with the Head of Data, you will play a crucial role in enhancing data quality and building trust in the data platform, ensuring the company's data infrastructure is future-ready. 

Key Responsibilities 

  • Data Pipeline Development: Design, build, and maintain scalable ETL/ELT pipelines within Microsoft Fabric using tools like Azure Data Factory, Dataflows, Power Query, and Synapse Pipelines. 
  • Data Modelling: Create high-performance data models in Synapse and Power BI for effective reporting and analytics. 
  • Data Integration: Consolidate diverse data sources into a unified data platform, applying Lakehouse architecture principles. 
  • Data Transformation: Develop complex data transformations using SQL, PySpark, and other tools within Microsoft Fabric. 
  • Collaboration: Partner with data analysts and business stakeholders to deliver technical solutions aligned with data requirements. 
  • Performance Optimisation: Monitor and optimise data pipelines, queries, and storage layers for performance and cost efficiency. 
  • Security and Compliance: Implement best practices for data security and ensure compliance with policies and regulations such as GDPR and HIPAA. 

Experience and Qualifications 

  • Proficiency in Azure Data Factory, Power Query, Dataflows, and Synapse Pipelines. 
  • Advanced understanding of data warehousing (e.g., star schema) and experience with data model design in Synapse and Power BI. 
  • Skilled in SQL, Python, and PySpark for data transformation and querying. 
  • Extensive ETL/ELT experience, including batch and streaming processes in Fabric tools. 
  • Proven ability to optimise data pipelines, ETL jobs, and queries for efficiency and cost effectiveness. 
  • Preferred knowledge of Azure Data Lake, Azure SQL Database, Synapse SQL pools, and Delta Lake. 
  • Industry experience in insurance, particularly broking (preferred). 
  • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (preferred). 
APPLY FOR THIS ROLE