Data Architect

Loop Recruitment
London
£650/day
Posted 2 days ago

About the role

Data Architect — Insurance Transformation | Contract | Remote




Location: Remote‑first (ad hoc London / client visits as required)

Rate: up to £650/day (Outside IR35)

Length: Discovery (short‑term) progressing into 12+ month implementation


Sector: Insurance | Pensions | Data Transformation


Overview

We have an urgent requirement for a contract Data Architect to lead a data transformation for a third‑party administrator. The initial engagement is a short discovery phase, progressing into a long‑term implementation. Strong insurance experience, ideally with bulk purchase annuities (BPA), is essential.


Role

You’ll define the target data architecture and lead the technical design from legacy mainframe/COBOL sources into a cloud lakehouse (medallion) pattern. This is a hands‑on, senior technical role collaborating with actuarial, product, engineering and platform teams to ensure the architecture supports regulatory reporting, client reporting and self‑service portals.


Key responsibilities

  • Lead discovery and design of a lakehouse/medallion data architecture that ingests and refactors mainframe/COBOL datasets
  • Architect microservices-based integration patterns and APIs for data movement and portal access
  • Translate actuarial and scheme data requirements into robust data models and pipelines
  • Define data quality, reconciliation and control frameworks to support scheme adjustments and regulatory reporting
  • Design deployment and CI/CD patterns for data pipelines across Dev/UAT/Prod environments
  • Produce technical artefacts: data models, ingestion patterns, security controls, runbooks and migration plans
  • Work with Product, Platform and Actuarial teams to ensure operational readiness and handover to run teams


What we’re looking for

  • Strong insurance or pensions experience, ideally with bulk purchase annuities / scheme data familiarity
  • Deep expertise in cloud data platforms, lakehouse/medallion architectures and data engineering patterns
  • Proven experience migrating mainframe/COBOL data to modern cloud platforms and integrating via microservices/APIs
  • Good understanding of actuarial data models and how they impact pricing, reporting and scheme adjustments
  • Hands‑on with data tooling (e.g., Databricks, Azure/AWS/GCP, Spark, SQL, Delta Lake) and CI/CD for data
  • Strong stakeholder engagement skills and experience working across technical and non‑technical teams

About this listing

Screened by Joboru

This role passed our automated spam and quality filters and was active in our feed when last checked. Joboru is an aggregator; see how we screen listings. If anything looks off, tell us.