Data Migration Lead

Our client, a leading global supplier of IT services, requires a Data Migration Lead to be based at their client's office in Redhill, UK.

This is a hybrid role: you can work remotely in the UK but will attend the Redhill office 4 days per week.

This is a 6+ month temporary contract, starting as soon as possible.

Day rate: competitive market rate

You will be responsible for analysing legacy Oracle assets and delivering secure, accurate, and automated data migration pipelines as part of a broader modernisation programme. The role combines strong data engineering with AI-driven automation and quality assurance.

Key Responsibilities

  • Analyse Oracle schemas, including procedures, packages, triggers, and dependencies
  • Perform source-to-target mapping and conduct impact analysis
  • Define and implement migration strategies (full load, incremental, logical, and physical)
  • Document transformation logic, business rules, and exception handling
  • Design, build, and maintain ETL/ELT pipelines using tools such as Data Pump, pgloader, SQL, Python/Bash, or cloud-based DMS solutions
  • Execute full and incremental data loads with accuracy and efficiency
  • Validate data quality, referential integrity, and reconciliation outcomes
  • Develop and maintain playbooks, runbooks, rollback plans, and technical documentation
  • Leverage AI tools to analyse PL/SQL logic and automate mapping and documentation processes
  • Apply AI/ML techniques for data profiling, anomaly detection, and automated data quality checks
  • Utilise LLM-based tools for SQL optimisation, code conversion, and metadata extraction
  • Build AI-powered dashboards and automate reporting for improved visibility and decision-making

Key Requirements

  • Legacy Analysis & Design
    • Analyse Oracle schemas, procedures, packages, triggers, and dependencies.
    • Perform source-to-target mapping and impact analysis.
    • Define migration strategy (full load, delta, logical, physical).
    • Document transformation logic and exceptions.
  • Migration Build & Execution
    • Build and maintain ETL/ELT pipelines using Data Pump, pgloader, SQL, Python/Bash, or cloud DMS tools.
    • Execute full and incremental loads.
    • Validate data accuracy, referential integrity, and reconciliation results.
    • Maintain playbooks, runbooks, rollback plans, and technical documentation.

Due to the volume of applications received, we are unfortunately unable to respond to everyone.

If you do not hear back from us within 7 days of sending your application, please assume that you have not been successful on this occasion.

Please do keep an eye on our website https://projectrecruit.com/jobs/ for future roles.
