Description:
Involved in the design, build and integration of data from various sources, and in the management of existing Data Ingestion solutions in SQL Server and Azure environments. Also responsible for the day-to-day delivery of multiple projects relating to Microsoft Distribution Data Ingestion (SSIS and Azure), within agreed constraints and the specified tolerances of scope, time, cost and quality.
Key responsibilities:
- understanding business requirements relating to business data sources, with the ability to integrate and/or transform them effectively in accordance with business needs
- the customisation and development of Extract, Transform and Load (ETL) routines, modelling, queries, reports and analysis
- the support of existing data marts, warehouse process chains and associated models and queries
- facilitating follow-up meetings and removing obstacles that affect the team
- ensuring the successful delivery of all assigned Data Engineering work on projects across all business units, and their successful deployment to production, including use cases developed by our Data Science & Analytics team
- implementing change requests (CRs) or new developments relating to existing data warehouses
- tracking project status and managing the project teams to mitigate issues and risks
- troubleshooting and resolving issues with any of our SQL Server and Azure data repositories
Key skills:
- Significant experience in software development and Agile methodologies.
- Excellent experience in developing ETL routines, processes and structures.
- Expert in SSIS, Power Automate, Azure Data Factory and Azure Synapse ETL development.
- Significant experience developing in Python or PySpark.
- Proven ability to set up Azure Synapse connectors (On-prem file shares, SFTP, SQL database, HTTP, etc.) in a real-world enterprise environment.
- Significant experience in Database and Data Storage Platforms, including Azure Data Lake.
- Extensive knowledge of data modelling techniques and structures.
- Practical experience of developing ETLs for Dimensions and Fact Tables, including SCD I and II implementation.
- Excellent experience of deploying solutions in a fast-moving enterprise.
- Ability to validate engineering data and deploy solutions.
- Experience in incident, change management and transition processes.
- Creating data catalogues
- Identifying Critical Data Elements
- Recording data quality business rules
- Carrying out data quality assessments
- Creating data quality assessment scripts & ETLs
- Creating data quality reports
- Designing & implementing Conceptual, Logical, and Physical data models
- Designing & implementing Relational data models with normalisation
- Designing & implementing Dimensional Data Models (Dimensions, Facts, SCD I, SCD II, SCD III)
- Designing & implementing Lakehouse and Delta Lake architectures
- Strong reporting and documentation/writing skills.
- Excellent organisational skills.
- Experience with a wide range of IT environments, platforms and applications
- IT Qualification - Degree or equivalent work experience
- Professional certifications, e.g. MCSE.
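The dimensional-modelling skills above reference SCD Type I and Type II handling for dimension tables. As a minimal illustrative sketch (plain Python, with a hypothetical `customer` dimension and column names that are assumptions, not taken from any specific warehouse), a Type II load expires a changed row and opens a new current version:

```python
from datetime import date

def apply_scd2(dimension, incoming, load_date):
    """Apply SCD Type II to a list-of-dicts dimension table.

    Each dimension row has: customer_id (natural key), city (tracked
    attribute), valid_from, valid_to, is_current. All names here are
    illustrative only.
    """
    # Index the currently-active row for each natural key.
    current = {row["customer_id"]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        existing = current.get(rec["customer_id"])
        if existing is None:
            # Brand-new natural key: insert an open-ended current row.
            dimension.append({**rec, "valid_from": load_date,
                              "valid_to": None, "is_current": True})
        elif existing["city"] != rec["city"]:
            # Tracked attribute changed: close the old row, open a new one.
            # (SCD Type I would instead overwrite existing["city"] in place.)
            existing["valid_to"] = load_date
            existing["is_current"] = False
            dimension.append({**rec, "valid_from": load_date,
                              "valid_to": None, "is_current": True})
        # Unchanged rows are left untouched.
    return dimension

dim = [{"customer_id": 1, "city": "Leeds",
        "valid_from": date(2020, 1, 1), "valid_to": None, "is_current": True}]
apply_scd2(dim, [{"customer_id": 1, "city": "York"},
                 {"customer_id": 2, "city": "Bath"}], date(2024, 1, 1))
# dim now holds three rows: the expired Leeds row, and current York/Bath rows.
```

In a production ETL this logic would typically be expressed as a `MERGE` in T-SQL, or as a join-and-union in a PySpark/Delta Lake pipeline, rather than row-at-a-time Python; the sketch only shows the Type II versioning rules themselves.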