Take responsibility for driving, designing and building scalable ETL systems for a big data warehouse, delivering robust and trustworthy data that supports high-performing ML algorithms, predictive models and real-time data visualisation requirements across the organisation, enabling self-help analytics.
Produce systematic solution designs for the ETL and data pipelines, in line with business user specifications
Develop and implement ETL pipelines aligned to the approved solution design
Ensure data governance and data quality assurance standards are upheld
What experience will you need to succeed in this role?
5-10 years’ experience in designing and developing data warehouses according to the Kimball methodology.
Adept at designing and developing ETL processes; SQL development experience, preferably with SAS Data Studio and AWS experience.
The ability to ingest and output CSV, JSON and other flat file types, as well as any related data sources.
Proficiency in Python or R, or a willingness to learn.
Experience within Retail, Financial Services and Logistics environments.