Responsible for driving the design and build of scalable ETL systems for a big data warehouse, delivering robust and trustworthy data that supports high-performing ML algorithms and predictive models, and meeting real-time data visualisation requirements across the organisation to enable self-service analytics.
Roles and Responsibilities
Systematic solution design of ETL and data pipelines in line with business user specifications
Develop and implement ETL pipelines aligned to the approved solution design
Ensure data governance and data quality assurance standards are upheld
Engage with customers in a customer-centric manner
Effective self-management and teamwork
Minimum Requirements
3-year IT-related degree.
Postgraduate qualification (advantageous).
5-10 years' experience designing and developing data warehouses according to the Kimball methodology.
Adept at design and development of ETL processes.
SQL development experience; SAS Data Studio and AWS experience preferred.
Ability to ingest and output CSV, JSON and other flat-file formats, as well as related data sources.
Proficiency in Python or R, or a willingness to learn.
Experience within Retail, Financial Services and Logistics environments.