MAIN RESPONSIBILITIES & ACCOUNTABILITIES
- Integrate data from various systems and platforms into the group’s data hub.
- Define and build data pipelines that enable faster, data-informed decision-making within the business.
- Perform all needed data extraction, transformation and loading to populate the data hub.
- Work closely with data analysts and business end-users to implement and support cloud data platforms using best-of-breed technologies and methodologies.
- Write code/scripts to streamline and improve processes related to the flow of data and integration with other platforms.
- Ensure the accuracy and timeliness of data in real time, and implement monitoring tools to detect data issues.
- Resolve data quality issues together with data owners.
- Comply with information security standards and governance processes.
- Identify new areas of improvement for data infrastructure with an eye to solving business problems.
- Handle day-to-day data hub operations and provide advisory support to business users.
Requirements:
- Minimum of a degree in IT or a related field from a recognized university.
- Relevant working experience in data engineering – ETL, data cleansing, data pipelines, data modelling and data integration.
- Working experience with the following technologies:
o Microsoft Azure: Storage, Data Factory, Data Lakes, PaaS, Databricks, SQL Server, Synapse Analytics, Logic Apps.
o Languages: Python, Java/Scala, R, T-SQL; REST APIs.
o Dashboards: Tableau.
- Working experience building data pipelines in production, and the ability to work across structured, semi-structured and unstructured data.
- Experience preparing data for analytics and following a data science workflow.
- Ability to work independently to deliver end-to-end solutions, with strong troubleshooting skills to resolve production issues in a timely and accurate manner.
- Good writing, communication and presentation skills.
- Self-motivated and independent learner.