- Permanent role | Career advancement opportunity
About Our Client
Our client is a leading brand in the IT Solutions industry. They are looking for a
Data Engineer with experience building data pipelines to perform ETL across data warehouses such as Snowflake.
Job Description
- Bachelor's degree in Computer Science, Computer Engineering, or equivalent.
- At least 5 years' experience working as a Data Engineer in the big data field.
- Solid working knowledge of implementing ETL pipelines using Informatica BDM (DEI) on data warehouses and big data platforms such as RDBMS and Snowflake.
- Hands-on experience integrating applications with RDBMSs such as Oracle, MS-SQL, and MySQL; working knowledge of Oracle and MS-SQL is an added advantage.
- Exposure to and knowledge of the following technologies is advantageous:
- Programming and scripting: Python, Shell Script
The Successful Applicant
- Experienced with the Systems Development Life Cycle (SDLC) implementation methodology or agile methodologies such as Scrum and Kanban.
- Maintains clear workflow and code documentation.
- Strong communication and interpersonal skills to interact with internal and external stakeholders - data analysts, business end-users, and vendors - to design and develop solutions.
- Detail-oriented and meticulous in operations.
What's on Offer
You will be part of an organisation that sees value in investing in its employees. Career stability is a key value for them, so you'll be a great fit if it is one of yours. The remuneration for this role will be competitive and in line with the market.
Contact: Syairah Banu
Quote job ref: JN-032023-5977253