[CANDIDATES WHO REQUIRE WORK PASSES NEED NOT APPLY]
Job Requirements:
Experience: at least 2 years of relevant experience and a minimum of 4 years of total work experience
Solid IT knowledge and familiarity with the system development lifecycle; strong experience with metadata management, including establishing sourcing and access patterns for enterprise reference data
Experience with SQL, SQL Server Integration Services (SSIS), Reporting Services (SSRS), and Analysis Services (SSAS)
Experience with Azure Data Lake and Azure Data Factory
Familiarity with Microsoft Azure data storage solutions, ingestion and computation services, and APIs
Understanding of common data movement architectures (e.g. ETL, ELT)
Experience performing delta loads, auditing and error handling, and query performance tuning
Experience with data extraction and manipulation, and ad-hoc query tools
An added advantage: knowledge of Python, U-SQL, the SSIS Integration Runtime, HDInsight, Azure Databricks, and DevOps
Role and Responsibilities:
Analyse complex, high-volume, high-dimensionality data from varying sources using a variety of ETL and data analysis techniques
Manage data sources and data ingestion and preparation processes, including metadata management
Support the design of data models, ensuring they align with headquarters and global data models and can scale to other regions
Participate in Data Requirements Gathering and Analysis meetings with Team Members, Internal Customers, and Stakeholders
Develop, maintain, test, and troubleshoot data solutions, including Database Development, ETL / Data Migration Development, and Big Data Development
Understand and support the regional overseas data lake and its analytics engine and platform
Assist with enabling solutions that are scalable and transferable for regional markets
Investigate and understand the data landscape of the respective markets
Recognise and analyse highly complex processes, interdependencies, and gaps, and develop new approaches to solutions
Support and troubleshoot data movement processes and the data warehouse environment
Promote synergies and reuse within and across projects and platforms in order to maximize rapid yet responsible delivery
Build and strengthen relationships with all stakeholders in the region, at headquarters, and in the markets to perform the necessary tasks within the function effectively
Explain complex processes clearly and precisely to different target groups, and articulate the specific benefits of solution approaches