Responsibilities
Work closely with business stakeholders to develop and maintain a centralised data warehouse.
Responsible for ETL development, data warehouse maintenance, data integration projects, etc.
Oversee and manage data quality management, data cleansing and data exchange workflows by designing and implementing rules and best practices
Deliver full-lifecycle implementations spanning requirements analysis, platform selection, technical architecture design, application design and development, testing, and deployment.
Basic Qualifications
At least 2 years of experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL and data warehouse solutions
Hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
Minimum of 2 years of hands-on experience in Azure and big data technologies such as PowerShell, C#, Java, Node.js, Python, SQL, ADLS/Blob, Spark/Spark SQL, Databricks, and streaming technologies such as Kafka, Event Hub, etc.
Well versed in DevSecOps and CI/CD deployments
Experience with cloud migration methodologies and processes, including tools such as Azure Data Factory, Data Migration Service, SSIS, Attunity (Qlik), Event Hub, Kafka, etc.
Minimum of 2 years of RDBMS or NoSQL (e.g. MongoDB, Neo4j) experience
Experience using big data file formats and compression techniques
Experience working with developer tools such as Git, Azure DevOps, Visual Studio Team Services, etc.
Bachelor's degree or higher in Computer Science or a related discipline.
Nice-to-Have Certifications
DP-200 Implementing an Azure Data Solution
DP-201 Designing an Azure Data Solution
AZ-400 Designing and Implementing Microsoft DevOps Solutions
Nice-to-Have Skills/Qualifications
DevOps on the Azure platform and Jenkins
Experience with Docker, Kubernetes and Helm
Experience developing and deploying ETL solutions on Azure
Event-driven and microservices architectures, and containers/Kubernetes in the cloud
Familiarity with the technology stack available in the industry for metadata management: Data Governance, Data Quality, MDM, Lineage, Data Catalog, etc.
Familiarity with the technology stack available in the industry for data management, data ingestion, capture, processing and curation: Kafka, Attunity, Cassandra, Spark, Hive, etc.
Multi-cloud experience a plus (Azure, GCP)
Professional Skill Requirements
Proven ability to build, manage and foster a team-oriented environment
Proven ability to work creatively and analytically in a problem-solving environment
Excellent communication (written and oral) and interpersonal skills
Excellent organizational, multi-tasking, and time-management skills
Proven ability to work independently
If you are interested in this position, please send us your CV by clicking Apply Now.