Skill(s) Requirement:

Must have:
- Hands-on experience with Google Cloud Platform (GCP) services such as BigQuery, Dataflow, and Cloud Functions
- Experience in handling sizable data warehouses
- Experience in implementing ETL / data streaming for data analytics

Nice to have:
- Infrastructure-as-code experience, such as CloudFormation or Terraform
- Experience with workflow engines such as Apache Airflow or Google Cloud Composer
- DevOps tooling, such as Ansible and Jenkins
- Programming languages (one or more of Python, Kotlin, Scala, Java…)
- SQL skills and data analysis experience
- Experience with microservices frameworks such as Spring Cloud, Quarkus, or Micronaut
- TDD and BDD (test-driven development and behavior-driven development)
- Previous experience in data science or machine learning related projects is a plus

Other Requirements:
- 5 or more years of hands-on experience in implementing data analytics related systems
- Able to perform required tasks with minimal supervision
- Good command of written and spoken English, with strong communication skills to work with partners globally
- Fluent in Cantonese and English
Role & Responsibilitie: Design and implement data pipeline in cloud environment. Understand data schema of available data sources Utilize technologies such as workflow engine to automate data transfer from sources to destination Design suitable data schema for the data warehouse to enable efficient query and retrieval of data for analytic purposes Automate & implement CI/CD delivery pipeline Implement and maintain cloud-based data pipeline related projects Automate collection and processing of data on the data pipeline
Work Location: Quarry Bay / Client Site
If interested in the above position, please send a detailed resume with your current and expected salary as well as your date of availability to email@example.com. The personal information collected is strictly for recruitment purposes only.