Propose and execute, from within the Data & Analytics team, the planning, development, enrichment and adoption of the Data & Analytics cloud platform, and diversify access to new data sources.
Additionally, the role will deliver a range of Data & Analytics projects that may occasionally include dashboarding, automation or machine learning.
Regarding the deployment of the Data & Analytics Cloud platform:
Develop the cloud migration strategy (on-premises to cloud) together with the Technology team, and lead and coordinate its implementation from the business side (Data & Analytics team)
Define, develop, maintain, document and enrich the pipelines and data repositories on the cloud platform (CI/CD preferred)
Identify and integrate on an ongoing basis new data sources into the centralised data platform
Prepare the architecture / blueprints for modelling work and live interaction with the company’s apps (insurance, wellness and healthcare apps), and work with the data scientists to put the models into production on the cloud platform.
Review the cost-efficiency of the platform (compute and storage) and propose optimisations.
Train the team in the usage of the cloud platform and work towards a change in mindset within the team for the adoption of the new tools and platform
Regarding the development of the Data & Analytics team:
Lead end-to-end the delivery of Data & Analytics projects that may involve a variety of expertise (data architecture, data engineering, data science, insurance business acumen, healthcare business acumen, etc.)
Participate in the training and coaching of the team, as well as building out their career path
Proactively manage stakeholders and oversee the team's delivery of projects
Academic / Professional qualifications
Degree in a relevant field preferred (Information Management, Computer Science, Business Intelligence, Statistics, Data Science, Actuarial Science, etc.)
Ideal experience and delivered performance
At least 4 years’ experience in insurance, healthcare and/or consulting, with strong hands-on experience on cloud platforms (Azure preferred).
Hands-on experience developing enterprise-grade ETL / ELT data pipelines, batch or streaming, on a cloud platform (Azure Data Factory required; Kafka / Event Hub a plus; others appreciated)
Strong knowledge of a cloud ecosystem (Azure preferred; AWS, GCP appreciated) and experience building and deploying solutions to the cloud
Deep understanding of data manipulation/wrangling techniques
Experience with application containerisation (Docker, Kubernetes, etc.)
Strong programming skills in Python, SQL, Java and Scala
Experience working with CI/CD technologies (Git, Jenkins, Spinnaker, GCP Cloud Build, Ansible, etc.)
Advanced certification in a Cloud solution (Azure, AWS, GCP) would be a plus
Empathy, integrity and a service mindset
Project and stakeholder management skills (consulting background appreciated)
Organisation, self-prioritisation, autonomy and proactivity
Business fluency in English; other local languages would be a plus
Demonstrable knowledge of NoSQL databases (MongoDB, DynamoDB, Neo4j, Elasticsearch), the Snowflake data warehouse/platform, and streaming technologies and processing engines (Kinesis, Kafka, Pub/Sub, Spark Streaming)
Bupa offers 5 days’ work per week and comprehensive remuneration packages including base salary, study assistance plan, company pension plan, life and medical benefit, dental benefit, annual leave, examination leave, etc.
Bupa is an equal opportunity employer and welcomes applications from qualified candidates. Information provided will be treated in strict confidence and only be used for consideration of application with Bupa.
Personal data collected will be used for recruitment purposes only. Only candidates selected for interviews will be contacted. Bupa will be in touch about any opportunities that match your profile. All personal data of unsuccessful applications will be destroyed 24 months from the date of receiving the application.