Role:
Data curation, validation, and transformation (ETL/ELT), pipeline monitoring, working with different data sources, building batch and streaming pipelines on Google Cloud, and supporting Data/Business Analysts in creating and maintaining dashboards.
Essential requirements:
- Advanced Python programming for Data Engineering
- Strong SQL
- Ability to build and maintain batch and streaming pipelines on GCP using BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Functions, Airflow, and other services
- Agile ways of working (WoW)
- Experience working as part of a Product team
- Soft skills and communication
- GitHub
- Data Modelling
Please explain how you meet all the requirements when applying.
Utilization: 100%
Location: Malmö
Period: 26-02-2024 - 31-08-2024
Last day to apply: 14-02-2024
We present candidates regularly. This means that we sometimes remove assignments from our website before the final application deadline. If you are interested in an assignment, we recommend that you submit your application as soon as possible.