Role:
You will be involved in one of the largest data transformation journeys. As a data engineer, you will work on building data projects within the Data Mesh framework, based on the defined target vision and requirements.
We value a diverse range of technical backgrounds, and we believe you will thrive here if you are passionate about data. In this role, you will implement data-intensive solutions for a data-driven organization.
You will join the Data Engineering competence area within the AI (Artificial Intelligence), Analytics & Data domain as an individual contributor in one of the data product teams. The area supports all our brands globally in creating, structuring, and protecting data, and in ensuring that data is accessible, understandable, and of high quality.
Essential requirements:
- Experience with data query languages (SQL or similar), BigQuery, and different data formats (Parquet, Avro)
- Take end-to-end responsibility for designing, developing, and maintaining the large-scale data infrastructure required for machine learning projects
- Apply a DevOps mindset and principles to manage CI/CD pipelines, Terraform, and cloud infrastructure; in our context, this is GCP (Google Cloud Platform)
- Leverage your understanding of software architecture and software design patterns to write scalable, maintainable, well-designed, and future-proof code
- Work in a cross-functional agile team of highly skilled engineers, data scientists, and business stakeholders to build the AI ecosystem
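As a rough illustration of the SQL-centric query work listed above, here is a minimal, hypothetical sketch. It uses Python's built-in sqlite3 in place of BigQuery (which requires GCP credentials); the table and column names are invented for the example, and the same aggregation SQL would run largely unchanged in BigQuery.

```python
import sqlite3

# Hypothetical stand-in for a BigQuery dataset: an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (brand TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("A", 10.0), ("A", 5.0), ("B", 7.5)],
)

# Revenue per brand, highest first.
rows = conn.execute(
    "SELECT brand, SUM(amount) AS revenue "
    "FROM orders GROUP BY brand ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('A', 15.0), ('B', 7.5)]
conn.close()
```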
Please explain how you meet all the requirements when applying.
Utilization: 100%
Location: Stockholm
Period: 01-10-2024 to 28-02-2025
Last day to apply: 25-09-2024
We present candidates regularly. This means that we sometimes remove assignments from our website before the final application deadline. If you are interested in an assignment, we recommend that you submit your application as soon as possible.