Diconium is a specialist in the fields of data and artificial intelligence. Whether in search, social and content, personalization and analytics, data engineering, or data science, our expertise helps our customers collect the right data at the right time, forecast services and offers, and make data-driven decisions. Become part of the diconium data family and support us on our Data & AI journey!
What you can expect
- You will design and develop scalable data pipelines to efficiently process and analyze large volumes of data.
- You will evaluate and implement new technologies and tools in the area of data engineering to continuously improve the efficiency and performance of our data processing.
- Part of your job is to support our customers in obtaining the necessary data easily and transparently so they can develop smart data products across various domains (mobility, automotive, industrial, consumer, financial, and non-profit).
- You will work closely with Data Scientists and Analysts to understand data requirements, identify potential data sources, evaluate their utility, and develop appropriate solutions.
- Additionally, you will share your expertise and experience with other team members, contribute to knowledge sharing and the development of the team, and take on a technical leadership role within the team.
- Finally, you will develop and maintain documentation on data engineering processes, best practices, and technologies.
This is what you bring along
Must Haves
- At least 5 years of relevant work experience in Data Engineering
- Sound knowledge of at least one programming language (such as Python, Java or Scala) and experience with Big Data technologies (such as Hadoop, Spark or Kafka)
- In-depth knowledge of batch and real-time data processing frameworks such as Apache Spark, Apache Flink or similar
- Hands-on experience with cloud-based data platforms (Azure, AWS, or GCP), an understanding of data modeling and data architecture, and experience with Databricks
- Experience in providing technical leadership and advising clients on technical concepts
Nice To Have
- Knowledge of developing machine learning pipelines and models
- Experience in project management using agile methodologies such as SAFe, Scrum or Kanban
- Familiarity with DataOps practices and DevOps concepts
- Experience implementing data quality and data governance frameworks