At Engel & Völkers we see data as an essential strategic competency to deliver best-in-class experiences for thousands of our customers, real estate agents and internal stakeholders. Do you want to play an instrumental role as a Senior Data Engineer (m/f/d) and help us excel in that competency?
As a Senior Data Engineer (m/f/d), you will be part of the Data Warehouse team, which is driving the digital transformation at Engel & Völkers by establishing a single source of truth for operational, financial and marketing data, integrating, transforming and managing diverse datasets in our BigQuery data warehouse. The team promotes self-service analytics by enabling our stakeholders around the globe to access highly curated datasets via Tableau, and also provides advanced analytics solutions through ongoing consultancy and training.
WHAT’S IN IT FOR YOU?
- The best of both worlds: a modern prop-tech company backed by Engel & Völkers, a real estate company with more than 40 years of experience in more than 15 countries
- Agile working culture with flat hierarchies and cross-functional teams
- The chance to contribute directly to the company's success with your skills and experience and to add your own personal touch
- New challenges that allow you to grow professionally and personally
- Technology-driven environment based on a modern tech stack: Golang, Kotlin, Node.js, React, Next.js, Nx, dbt, BigQuery, GCP, Docker, Kubernetes etc.
- Flexible working hours and 30 days of vacation
- 60/40 split between remote work and our beautiful HQ near the Hamburg harbor
A comprehensive benefits package:
- Individual training budget or independent learning via our Udemy account
- Subsidies for your HVV ticket, gym membership and pension plan
- €25/month company credit card
- And, you have probably been waiting for it: we also have a fruit basket
WHAT YOU'LL DO
- Contribute to the company’s data strategy by designing, deploying & operating data infrastructure
- Build and run the data pipelines and data orchestration infrastructure, for both real-time and batch processing
- Continuously improve and monitor data collection while ensuring that service level agreements are met
- Develop and maintain documentation of governed data processes, along with company-wide data dictionaries
WHO YOU ARE
- Solid background in data analysis with a strong interest in the product side and analytics
- Proficiency in Python and excellent SQL skills, ideally with dbt experience
- Keen on modern DevOps and DataOps principles, with experience in GitLab CI and Terraform or similar tools
- Proficiency in cloud infrastructure and toolchain, specifically with GCP
- Experience with data streaming technologies such as Google Pub/Sub
- Familiarity with ETL tools such as Airflow, Airbyte or Talend
NICE TO HAVES
- Proficiency in Java or Scala is a plus
Sound like a match? We would love to talk!
Submit your application today and let’s start the conversation.
Best regards,
Jenny