🚀 Founded in 2021, Echo's narrative is one of relentless growth, ambition, and promise. A cutting-edge startup at the forefront of Geospatial Data and Gen AI technologies, we have skyrocketed from 0 to 50+ employees in just two years!
Specialists in the location intelligence industry, we empower innovation by helping companies better understand the world around them.
In a world inundated with data, quality is crucial. We use data science, machine learning, and human feedback to build Europe's most comprehensive, high-quality datasets, enriched with differentiating attributes that set us apart from the competition.
We aren’t just about reliable data (read: unmatched quality), though.
Life at Echo is about individual growth, collective ambition, and a culture that values and celebrates diversity, with a people-first philosophy guiding everything we do.
You'll be at the heart of innovation, tackling projects that shape our future, with the best people possible!
Your Role at Echo
At Echo, you’ll join a dynamic team of over 20 data enthusiasts passionate about leveraging data to drive innovation.
As we continue to expand, we are looking for a motivated Data Engineer to help us scale our operations and build cutting-edge data solutions. You’ll work alongside talented data and ML engineers, developing robust pipelines, designing efficient data systems, and contributing to impactful projects that power the growth of our products and services.
As such, your main responsibilities will be:
- Develop new features in collaboration with other data and ML engineers
- Design and build data pipelines using Python and Apache Airflow
- Contribute to the design and development of core modules for the different data teams
- Work with GCP tools such as BigQuery, Dataflow, Pub/Sub, and Dataproc
- Develop and maintain infrastructure as code using Terraform or other similar tools
- Collaborate with other teams to ensure smooth integration of data pipelines and infrastructure
- Optimize data pipelines for scalability, reliability, and performance
- Expose the data through APIs, flat files, etc., for external use
- Design datasets for external geospatial consumers, optimized for speed, consistency, cost, and efficiency
- Investigate, test, and implement new tools, processes, and technologies on an ongoing basis
- Participate in code reviews, design discussions, and agile ceremonies
What we’re looking for:
- Fluency in English (French is a plus)
- 3+ years of experience as a Data Engineer or in a related role
- Strong programming skills in Python
- Working experience with Apache Airflow
- Working experience with the Google Cloud Platform ecosystem (BigQuery, Dataflow, Pub/Sub, Dataproc, etc.)
- Familiarity with Apache Spark or other big data frameworks
- Knowledge of DevOps tools such as GitHub CI/CD or Terraform is a plus
- Knowledge of monitoring tools such as DataDog or Grafana is a plus
- Basic understanding of data science (ML, NLP, image segmentation)
- Strong problem-solving and analytical skills
- Excellent communication and collaboration skills
Why us?
- 🌎 A great, diverse, international team based in Paris (30+ nationalities!)
- 🚀 Impact and ownership mindset: at Echo we encourage everyone to take ownership of their projects
- 🎇 A people-first company: without our people we have no business, which is why we’ve dedicated a People Manifesto to them
- 🌆 A contemporary office space in central Paris with a rooftop to admire the view
- 💻 Remote-friendly policy
- 🥘 Free coffee and snacks, on top of your lunch voucher card (Swile, €10 per working day)
- 🏥 Healthcare insurance
Feeling like you don’t check all the boxes? We encourage you to apply anyway! Best case scenario: we start an incredible collaboration; worst case, you’ve only invested a few minutes in your application. You won’t regret it!
We are committed to providing equal employment opportunities regardless of gender, sexual orientation, origin, disabilities, or any other traits that make you who you are.