Job description

Data Engineer (Ingestion)

Hybrid Working (6 days per month onsite in our Farringdon office)



Who are we?


Toyota Connected Europe wants to create a better world through connected mobility for all. We are a new company created to bring big data and a customer focus into all aspects of the mobility experience, so everyone’s experience is more personal, convenient, fun and safe. We create and enable technologies to delight and simplify the lives of everyone who uses our products, and empower them to think of and use our services in new ways.


You will be joining us at the beginning of Toyota Connected Europe’s journey of building our team and products. We are building teams to inspire, innovate and build technologies and products that are used by millions of people from all walks of life. We want every member of our team to live and breathe the start-up culture of Toyota Connected Europe and feel and act like an owner every day. This is an opportunity to have an immediate impact and voice: what you create today, you will see being used tomorrow.



About the role:


The Data Engineering team enables and manages the ingestion of low latency, high volume car telemetry data that powers our engineering and data science teams to build smart and insightful products. We are looking for an experienced Data Engineer to join the team who will have a key role in the design, development, implementation and documentation of large-scale, distributed software data applications, systems and services. You will help to engineer data pipelines which will enable our vehicles to communicate to the cloud. The features you build will power driving experiences across the world.



What you will do:


  • Work closely with the Data Engineering Lead, Senior Engineers and the Product team to deliver features to customers, and thrive as a creative thinker who can break away from conventional solutions.
  • Bring modern principles, techniques and technologies to the team, raising software quality, value and delivery.
  • Work within engineering best practices.
  • Implement and maintain complex data engineering solutions to acquire and prepare data. Create and maintain data pipelines that connect data within and between data stores, applications and organisations.
  • Design, code, verify, test, document, amend and refactor complex programs/scripts and integration software services.
  • Apply agreed standards and tools to achieve well-engineered outcomes.
  • Work side-by-side with other talented engineers in a team-oriented, agile software engineering environment.
  • Love writing code and constantly learn to hone your craft as an engineer.
  • Work closely with product owners to shape and deliver features to customers.



Our Tech Stack:


Please note that you do not need to be familiar with all of these; we recognise that in technology there is always a learning curve.

  • Cloud Providers
      • Primarily AWS, with some legacy services still running on Azure
  • Languages
      • Java 11+
      • Kotlin (legacy)
  • Messaging Stacks
      • Kafka
      • Pulsar (legacy; slowly migrating these back to Kafka)
  • Deployment Environment
      • Kubernetes (EKS)
  • Frameworks
      • Spring
      • Apache Flink
      • Kafka Streams
      • Apache Storm (mostly legacy)
  • Repositories and CI/CD
      • GitLab
      • GitLab CI/CD
  • Data Stores
      • MongoDB
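To give a flavour of the domain, here is a minimal plain-Java (11-compatible) sketch of the kind of stateless filter-and-aggregate step that frameworks such as Kafka Streams or Flink apply per event at much larger scale. The `Reading` type, field names and the speed threshold are illustrative assumptions, not our production data model.

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;

// Illustrative only: a batch version of a stateless filter + per-key aggregation,
// the same shape a streaming topology applies event-by-event.
public class TelemetrySketch {

    /** Hypothetical telemetry reading: vehicle id plus a speed sample in km/h. */
    static final class Reading {
        final String vehicleId;
        final double speedKmh;
        Reading(String vehicleId, double speedKmh) {
            this.vehicleId = vehicleId;
            this.speedKmh = speedKmh;
        }
    }

    /** Drop implausible samples, then average speed per vehicle. */
    static Map<String, Double> averageSpeedPerVehicle(List<Reading> readings) {
        return readings.stream()
                .filter(r -> r.speedKmh >= 0 && r.speedKmh <= 300) // discard sensor glitches
                .collect(Collectors.groupingBy(
                        r -> r.vehicleId,
                        TreeMap::new, // sorted keys for deterministic output
                        Collectors.averagingDouble(r -> r.speedKmh)));
    }

    public static void main(String[] args) {
        List<Reading> batch = List.of(
                new Reading("vin-1", 50.0),
                new Reading("vin-1", 70.0),
                new Reading("vin-2", -5.0), // glitch, filtered out
                new Reading("vin-2", 90.0));
        System.out.println(averageSpeedPerVehicle(batch));
        // prints {vin-1=60.0, vin-2=90.0}
    }
}
```

In the real pipelines, the equivalent logic runs continuously over unbounded Kafka topics rather than an in-memory list.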



About you:


  • Expertise in one of the major real-time data processing frameworks, such as Flink or Kafka Streams
  • Experience building event-driven and/or streaming data services; IoT domain experience would be great but is not essential
  • Strong programming experience in Java (11+) and a sense of ownership and pride in your code; make us believe you will excel. Experience with testing frameworks such as JUnit 5, Mockito or Spring Integration
  • Strong database skills and experience are required; we use NoSQL as well as relational databases, often with large data volumes
  • Strong grasp of data modelling concepts and principles, with extensive experience building data architectures that consolidate multiple complex sources
  • Experience of modern software and data engineering patterns, including those used in highly scalable, distributed, and resilient systems.
  • Excellent knowledge of and experience working with APIs (designing with Open API is desirable) and web services, CI/CD pipelines and automated testing (BDD, Performance, Security), Kubernetes and cloud native practices, containerized workloads with tools such as Docker
  • Experience developing microservices-based architectures, including distributed messaging patterns
  • Experience developing and delivering systems on at least one major public cloud provider; preferably AWS
  • Passion for agile practices, DevSecOps, incremental delivery, continuous improvement and ability to cultivate a strong team culture
  • We would like a self-starter: someone who reaches out to other teams as needed to find answers and fosters an agile environment
  • Willingness to get involved in problem resolution and in initiatives that smooth the operational maintenance of production services spread across geographical boundaries
  • We think the knowledge acquired earning a BSc in Computer Science, Engineering, Mathematics or a related field would be of excellent value in this role, but if you are smart and have the experience to back up your abilities, for us, talent trumps a degree every time



Equal Opportunities, Inclusion & Diversity

We’re committed to building a diverse and inclusive group of talent with a broad range of backgrounds, skills and capabilities, and will give full and fair consideration to all applicants. We know that flexibility is key to success and our people work flexibly in many ways, so if this is important to you, please let us know. If you have a disability or any other additional need that requires consideration, accommodation or adjustment to the role or recruitment process, please do let us know.
