Efficio

Data Engineer - Lisbon (Internal Team)

Job description

At Efficio, we're not just a consulting firm; we're a team of dedicated experts committed to transforming procurement and supply chain functions. Our mission is to drive sustainable, measurable value for our clients through innovative solutions and deep industry insights. Join us and be part of a dynamic environment where your skills and ideas make a real impact.

Efficio is the world’s largest specialist procurement and supply chain consultancy, with offices across Europe, North America, and the Middle East. We’re a diverse team of over 1,000 individuals, representing more than 60 different nationalities and speaking 40+ languages – and we’re continuing to grow rapidly!

We believe we can make the world a better place by helping businesses buy better. Buying better isn't just about saving money; it also means helping businesses improve the world around them by buying products and services from green, ethical, sustainable, diverse and inclusive suppliers. At Efficio, we do that by being the world's largest procurement and supply chain consultancy firm, and by combining our procurement expertise with our powerful technology and data to help our customers make better purchasing decisions. Data is a fundamental ingredient in that decision-making process, and we're now investing heavily in building our in-house data skills, expertise and infrastructure to rival the best in the world. So, if you'd like to make the world a better place by helping businesses buy better, apply here. Please note this role is a non-consulting position.

In this role you will design and implement data pipelines on AWS, and maintain and develop them further under the guidance of business owners. You will work alongside Data Scientists, Data Engineers and others as part of our Product and Digital team. This role offers the chance to join Efficio at a pivotal point in our data journey, and to shape and influence our data landscape.

This role gives you the opportunity to:

  • Collect business requirements and translate them into robust and scalable solutions
  • Demonstrate an understanding of DataOps and develop use-case-agnostic ETL pipelines
  • Take a proactive approach to work and become a go-to expert for data cloud migration
  • Work with state-of-the-art tooling on AWS including Redshift, S3 and ECS
  • Collaborate with our infrastructure engineers to provision the next generation of tools
  • Develop a sense of ownership to deliver high-quality outcomes for the business
  • Make complex data more accessible, understandable, and usable by the organisation

We'd love to hear about any additional skills or experiences you bring to the table. We're particularly interested in:

  • An advanced degree in software engineering, computer science, or a related technical field, or equivalent relevant work experience
  • Proven experience in building robust and reliable data pipelines, dealing with a wide variety of data sources
  • Strong experience in Python, and knowledge of data wrangling methodologies and common packages
  • Proficiency with AWS tools (e.g., S3, Athena, Glue, Lambda, Kinesis) as well as AWS CI/CD, and a readiness to learn and work with them
  • Experience with Docker, Airflow, and SQL databases (ideally Postgres)
  • Fundamental understanding of the programming landscape (e.g., APIs, SQL, database management)
  • Great communication skills, comfortable working collaboratively in cross-functional teams
  • Ability to manage your own workload with a customer focus and a solution-oriented, can-do attitude
