N Brown Group

Data Engineer

Job description

We are looking for a Data Engineer to help bring data to the heart of N Brown. You will be part of Group Data and will focus on exploring solutions to maintain, improve, and scale our use of data. Working with our Lead Data Engineer, you will make a valuable contribution to building a best-in-class modern Data Analytics Platform.

What will you do as a Data Engineer at N Brown?

  • Join a team of engineers to create, maintain, and extend our analytics platform.
  • Take ownership of the data processing technical domain, including implementing frameworks that will allow N Brown to scale its data processing. You will work with the Lead Data Engineer to ensure we adopt the technologies that deliver the most value for our business.
  • You will drive the adoption of strong CI/CD practices to reduce deployment risk.
  • Help develop your colleagues' capabilities in software development to better enable them to support the analytics platform. Champion test-driven, quality-first engineering practices.
  • Working with governance, you will drive solutions that are scalable and robust, ensuring we deliver quality data for our customers.
  • You will be working in an agile operating model with cross-functional teams.
  • Work with our stakeholders whilst delivering the new Analytics Platform, ensuring it is reliable, scalable, and secure, and taking a proactive approach to monitoring.

What skills and experience will you have?

  • A strong communicator, able to bring complex technical concepts to life for business stakeholders and equally convey business needs effectively to technical audiences.
  • Experience building, testing, and delivering automation components in an object-oriented manner, ideally using Python.
  • Experience delivering large-scale transformation projects with a focus on defining end-to-end data architecture.
  • An experienced engineer with a deep interest in how data architectures can improve experiences and drive better business decision making.
  • Automating delivery of Infrastructure as Code (IaC) via Terraform.
  • Deep understanding of CI/CD pipelines.
  • Experience working with the Google Cloud Platform is essential.
  • Curiosity to experiment and improve, whilst staying aligned on outcomes.
  • Understanding of Agile principles and practices, and of tools such as Jira and Confluence.

Software and Tech Experience

  • Experience with Google Cloud products.
  • Experience with open-source data-stack tools such as Airflow, Airbyte, dbt, Kafka, etc.
  • Awareness of data visualisation tools such as Power BI, Tableau, and/or Looker.
  • Knowledge of Teradata, Mainframe and/or Google Analytics is beneficial.
  • Appreciation of data governance, data management, analytics, data science, and visualisation workflows and their data needs.
  • Appreciation of the modern cloud data stack, headless BI, analytics engineering, data meshes, and lakehouses.