ANZ

Data Engineer

Job description

About Us

At ANZ, we're exploring new ways to harness technology and data as we work towards a common goal: improving the financial wellbeing and sustainability of our millions of customers.

About The Role

This role sits in the Wholesale Credit and Corporate Finance (WCCF) Tech Area of the bank’s Institutional Division. The WCCF Tech Area builds world-class technology for wholesale credit, capital management, corporate finance, ESG (Environmental, Social, Governance), CRM, as well as data and insights.

This role is suitable for a junior to mid-level professional with existing experience in data engineering. The position offers an opportunity to expand your skills and deepen your expertise in the field.

You will work on large-scale data management concerns in a dynamic and complex environment. In this role, you will provide your expertise in designing, implementing, and operating a robust, scalable, and cost-effective shared-services data platform. You will bring a working knowledge of modern data architecture paradigms such as Data Mesh.

The role will also support innovation by monitoring emerging technologies/methods and championing those that will add value. This role supports a divisional data mesh and all data and analytics use cases for Institutional.

The role reports to the Data Engineer Lead for IDAP.

What will your day look like?

  • Collaborating with the Lead Data Engineer and/or Data Architect to understand the demands on the data platform, elaborating on these to produce solution options.
  • Working with domain and data platform teams to analyse data sets, and to design and build data pipelines.
  • Analysing and organising raw data from domains, including combining and aggregating raw data from various internal and external sources.
  • Building models to transform data to suit the data querying needs of consumers.
  • Collaborating with data and cloud engineers to develop solution options, and maintaining key design artefacts, e.g., Solution Design overview.
  • Building relationships with key business and technology stakeholders, providing technical input and support in their quest to contribute towards an Institutional and enterprise data mesh.
  • Supporting the evaluation of new technologies to support capability gaps and performance improvements.


What will you bring?

  • Good understanding of, and experience working with, big data and associated concerns (data governance, scalability, security, observability, reliability, etc.).
  • Experience in designing and building data pipelines using AWS technologies like Lake Formation, MWAA, EMR, S3, and Athena.
  • Experience with data warehousing technologies (AWS Redshift) and data storage formats (Parquet, Avro, JSON) and table formats (governed tables, Iceberg).
  • Experience with data transformation and testing tools and frameworks (dbt, Great Expectations, Soda).
  • 4+ years of industry experience in data and/or software engineering, data science, business intelligence, or related fields.
  • Proficiency in a programming language such as Python, Rust, Go, or Java, with strong SQL knowledge.
  • Relevant AWS Cloud Certifications.
  • Preferred: data modelling experience to support business requirements for data usage.
  • Comfortable with IaC and CI/CD tools (Terraform, Codefresh).
  • Experience developing data visualisations using tools such as Tableau, Qlik, or similar.
  • Proven skill in communicating and collaborating with analysts and engineers within the team.

