Join us as a Data Engineer
- This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences
- You’ll be simplifying the bank by developing innovative data-driven solutions, using insight to be commercially successful, and keeping our customers’ and the bank’s data safe and secure
- Participating actively in the data engineering community, you’ll deliver opportunities to support the bank’s strategic direction while building your network across the bank
What you'll do
As a Data Engineer, you’ll play a key role in driving value for our customers by building data solutions. You’ll build, maintain, test and optimise a scalable data architecture, carry out data extractions, transform data so it’s usable by data analysts and scientists, and load data into data platforms.
You’ll also be:
- Developing a comprehensive knowledge of the bank’s data structures and metrics, advocating change where needed for product development
- Building automated data engineering pipelines by removing manual stages
- Embedding DevOps practices in the delivery of data engineering, proactively performing root cause analysis and resolving issues
- Collaborating closely with core technology and architecture teams in the bank to build data knowledge and data solutions
- Developing a clear understanding of data platform cost levers to build cost-effective and strategic solutions
- Sourcing new data, using the most appropriate tooling and integrating it into the overall solution to deliver for our customers
- Working in an agile way within multi-disciplinary data and analytics teams, to achieve agreed project and scrum outcomes
The skills you'll need
To be successful in this role, you’ll need a good understanding of data usage and dependencies with wider teams and the end customer, as well as experience of extracting value and features from large-scale data.
You’ll also demonstrate:
- Experience of ETL technical design, including data quality testing, cleansing and monitoring, QA and documentation, data warehousing and data modelling capabilities
- Experience in Amazon Web Services, Snowflake, SQL, Informatica, Apache Airflow and Agile methodologies
- Experience in Python, Kafka and StreamSets
- Good knowledge of modern code development practices
- Strong communication skills with the ability to proactively engage with a wide range of stakeholders
- Good critical thinking and proven problem-solving capabilities