- Growing Cloud and Data Platform Team | Leading Banking/Financial Institution
About Our Client
We are responsible for designing and building our platforms, then working with our business users to engineer cloud and data services that meet current and future use cases. The domain comprises four core pillars: Data Engineering, Cloud Platform, Data Platform, and Analytics & Visualisation.
Job Description
- Build reusable data pipelines at scale, working with structured and unstructured data, engineering features for machine learning, and curating data to provide real-time, contextualised insights that power our customers' journeys
- Use industry-leading toolsets, and evaluate exciting new technologies, to design and build scalable real-time data applications
- Work across the full data lifecycle with a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server), building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data
- Help adopt engineering best practices such as Test-Driven Development, code reviews, and Continuous Integration/Continuous Delivery for data pipelines
- Mentor other engineers to deliver high-quality, data-led solutions for our Bank's customers
- Be a team player who builds relationships and works productively with other teams across a variety of domains
The Successful Applicant
- Bachelor's degree (or higher) in mathematics, statistics, computer science, engineering, data analytics or a related field
- Best-practice coding/scripting experience developed in a commercial/industry setting (Python, SQL, Java, Scala or Go)
- Working experience with operational data stores, data warehouses, big data technologies and data lakes (e.g. Teradata DW, BigQuery)
- Experience building data solutions with relational and non-relational databases such as SQL Server, Oracle or Teradata, including relational and dimensional data structures
- Experience using distributed frameworks (Spark, Flink, Beam, Hadoop)
- Good knowledge of containers (Docker, Kubernetes etc.) and experience with cloud platforms such as GCP, Azure or AWS
- Strong experience working with Kafka technologies
- Clear understanding of data structures, algorithms, software design, design patterns and core programming concepts
- Good understanding of cloud storage, networking and resource provisioning
What's on Offer
You'll enjoy flexible working opportunities, a strong sense of community and well-being, and a collective mission to promote the good of the people of the firm. We offer a competitive remuneration package and comprehensive fringe benefits, including medical and life insurance and various allowances, to the right candidate.
Contact: Royce Chan
Quote job ref: JN-092023-6181086