Job description

JOB OVERVIEW
Career Area: Corporate & Investment Banking
Location: Rosebank, Gauteng, South Africa
Job Type: Full-Time Regular
Job ID: 61970
JOB PURPOSE
Provide the infrastructure, tools and frameworks used to deliver end-to-end solutions to business problems. Build scalable infrastructure to support the delivery of clear business insights from raw data sources, with a focus on collecting, managing, analysing and visualising data and developing analytical solutions. Responsible for expanding and optimising Standard Bank's data and data pipeline architecture, while optimising data flow and collection to support data initiatives.
KEY RESPONSIBILITIES

  • Create and maintain optimal data pipeline architecture, create databases optimised for performance, implement schema changes, and maintain data architecture standards across the required Standard Bank databases.
  • Assemble large, complex data sets that meet functional and non-functional business requirements, and align data architecture with business requirements.
  • Build analytics tools that utilise the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Design and develop scalable ETL packages from business source systems, and develop ETL routines to populate databases from source data and create aggregates.
  • Enable and run data migrations across different databases and servers, and define and implement data stores based on system and consumer requirements.
  • Proactively analyse and evaluate Standard Bank's databases to identify and recommend improvements and optimisations.
  • Analyse complex data elements and systems, data flows, dependencies and relationships to contribute to conceptual, physical and logical data models.
  • Act as a subject matter expert from a data perspective and provide input into all decisions relating to data engineering and its use.
QUALIFICATIONS

  • Degree in Information Technology
  • 5-7 years' experience with big data tools, relational SQL and NoSQL databases, data pipeline and workflow management tools, AWS cloud services, stream-processing systems, and object-oriented/object-functional scripting languages.
  • 5-7 years' experience building and optimising 'big data' pipelines, architectures and data sets, and performing root-cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytical skills for working with unstructured datasets.
  • 5-7 years' experience building processes that support data transformation, data structures, metadata, dependency and workload management, with a successful history of manipulating, processing and extracting value from large, disconnected datasets.
  • The Standard Bank Group has implemented a Vaccination Policy for all roles which require the incumbent to work from the Standard Bank premises on a full-time or intermittent basis. Full vaccination against COVID-19 is therefore an inherent requirement of this role.
