Intermediate Data Engineer

Location: Johannesburg, Gauteng

*** Mention DataYoshi when applying ***


  • Assist in designing and implementing scalable and robust processes for ingesting and transforming datasets.
  • Design, implement, and maintain data pipelines from a multitude of sources.
  • Ingest large, complex datasets that meet functional and non-functional requirements.
  • Enable the business to solve the problem of working with large volumes of data in diverse formats, and in doing so, enable innovative solutions.
  • Design and build bulk and delta data lift patterns for optimal extraction, transformation, and loading of data.
  • Support the organisation's cloud strategy, align to the data architecture and governance framework, and implement its data governance practices.
  • Engineer data into the appropriate formats for downstream customers, risk and product analytics, or enterprise applications.
  • Develop APIs for returning data to enterprise applications.
  • Assist in identifying, designing, and implementing robust process improvements that drive efficiency and automation for greater scalability, including evaluating new solutions and new ways of working and staying at the forefront of emerging technologies.
  • Work with various stakeholders across the organisation to understand data requirements and apply technical knowledge of data management to solve key business problems.
  • Provide support in the operational environment with all relevant support teams for data services.
  • Create and maintain functional requirements and system specifications in support of data architecture and detailed design specifications for current and future designs.
  • Support test and deployment of new services and features.


  • Matric, with a degree in Computer Science, Business Informatics, Mathematics, Statistics, Physics or Engineering.
  • 3+ years of data engineering experience.
  • 3+ years of experience with data warehouse technical architectures, ETL/ELT, and reporting/analytics tools. Experience with any of the following combinations will be beneficial: (1) SSIS and SSRS, (2) the SAS ETL framework, (3) the SAP ETL framework, (4) MongoDB ETL deployments, (5) Apache Spark and Apache Hive deployments.
  • AWS knowledge and skills: Glue, S3, DynamoDB, Lambda, IAM, CloudFormation.
  • DBA ability and knowledge across at least two platforms (for example: T-SQL, SAS, PSQL, IBM VSAM, DB2) will also be beneficial.
  • Some experience with the Python programming language.
  • Experience with designing and implementing cloud (AWS) solutions, including use of the available APIs.
  • Some experience with DevOps architecture, implementation, and operation would be advantageous.
  • Knowledge of engineering and operational excellence using standard methodologies, including best practices in software engineering, data management, data storage, data computing, and distributed systems to solve business problems with data.
  • Some experience in applying SAFe/Scrum/Kanban methodologies would be advantageous.
  • Knowledge and understanding of the business process management lifecycle, covering design, modelling, execution, monitoring, and optimisation, as well as business process re-engineering.
  • Good problem-solving skills: the ability to exercise judgement in solving technical, operational, and organisational challenges, to identify issues proactively, and to present solutions and options leading to resolution.

