Expressions of Interest - Data Engineer
  • Python
  • Spark
  • Databricks
  • SQL
  • Java
  • SAS
  • Machine Learning
  • Big Data
  • Excel
  • ETL
  • Hadoop
  • Cassandra
  • Scala
  • Kafka
  • NoSQL
  • PostgreSQL
  • Azure
KPMG
Canberra ACT
127 days ago
Job no: 509605
Work type: Permanent Full Time
Location: Canberra
  • Do you embrace digital disruption?
  • Do you aspire to create the best customer experiences across Mobile App & Web, and User Experience (UX) & User Interface (UI)?
  • Do you believe in creating powerful actionable insights from Data and Analytics?
  • Do you want to work in a diverse and flexible working environment?

KPMG is one of the most trusted and respected global professional services firms. Through depth of expertise, clarity of insight and strength of purpose we help our clients solve complex challenges, steer change, strengthen, transition and grow. In Canberra, we are a team-based practice, and this extends to our clients, with whom we work and collaborate in solving complex problems. Together, we design, innovate and implement, providing enduring advice that supports our clients and the services they deliver. Our clients vary in size and come from a diverse range of sectors – all sharing a common goal: to embrace change and deliver services that make Australia a better place. We are looking for talented individuals who would like to join us on the journey.

Why join our Digital Delta Talent Community?

New digital technologies and disruptive business models mean many organisations are struggling to keep pace with the transformative changes required to drive growth and meet customer demands. KPMG Digital Delta provides end-to-end digital innovation and transformation services to help overcome this challenge.

By designing and implementing new fit-for-purpose operating models, KPMG Digital Delta helps organisations to reframe their business models, improve operational productivity, create the best customer experiences, and enhance employee collaboration. We bring together best practice knowledge and technology, along with deep expertise across all industries.

More specifically, we re-imagine and re-invent organisations to become world class digital enterprises using advanced technologies, data and human insights. We help organisations to embrace Digital Strategy, Artificial Intelligence (AI) & Cognitive, the Internet of Things (IoT), Data, Analytics & Modelling, Mobile App & Web, and User Experience (UX) & User Interface (UI) and more.

We work with clients to:

  • Formulate strategies that re-imagine organisations
  • Harness innovation from the 4th industrial revolution
  • Action insights from trusted data to make clear decisions quickly and consistently
  • Build adaptive organisations
  • Thrive as a connected enterprise – front, middle and back office

Your new role

The Data Engineer is the designer, builder and manager of information and data management pipelines, preparing data for analytical or operational use. You have an aptitude for translating business problems into data & infrastructure/resource requirements and solutions. You will design, construct, test and maintain data pipelines that pull together information from different source systems; integrate, consolidate, cleanse and monitor the data; and structure it for use in individual analytics applications. You will actively ensure the stability and scalability of our clients’ systems and data platforms. You will strive to bring the best of DevOps practices to the world of data by embracing the emerging practice of DataOps. You will work proactively to:

  • Drive a technical roadmap for the team, covering non-functional requirements such as scalability, reliability and observability
  • Assess new and existing data sources for their applicability to the business issue, and translate the outcomes of the analytical solutions we design into business impacts and benefits.
  • Design, construct, install, test and maintain highly scalable, resilient, recoverable data management systems
  • Recommend ways to improve data reliability, efficiency and quality in our data pipelines by applying DataOps principles. Implement monitoring systems to proactively detect unexpected variation in our data pipelines.
  • Ensure delivered systems meet business requirements and industry practices for automating build, deployment and change management using DevOps and CI/CD patterns
  • Understand, explain and evangelise buzzwords such as serverless, cloud native and PaaS, and how they impact the design of data pipelines
  • Be comfortable with code- or tool-based data pipelines and understand the pros and cons of each
  • Work closely with Digital Delta Data Scientists to extract and manipulate data from a variety of sources, then cleanse, standardise, scale, bin, categorise, tokenise, stem and transform it into a state suitable for further analysis.
  • Work with our Data Scientists to design, develop and implement their algorithms and models.
  • Design, develop and implement the automated approach for productionising model scoring and the closed loop feedback paths required to support test and learn.
  • Select and configure analytics toolsets considering the clients’ business issue and analytic maturity.
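The cleansing and monitoring responsibilities above can be sketched in a few lines of Python. This is a hypothetical, minimal illustration of a DataOps-style quality gate (the record fields, helper names and z-score threshold are invented for the example), not a description of any KPMG or client system:

```python
import statistics

def cleanse(rows):
    # Drop records with missing amounts and standardise customer casing.
    return [
        {**r, "customer": r["customer"].strip().lower()}
        for r in rows
        if r.get("amount") is not None
    ]

def check_variation(rows, history_means, z_threshold=3.0):
    # Simple statistical data-quality gate: alert when this batch's mean
    # amount drifts more than z_threshold standard deviations away from
    # the means of previous batches.
    batch_mean = statistics.mean(r["amount"] for r in rows)
    mu = statistics.mean(history_means)
    sigma = statistics.stdev(history_means)
    if sigma > 0 and abs(batch_mean - mu) / sigma > z_threshold:
        raise ValueError(f"unexpected variation: batch mean {batch_mean:.2f}")
    return batch_mean

raw = [
    {"customer": " Alice ", "amount": 120.0},
    {"customer": "BOB", "amount": 80.0},
    {"customer": "carol", "amount": None},  # dropped by cleanse
]
clean = cleanse(raw)
mean = check_variation(clean, history_means=[95.0, 102.0, 98.0, 101.0])
```

In a production pipeline the historical batch statistics would come from a metrics store, and a breach would feed a monitoring and alerting system rather than raise an exception.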

In addition to your focus on client engagements, you will contribute to the definition and enhancement of data engineering and DataOps disciplines within the practice.

How are you Extraordinary?

  • A proven ability to undertake the responsibilities and requirements of the role, as listed above.
  • Excellent interpersonal, oral and written communication skills, with a knack for distilling complex and/or technical information for novice audiences.
  • A proven ability to develop and manage enduring client relationships, engendering a sense of trust and respect.
  • Demonstrable industry knowledge; understanding the way your primary industry functions and how data can be collected, analysed and utilised; maintaining flexibility in the face of cloud and data industry developments. Experience in financial services, telecommunications and retail is not mandatory but highly regarded.
  • A disciplined approach to problem solving and an ability to critically assess a range of information to differentiate true business needs as opposed to user requests.
  • Experience with a range of technical skills that could include:
  • Knowledge of architecting and engineering cloud-based data solutions with products such as AWS Redshift/RDS, S3, EC2, Lambda, EMR, Glue, DynamoDB, Athena and Kinesis (or equivalents in Azure or Google Cloud Platform), as well as Databricks and Snowflake, with a particular focus on serverless and cloud native solutions
  • Big Data technologies such as Hadoop, Spark Streaming, Flink, Hudi, Storm, NiFi, HBase, Hive, Zeppelin, Kafka, Ranger, Ambari.
  • Programming languages such as Java, Node, Go, Python, Scala, SAS, R.
  • ETL tool experience and/or Code based data pipeline experience
  • Experience with DevOps principles and tools, including:
  • Agile enterprise development environments, CI/CD implementation, continuous testing, cloud resource management (CloudFormation, Terraform, Azure ARM templates, etc.), automation of environment deployment and automated shakeout testing.
  • Continuous Integration/Delivery tools such as Jenkins, AWS Code*, Azure DevOps, Bamboo, Cloud Build, Spinnaker, SonarQube, uDeploy or similar
  • Deployment automation tools such as OpenShift, Kubernetes and Docker.
  • Version control for data, low-level hardware and software configurations, and the code and configuration specific to each tool in the chain.
  • A proven ability to:
  • Build resilient, tested, data pipelines with statistical data quality monitoring embedded (DataOps)
  • Extract knowledge, or insight, from structured and unstructured data.
  • Work with an existing lifecycle management framework to collect metadata, follow coding standards, use version control, complete documentation and write and execute unit tests.
  • Determine the appropriate approach, including data collection methods, sampling methods, sample sizes and data processing pipelines, to formulate, execute and analyse a sound and reproducible experiment, and recognise and construct a closed-loop feedback system.
  • Learn patterns and extract answers from data using algorithms that can build a model based on input data without being explicitly programmed to do so.
  • Apply techniques of statistical inference to test hypotheses and derive estimates of population statistics from sample data.
  • Communicate discovered information to consumers clearly and appropriately, using visual variables such as shape, colour, hue and orientation.
  • Experience with SQL-based technologies (e.g. PostgreSQL and MySQL) and NoSQL technologies (e.g. Cassandra and MongoDB)
  • Data warehousing solutions and architectures
  • Data modelling tools (e.g. ERWin, Enterprise Architect and Visio)
  • High-level understanding of statistical analysis and modelling, predictive analytics, text analytics and other machine learning applications
  • A sound understanding of digital and cognitive technologies and analytics, information management and business process based solutions.
  • An eagerness to solve complex problems in environments that are often ambiguous, technologically challenged and require creative and lateral thinking.
  • An ability to work within a multidisciplinary team to seek and provide requirements to team members responsible for different pipeline areas.
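As a toy illustration of the statistical-inference skill listed above, the following standard-library Python estimates a population mean from sample data with a 95% confidence interval using the normal approximation. The sample values are invented for the example; a real engagement would use a proper statistical stack:

```python
import math
import statistics

def mean_confidence_interval(sample, z=1.96):
    # 95% confidence interval for the population mean, using the normal
    # approximation (z = 1.96); reasonable for sufficiently large samples.
    n = len(sample)
    mean = statistics.mean(sample)
    sem = statistics.stdev(sample) / math.sqrt(n)  # standard error of the mean
    return mean - z * sem, mean + z * sem

# Hypothetical measurements, e.g. daily pipeline run times in minutes.
sample = [9.8, 10.2, 10.0, 9.9, 10.1, 10.0, 9.7, 10.3]
low, high = mean_confidence_interval(sample)
```

For small samples a t-distribution critical value would be the more defensible choice than a fixed z of 1.96.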

Please note you should also be an Australian Citizen or have the ability to obtain a government security clearance.

The KPMG Difference

Our people are focused on creating a diverse and dynamic environment that embraces and values differences. We value the variety of unique experiences, qualities and characteristics our people possess and we share and learn from each other.

We are proud to be consistently recognised as an employer of choice for women, and for our achievements in LGBT+ workplace inclusion.

Our commitment to ‘Flexibility’ allows our people to manage the changing demands of work, personal or family life. Explore the links below to hear our people share their experience @ KPMG:

  • Flexibility empowers wellbeing
  • Flexibility enables contribution to the community
  • Flexibility inspires technology & innovation
  • Flexibility supports family


Make KPMG the clear choice for your career and be Extraordinary!


Advertised: 17 Jun 2020 AUS Eastern Standard Time
