Deriv

Senior Data Engineer

Job description


We are the Business Intelligence team. We lead the organisation in cultivating a data-driven culture as our company moves towards the future. We collect meaningful data and analytics so that we can deeply understand our customers and build more valuable products and services. Our work is central to driving smart marketing decisions, optimising our business, and increasing profitability.

As a Senior Data Engineer at Deriv, you will collect, manage, and convert raw data into usable information that helps us evaluate and optimise the organisation’s performance. You will develop, test, and maintain architectures for data processing and build pipelines for Extract, Transform, and Load (ETL) operations. It will be your responsibility to ensure data accuracy and to improve the quality of both existing and new data.

Your challenges
  • Identify and access data from multiple sources, and ensure that existing and newly acquired data are ingested and retained for future use.
  • Understand the organisation’s needs, identify the data required to address them, convert that data into usable information, and design solutions.
  • Convert raw data into an easy-to-understand format for organisational uses such as analysis and reporting.
  • Develop and maintain the organisation’s databases, including their design, processing, analysis, and data-flow optimisation.
  • Manage the pipeline architecture, including resolving logged errors, testing, administering databases, and keeping the pipeline stable.

Requirements
  • Experience with data modelling techniques such as Kimball star schemas, Anchor modelling, and Data Vault
  • Competence in object-oriented or functional scripting languages such as Python
  • Solid experience with relational SQL and NoSQL databases, preferably PostgreSQL, including point-in-time recovery (PITR), pg_basebackup, WAL archiving, and replication
  • Proven skills in developing and maintaining ETL/ELT data pipelines with workflow management tools such as Airflow (see the sketch after this list)
  • Analytical skills, with the ability to translate data into sound business decisions
  • Experience in helping teams make informed business decisions with data
  • Strong communication and presentation skills
  • Fluency in spoken and written English
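To make the pipeline requirement concrete, here is a minimal sketch of the kind of ETL workflow the role involves, written as an Airflow DAG in Python. The DAG name, task logic, and data shapes are hypothetical placeholders for illustration, not Deriv’s actual pipelines.

    # Minimal extract -> transform -> load sketch (hypothetical names throughout).
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract(**context):
        # Stub: a real pipeline would pull raw records from a source system
        # (an API, a production replica, a log bucket, ...).
        return [{"user_id": 1, "amount": 9.99, "ts": "2022-09-28T12:00:00Z"}]

    def transform(ti, **context):
        # Normalise the raw records into the shape of the target table.
        rows = ti.xcom_pull(task_ids="extract")
        return [
            {"user_id": r["user_id"], "amount_usd": round(r["amount"], 2), "event_ts": r["ts"]}
            for r in rows
        ]

    def load(ti, **context):
        # Stub: a real pipeline would insert into the warehouse, e.g. a fact
        # table in a Kimball-style star schema.
        rows = ti.xcom_pull(task_ids="transform")
        print(f"Would insert {len(rows)} rows into fact_transactions")

    with DAG(
        dag_id="etl_transactions_sketch",  # hypothetical pipeline name
        start_date=datetime(2022, 9, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> transform_task >> load_task

A production pipeline would add retries, alerting, idempotent loads, and tests, but this extract → transform → load structure is the core pattern behind the requirement above.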

What’s good to have
  • Good background in cybersecurity and data protection
  • Proficiency in using data pipeline and workflow management tools such as Luigi
  • Knowledge of Java
  • Experience with Amazon Web Services (AWS) and Google Cloud Platform (GCP) services such as Google Compute Engine (GCE), BigQuery, Dataflow, and Cloud Functions

Benefits
  • Market-based salary
  • Annual performance bonus
  • Health benefits
  • Casual dress code
  • Travel and internet allowances
