Job description

Key responsibilities and accountabilities

  • Design, build, and measure complex ELT jobs that process disparate data sources into a high-integrity, high-quality, clean data asset.
  • Work on a range of projects, including batch pipelines, data modeling, and data mart solutions, as part of collaborative project teams implementing robust data collection and processing pipelines to meet specific business needs.

Goals

  • Execute and provide feedback on data modeling policies, procedures, processes, and standards.
  • Assist with capturing and documenting system flows and other pertinent technical information about data, database design, and systems.
  • Develop data quality standards and tools for ensuring accuracy.
  • Work across departments to understand new data patterns.
  • Translate high-level business requirements into technical specs.
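
Several of these goals concern data quality standards and tools for ensuring accuracy. As a hedged illustration only (the field names and rules below are hypothetical, not taken from the posting, and plain Python is used for brevity rather than Spark), a row-level quality check might look like:

```python
# Minimal data-quality check sketch; fields and rules are invented examples.
from datetime import datetime

REQUIRED_FIELDS = ("id", "email", "created_at")

def validate_row(row: dict) -> list[str]:
    """Return a list of quality issues found in a single record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not row.get(field):
            issues.append(f"missing {field}")
    # Shallow format checks on fields that are present.
    if row.get("email") and "@" not in row["email"]:
        issues.append("malformed email")
    if row.get("created_at"):
        try:
            datetime.fromisoformat(row["created_at"])
        except ValueError:
            issues.append("bad created_at timestamp")
    return issues

rows = [
    {"id": 1, "email": "a@example.com", "created_at": "2024-02-27T10:00:00"},
    {"id": 2, "email": "not-an-email", "created_at": "2024-02-27"},
    {"id": 3, "email": "", "created_at": "oops"},
]
report = {r["id"]: validate_row(r) for r in rows}
```

In a production pipeline the same per-record rules would typically run as Spark column expressions over the full dataset, with failing rows routed to a quarantine table rather than silently dropped.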

Knowledge, Skills, and Abilities

Education & Experience

Required:
  • Bachelor’s degree in computer science or engineering.
  • 5+ years of experience with data analytics, data modeling, and database design.
  • 3+ years of coding and scripting (Python, Java, Scala) and design experience.
  • 3+ years of experience with Spark framework.
  • Experience with ELT methodologies and tools.
  • Experience with Vertica or Teradata.
  • Expertise in tuning and troubleshooting SQL.
  • Strong attention to data integrity, with solid analytical and multitasking skills.
  • Excellent communication, problem-solving, and organizational skills.
  • Able to work independently.

Additional / Preferred Skills

  • Familiarity with agile project delivery processes.
  • Knowledge of SQL and its use in data access and analysis.
  • Experience with Airflow.
  • Ability to manage diverse projects impacting multiple roles and processes.
  • Able to troubleshoot problem areas and identify data gaps and issues.
  • Ability to adapt to a fast-changing environment.
  • Experience with Python.
  • Basic knowledge of database technologies (Vertica, Redshift, etc.).
  • Experience designing and implementing automated ETL processes.
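
The last item asks for experience designing and implementing automated ETL processes. As a hypothetical sketch under stated assumptions (the source data, schema, and table name are invented here, and the Python standard library with an in-memory SQLite target stands in for a real warehouse such as Vertica or Redshift), the extract-transform-load steps might look like:

```python
# Minimal automated ETL sketch; data and schema are illustrative only.
import csv
import io
import sqlite3

raw_csv = "id,amount\n1, 10.5 \n2,3.25\n"

# Extract: parse rows from the raw source.
rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Transform: normalise types and strip stray whitespace.
clean = [(int(r["id"]), float(r["amount"].strip())) for r in rows]

# Load: idempotent upsert into the target table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact_sales (id INTEGER PRIMARY KEY, amount REAL)")
con.executemany("INSERT OR REPLACE INTO fact_sales VALUES (?, ?)", clean)
total = con.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
```

In an automated setting, a scheduler such as Airflow (listed above) would run each of these steps as a task on a cadence, with the idempotent load making safe re-runs possible.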
