Myticas Consulting

BHJOB15656_16809 - ETL Data Engineer

Job description

ETL Data Engineer

Our data engineering team is growing! We are looking for someone with a strong foundation in traditional approaches to data engineering as well as more recent experience in a modern, cloud-based, API-driven data environment. This is an opportunity to join the Data Integration team, where your primary focus will be designing and building durable data pipelines.

You will build data integration processes that enable our growing Business Intelligence and Data Science teams. You will work on-premises while also helping migrate to the cloud and expand both structured and unstructured data capabilities, with a focus on modernizing our data pipelines and integration strategy to increase data accuracy and availability, lower support costs, and increase speed to shelf.

KEY RESPONSIBILITIES

  • Build, monitor, and maintain performant, scalable data pipelines
  • Ensure new and existing code meets company standards for readability, testability, automation, documentation, and performance
  • Automate when possible and where it makes sense
  • Understand business processes and software systems, and make technology decisions that best address the business challenges we are trying to solve
  • Pair with other team members to increase efficiency and collaboration
  • Continually bring features from conception to production, leveraging a DevOps mindset
  • Integrate with both existing legacy systems and new modern systems
  • Implement process improvements and create technical documentation where needed
  • Provide after-hours on-call support on a rotational basis

KNOWLEDGE, SKILLS, AND EXPERIENCE:

Required

  • Mid- to senior-level proficiency with Python and SQL
  • Experience designing, modeling, and implementing snowflake, star, and relational data models
  • Experience using ETL/ELT tools (e.g., SSIS, Informatica) and associated best practices
  • Ability to independently troubleshoot issues, think critically, and clearly communicate findings/recommendations
  • Automation and scripting skills
  • Experience using workflow management/orchestration tools (e.g., Airflow)
  • Experience working with stream-processing systems (e.g., Google Pub/Sub, Kafka)
  • We value education, whether gained in a formal university setting or through your years of experience

Additional Nice-to-Have Skills:

  • Experience using CI/CD technologies (e.g., Concourse, Jenkins, Travis CI)
  • Experience building containerized applications (e.g., Kubernetes, Docker)
  • Experience with streaming analytics (e.g., Spark, Google Dataflow, Azure Stream Analytics)
  • Knowledge of cloud databases (e.g., Azure Synapse, GCP BigQuery, Snowflake)
  • Experience ingesting data via APIs
  • Experience with centralizing data from disparate sources, transforming data, and resolving data conflicts
  • Expertise in Microsoft technologies (SSIS/SSRS/SSAS/Azure Synapse) and/or Google Cloud Platform technologies
