Senior Data Engineer

Company: Netrist
Location: Remote

This position is eligible for 100% remote work within the U.S.!

Our Projects:

Our clients range from start-ups to large government customers. We work hard to deliver web apps with the features our customers need to run their businesses. Depending on the client, we may write an app for a minimum viable product, focused on a handful of features, or a suite of microservices where each plays its part in a larger, event-driven system. We work with our clients to release the most important features first, and we are proactive in understanding business goals and finding better ways to reach them with technology.

Our approach is simple: Look at the problems that are in front of you and pick the simplest architecture and implementation to solve them. It might take a little extra time to do things right, but it’s worth it.

Why We Need You:

Netrist needs your help in developing Instant KPIs, a product that helps businesses stay in control of operations by actively measuring key performance indicators (success metrics). Instant KPIs alerts customers when changes in operations will impact metrics, either positively or negatively. Our vision is that managers can stay on top of operational performance by addressing problems before they impact the bottom line.

The Senior Data Engineer position is open to U.S. citizens residing anywhere in the United States (not abroad). This is an ideal position for a Senior Data Engineer with Apache Airflow experience who either wants to engineer a Software-as-a-Service product from the “ground up” or wants to broaden their data engineering experience across the entire lifecycle of data within a product.

What You Will Do:

You will be responsible for creating the common data models that Instant KPIs will use to measure operations and calculate metrics. Working with existing cloud-based Artificial Intelligence services and other time-based event analysis software, you will forecast changes in metrics and generate alerts when changes violate constraints. You will also be responsible for collecting data from a wide range of e-commerce, logistics, and shipping systems, applying common patterns for collecting and transforming data into common data models. You will help expose metrics and predictive data through APIs, contributing to the API design and ensuring the data powering the API is readily available to the microservices that expose it. With the help of DevOps Engineers, you will build software pipelines that automatically deploy new code without disruption of service.
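
To make the collect-and-transform responsibility concrete, here is a minimal sketch of an Airflow (2.x, TaskFlow API) DAG that pulls raw events from a source system and maps them onto a common data model. The DAG name, source fields, and target schema are hypothetical placeholders, not Netrist's actual design:

```python
# A minimal sketch, assuming Airflow 2.x and a hypothetical common data model.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_to_common_model():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull raw order events from an e-commerce API.
        return [{"order_id": "A1", "total_cents": 4999, "ts": "2024-01-01T12:00:00Z"}]

    @task
    def transform(raw: list[dict]) -> list[dict]:
        # Map source-specific fields onto the hypothetical common model.
        return [
            {
                "entity": "order",
                "id": r["order_id"],
                "amount": r["total_cents"] / 100,
                "occurred_at": r["ts"],
            }
            for r in raw
        ]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder: write to the store that feeds KPI calculations.
        print(f"loaded {len(rows)} rows")

    load(transform(extract()))


orders_to_common_model()
```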

Day-to-day, you will work collaboratively with software engineers, DevOps, front-end engineers, mobile app developers, enterprise sales, and marketing. Using the Agile development process, you will iteratively create, market, and enhance Instant KPIs. The team will launch a minimum viable product, add features iteratively, listen to customer feedback, and prioritize features based on customer and market needs.

What we like to see:

  • Proven experience building, operating, and maintaining fault-tolerant and scalable data processing with Apache Airflow or related technologies
  • Ability to understand and support the business side: why this product is useful and how technology can be applied to add features that make it more valuable.
  • Experience with Artificial Intelligence services in AWS or Azure
  • Experience with time-series analysis of data (knowledge of Drools Fusion a plus; see the sketch after this list)
  • Experience coding in one or more programming languages (Python, Java, NodeJS, etc.)
  • In-depth knowledge of at least one Database system (Relational or Document DBs)
  • Experience with Real-time messaging solutions
  • Excellent communication skills (written and oral)
  • Pragmatic problem-solving abilities
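
The time-series item ties directly to the product described above: Instant KPIs alerts customers when a metric violates a constraint. Below is a minimal sketch of one such check, using a hypothetical KPI series and a simple standard-deviation band rather than a real forecasting model:

```python
# A minimal sketch, assuming a hypothetical KPI series; a production system
# would use a proper forecasting model, not a simple rolling band.
import statistics


def violates_constraint(history: list[float], latest: float, tolerance: float = 2.0) -> bool:
    """Return True if the latest KPI value strays more than `tolerance`
    standard deviations from the mean of recent history."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)  # requires at least two samples
    return abs(latest - mean) > tolerance * stdev


# Hypothetical example: the daily on-time-delivery rate drops sharply.
history = [0.97, 0.96, 0.98, 0.97, 0.95, 0.96]
print(violates_constraint(history, latest=0.82))  # True -> raise an alert
```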

You should be familiar with:

  • Agile development methodology
  • Docker containerization & Kubernetes
  • Automated GitOps-based deployments
  • Data caching (Redis, Memcached, etc.)

Qualifications:

  • Bachelor’s in Computer Science, Data Science, or a comparable field
  • 7–10 years of experience in information technology
  • At least 2 years of experience directly related to data engineering
  • Must be a U.S. citizen and reside in the U.S.

Our Benefits:

  • Competitive Salary
  • 100% Paid Medical, Dental, Vision, Life & Disability
  • Remote work schedule
  • Education Assistance
  • Health Reimbursement Arrangement
  • Paid Holidays and Vacation
  • Company Retirement Matching
