Data Engineer II, Infrastructure - Dallas, Austin, ...

Job description

H-E-B is one of the largest independently owned food retailers in the nation, operating 420+ stores throughout Texas and Mexico and generating over $34 billion in annual sales. Described by industry experts as a daring innovator and smart competitor, H-E-B has led the way with creative new concepts, outstanding service, and a commitment to diversity in our workforce, workplace, and marketplace. H-E-B offers a wealth of career opportunities to our 145,000+ Partners (employees), a competitive compensation and benefits program, and comprehensive training that leads to successful careers.


Since H-E-B Digital Technology's inception, we've been investing heavily in our customers' digital experience, reinventing how they find inspiration from food, how they make food decisions, and how they ultimately get food into their homes. This is an exciting time to join H-E-B Digital--we're using the best available technologies to deliver modern, engaging, reliable, and scalable experiences to meet the needs of our growing audience. If you enjoy taking on new challenges, working in a rapidly changing environment, learning new skills, and applying it all to solve large and impactful business problems, we want you as part of our team.

Our Partners thrive The H-E-B Way. In the Data Engineer II, Infrastructure job, that means you have a...

HEART FOR PEOPLE... you're willing to provide support to junior developers

HEAD FOR BUSINESS... you have the skills to effectively deliver code solutions / features

PASSION FOR RESULTS... you can produce quality results with little direct supervision

What you will do:
  • Assist in the design and deployment of batch and streaming data pipeline infrastructure using IaC and CI/CD
  • Implement features to ensure data platform performance, reliability, and security
  • Develop solutions to improve monitoring and observability for data pipelines and platform infrastructure
  • Participate in the building of data platform components using hybrid cloud services (AWS, GCP, and Azure)
  • Use configuration management tools to provision system images and install and configure Linux application servers
  • Help contain costs by delivering solutions to monitor data platform utilization and expenditure

Project you will impact:
  • Build a world-class data platform that can handle petabytes of data
  • Improve the data quality and consumer experience for 100K+ enterprise data consumers

Who you are:
  • 3+ years of hands-on experience in DevOps for cloud infrastructure and data pipelines
  • Strong background in Linux, networking, SSL/TLS certificate management, secrets management, IAM, and security best practices
  • 3+ years of experience programming in one or more languages such as Bash/Shell, Python, Java, Go, Ruby
  • Knowledge of big data and hybrid cloud infrastructure; experienced in technologies such as Kafka, Kubernetes, Spark, Databricks, AWS EMR, S3, and data warehouses (Snowflake, Teradata)
  • Experience with one or more cloud infrastructure providers (AWS, GCP, Azure)
  • Experienced in cloud administration and Infrastructure as Code (Terraform, CloudFormation, AWS CDK, Pulumi)
  • Comfortable with configuration management tools (Ansible, Puppet, Chef, Salt)
  • Working knowledge of monitoring, APM, and log analysis tools such as Datadog, Splunk, ELK Stack, New Relic
  • Experienced with CI/CD tools such as GitLab CI/CD and Jenkins
  • Up to date on the latest technology developments; able to evaluate and propose new tooling/solutions for the data platform
  • Excellent written and oral communication and presentation skills

Preferred qualifications:
  • DevOps certifications
  • Cloud certifications
