Costco Wholesale

Data Engineer - Data Services

Job description

This is an environment unlike anything in the high-tech world, and the secret of Costco’s success is its culture. The value Costco places on its employees is well documented in articles from a variety of publications, including Bloomberg and Forbes. Our employees and our members come FIRST. Costco is well known for its generosity and community service and has won many awards for its philanthropy. The company joins with its employees to take an active role in volunteering by sponsoring many opportunities to help others. In 2018, Costco contributed over $39 million to organizations such as United Way and Children's Miracle Network Hospitals.

Costco IT is responsible for the technical future of Costco Wholesale, the second-largest retailer in the world, with wholesale operations in twelve countries. Despite our size and explosive international expansion, we continue to provide a family-oriented, employee-centric atmosphere in which our employees thrive and succeed. As proof, Costco consistently ranks in the top five of Forbes’ “America’s Best Employers”.

The Data Engineer is responsible for developing data pipelines and/or data integrations for Costco’s enterprise certified data sets, which are used for business-critical data consumption use cases (e.g., reporting, data science/machine learning, and data APIs). At Costco, we are on a mission to use data to provide better products and services for our members. This role is focused on data engineering: building and delivering automated data pipelines from a wide range of internal and external data sources. The Data Engineer will partner with product owners, engineering, and data platform teams to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.

If you want to be a part of one of the BEST companies to work for in the world, simply apply and let your career be reimagined.

ROLE

  • Develops and operationalizes data pipelines to create enterprise certified data sets that are made available for consumption (BI, advanced analytics, APIs/services).
  • Works in tandem with Data Architects, Data Stewards, and Data Quality Engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality, and orchestration.
  • Designs, develops, and implements ETL/ELT processes using Informatica Intelligent Cloud Services (IICS).
  • Uses Azure services such as Azure Synapse Analytics (formerly Azure SQL DW), ADLS, Azure Event Hub, Cosmos DB, Databricks, and Delta Lake to improve and speed up delivery of our data products and services (a brief sketch of such a pipeline follows this list).
  • Implements big data and NoSQL solutions by developing scalable data processing platforms that deliver high-value insights to the organization.
  • Identifies, designs, and implements internal process improvements, such as automating manual processes and optimizing data delivery.
  • Identifies ways to improve the reliability, efficiency, and quality of data management.
  • Communicates technical concepts to non-technical audiences in both written and verbal form.
  • Performs peer reviews for other data engineers’ work.
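
To make the pipeline work concrete, here is a minimal PySpark sketch of the kind of job described above: it reads raw CSV extracts from ADLS, applies basic cleansing, and publishes a Delta Lake table for downstream consumers. All paths, column names, and table locations are illustrative assumptions, not actual Costco systems.

    # Minimal PySpark sketch: ingest raw CSVs from ADLS and publish a
    # cleansed Delta Lake table. Paths, columns, and names below are
    # illustrative assumptions, not actual Costco systems.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-pipeline").getOrCreate()

    # Read raw CSV extracts from an ADLS Gen2 container (hypothetical path).
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("abfss://raw@examplelake.dfs.core.windows.net/orders/"))

    # Basic cleansing/data-quality steps before certification.
    cleansed = (raw
                .dropDuplicates(["order_id"])
                .withColumn("order_date", F.to_date("order_date"))
                .filter(F.col("order_total") >= 0))

    # Publish as a Delta table so BI/ML consumers get ACID, versioned reads
    # (assumes a Databricks or Delta-enabled Spark runtime).
    (cleansed.write
     .format("delta")
     .mode("overwrite")
     .save("abfss://certified@examplelake.dfs.core.windows.net/orders/"))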

REQUIRED

  • 5+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
  • 3+ years’ hands-on experience with Informatica PowerCenter.
  • 2+ years’ hands-on experience with Informatica IICS.
  • 3+ years’ experience working with cloud technologies such as ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
  • Extensive experience working with a variety of data sources (DB2, SQL Server, Oracle, flat files (CSV, delimited), APIs, XML, JSON).
  • Advanced SQL skills required. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources (an illustrative example follows this list).
  • 5+ years’ experience with Data Modeling, ETL, and Data Warehousing.
  • Strong understanding of database storage concepts (data lakes, relational databases, NoSQL, graph, data warehousing).
  • Able to work in a fast-paced agile development environment.
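
As a purely illustrative example of the kind of “complex SQL” this calls for, the query below uses a window function to find each member’s most recent order. The schema, table, and column names are hypothetical, shown here inside a PySpark session like the earlier sketch.

    # Hypothetical analytical query: latest order per member via a window
    # function. Schema, table, and column names are illustrative only.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    latest_orders = spark.sql("""
        SELECT m.member_id,
               m.region,
               o.order_id,
               o.order_total
        FROM certified.members AS m
        JOIN (
            SELECT *,
                   ROW_NUMBER() OVER (PARTITION BY member_id
                                      ORDER BY order_date DESC) AS rn
            FROM certified.orders
        ) AS o
          ON o.member_id = m.member_id AND o.rn = 1
    """)
    latest_orders.show()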

RECOMMENDED

  • Azure certifications.
  • Experience implementing data integration techniques such as event/message-based integration (Kafka, Azure Event Hub) and ETL (a streaming sketch follows this list).
  • Experience with Git / Azure DevOps.
  • BA/BS in Computer Science, Engineering, or equivalent software/services experience.
  • Experience delivering data solutions through agile software development methodologies.
  • Exposure to the retail industry.
  • Excellent verbal and written communication skills.
  • Experience working with SAP integration tools, including BODS (SAP BusinessObjects Data Services).
  • Experience with UC4 Job Scheduler.
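
For the event/message-based integration mentioned above, a minimal Structured Streaming sketch might look like the following: it reads from a Kafka-compatible endpoint (Azure Event Hubs exposes one) and appends raw events to a Delta table. The broker address, topic, and paths are assumptions, and authentication options are omitted for brevity.

    # Hypothetical streaming ingestion: consume events from a Kafka-compatible
    # endpoint (e.g., Azure Event Hubs) and land them in a Delta table.
    # Broker, topic, and paths are illustrative; SASL auth options omitted.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers",
                      "example.servicebus.windows.net:9093")
              .option("subscribe", "orders-events")
              .load())

    # Keep the raw payload and event time; downstream jobs parse/certify it.
    (events.selectExpr("CAST(value AS STRING) AS payload",
                       "timestamp AS event_time")
     .writeStream
     .format("delta")
     .option("checkpointLocation", "/checkpoints/orders-events")
     .start("abfss://raw@examplelake.dfs.core.windows.net/orders_events/"))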

Required Documents

  • Cover Letter
  • Resume

California applicants, please click here to review the Costco Applicant Privacy Notice.

Apart from any religious or disability considerations, open availability is needed to meet the needs of the business. If hired, you will be required to provide proof of authorization to work in the United States. Applicants and employees for this position will not be sponsored for work authorization, including, but not limited to, H-1B visas.
