Lead Data Engineer

Company: Land O'Lakes
Location: Arden Hills, MN

*** Mention DataYoshi when applying ***

Join boundless thinkers from all walks of life working together to feed the world. Take pride in working with powerful brands, including Purina Animal Nutrition, Nutra Blend, WinField United, Vermont Creamery, KozyShack, and more.


You'll join a global team committed to everything from the seeds that go into the ground, to the technology that improves crop yields and sustainability, to the nutrition that animals need, to marketing the final product.

Land O’Lakes is looking to add a Lead Data Engineer to our WinField United Demand Creation Information Technology (IT) product team. This role is critical to leveraging data to optimize business outcomes. The lead data engineer will design solutions and guide a team of engineers to ensure accurate, consistent, reliable, and sustainable solutions are delivered. The lead data engineer will also influence data engineering processes that result in highly accurate and trusted data assets for use within WinField United and across the organization. Additionally, the lead data engineer influences stakeholders to ensure investments in data-driven solutions are centered on optimizing clearly defined and quantified business outcomes.


Required Education and Experience

  • Bachelor's degree in Computer Science, MIS, or a related field and 7+ years of experience; or an Associate's degree and 9+ years of experience; or a high school diploma and 11+ years of experience. Applicable experience is characterized as advanced SQL, data engineering, and data modeling techniques.

  • 7+ years of experience building data integration solutions using tools such as Informatica, Talend, MuleSoft, Qlik, etc.

  • Strong experience building out a data warehouse and/or data lake

  • 3+ years of experience leading engineering resources

  • 2+ years of experience working with cloud-native data solutions on Microsoft Azure, AWS, or Google Cloud Platform

  • Strong experience leading full lifecycle, large, complex reporting or data engineering efforts

  • Strong experience working with heterogeneous datasets in building and optimizing data pipelines, pipeline architectures, and integrated datasets using various data integration technologies (ETL/ELT, data replication/CDC, message-oriented data movement, API design, etc.)

  • Experience with DevOps, CI/CD pipelines, and automated testing


Required Competencies/Skills

  • Implement data structures using standards and best practices in data modeling, ETL/ELT processes, SQL, database, and other technologies

  • Advanced SQL knowledge and experience working with relational databases, including query authoring and working familiarity with a variety of database platforms

  • Thorough experience building and optimizing data pipelines and data sets.

  • Deep knowledge of the Data Vault method, model, and architecture

  • Ability to manage the overall data landscape: metadata, processing patterns, and data quality

  • A successful history of manipulating, processing and extracting value from large datasets.

  • Strong working knowledge of message queuing, stream processing, and highly scalable data stores

  • Ability to obtain a clear understanding of business needs and value, develop a detailed vision for the initiative, map out the solution, and guide its implementation

  • Ability to develop test-driven solutions that can be deployed quickly and in an automated fashion

  • Demonstrated ability to collaborate across all levels (engineers, management, architects, etc.) and across all skill sets (data scientists, data visualization developers, Salesforce developers, etc.), particularly in a product-oriented culture

  • Excellent communication, presentation, and documentation skills, including the ability to distill highly technical concepts and communicate them effectively to non-technical stakeholders

  • Ability to function as a technical lead, working closely with developers and data analysts, while also contributing to hands-on implementation

  • Demonstrated ability to develop and maintain trusted-advisor status with technology and business stakeholders at various levels

  • Capable of using agile methodology and implementing Continuous Integration/Continuous Delivery (CI/CD) pipelines

  • Ability to proactively communicate and manage technology and project risks

  • Strong troubleshooting skills, including the ability to determine impacts, resolve complex issues, and exercise sound judgment and initiative in challenging situations


Preferred Education and Experience

  • Bachelor’s or Master’s degree in Business Analytics, Computer Science, Mathematics, Statistics, or a related discipline; or an equivalent combination of demonstrated skills and experience

  • Experience working with Databricks (or Spark) & Qlik technologies (Replicate and Compose)

  • Experience with any scripting languages, preferably Python

  • Experience with Snowflake

  • Experience working with agile methodologies

  • Experience in the manufacturing or agriculture industry


Preferred Competencies/Skills

  • Ability to work closely with business partners to influence decisions on technology investments

  • Ability to provide technical leadership to evangelize best practices/proven patterns and promote the use of Corporate IT standards

  • Working knowledge of data privacy regulations including GDPR and CCPA as well as how to design solutions to drive compliance
