Data Engineer

Company:
Location: Remote

*** Mention DataYoshi when applying ***

We are hiring an experienced senior-level data engineer who is versatile in skill set and passionate about bringing value to our clients. We look for technology and data analytics leaders who are experts in their fields and who want to make a meaningful impact for our clients in the Bay Area.
The successful candidate will be a strong leader with the ability to execute specialized modern data solutions in the Snowflake Cloud data warehouse. They will possess skills in design, development, implementation, documentation, and management of data solutions, and be expected to clearly articulate and present technical information to clients and stakeholders.

Responsibilities:
  • Drive the design and build of a metadata-driven, generic, and scalable data pipeline framework for the Snowflake Cloud Data Warehouse
  • Model data for scalability and optimized performance
  • Collaborate with business users to understand their needs and problems
  • Collaborate with BI developers, Data Scientists, Product Managers, Software Engineers, Data Modelers, and/or Platform Owners to define the scope, then design and implement the correct solution
  • Produce detailed, high-quality documentation describing the design and behavior of data pipelines
  • Participate in architectural evolution of data engineering patterns, frameworks, systems, and platforms including defining best practices, standards, principles, and policies
  • Apply data quality approaches and techniques
  • Implement Role-Based Access Control (RBAC) for sensitive data
  • Bring a positive energy every day and work within a team to deliver the best possible solutions
  • Learn the business and the data that supports the business

Required Qualifications:
  • 5+ years of experience in data integration and data warehousing (preferably with the Snowflake Cloud Data Warehouse)
  • 5+ years of experience with relational database technologies (SQL Server, MySQL, Oracle, or similar)
  • 5+ years of experience with ETL/ELT tools such as Fivetran, Talend, Matillion, Apache Airflow, or similar
  • 3+ years of experience with custom ETL/ELT and programming/scripting languages: PySpark, Python, Scala (Python required)
  • Experience with developing dynamic data pipeline frameworks (with Fivetran, Talend, Matillion, Apache Airflow or similar)
  • Current experience developing in cloud environments (e.g., AWS, Azure, GCP)
  • Experience working and delivering in an Agile team
  • Experience with performance tuning and optimization
  • BS or MS degree in Computer Science or an equivalent field of study is required

