Data Engineer

Company: Universal Audio
Location: Scotts Valley, CA 95066

*** Mention DataYoshi when applying ***

As a Data Engineer at Universal Audio, you’ll be part of a high-impact Data & Analytics team that empowers data-driven decision-making throughout the company.

You will join a small team tasked with designing and building highly scalable, modern, cloud-based data solutions that improve the customer experience and drive value for the business. You will work directly with business functions such as sales, marketing, finance and product management to translate business requirements into data engineering solutions.

Working with a variety of data warehousing platforms like Redshift and core pipelining languages like Python, you will construct and manage the data pipelines and data modeling layer on which business-facing analytics solutions will be deployed.

Key Responsibilities:
  • Deliver solutions to support enterprise information management, business intelligence, machine learning, data science, and other business interests
  • Build and support data pipelines to optimize data retrieval from various data sources and build data engineering assets to surface actionable business insights
  • Deliver well-defined, transformed, tested, documented, and code-reviewed data sets
  • Design, implement, and document data architecture and data modeling solutions, which include the use of relational, dimensional, and NoSQL databases
  • Identify ways to improve data reliability, efficiency and quality
Requirements:
  • Technical expertise with an emphasis on data warehouse solutions, business intelligence, big data analytics, enterprise-scale custom data products
  • Strong scripting knowledge in a modern object-oriented data pipelining language such as Python
  • Experience building and managing ETL/ELT pipelines from inception to production
  • Expertise in writing complex SQL queries and an understanding of methodologies for tuning and improving query performance
  • Experience with data transformation frameworks such as DBT, LookML, Matillion and Airflow
  • Proficient in data modeling and developing SQL database solutions (PostgreSQL, MySQL)
  • Knowledge of data modeling techniques for building and managing both physical and logical models
  • Experienced in infrastructure automation and cloud platforms: AWS, Azure, or GCP
  • Experience with workflow orchestration platforms such as Airflow, Glue and Dataflow
  • Understanding of different types of storage (filesystem, relational, MPP, NoSQL) and experience working with various kinds of data (structured, unstructured, metrics, logs, etc.)
  • Experience with code management tools (e.g. Git) and DevOps tools (e.g. Docker, Bamboo, Jenkins)
  • Understanding of agile project approaches and methodologies
  • Understanding of basic testing types
  • Knowledge of data visualization tools such as Tableau or Looker
  • Familiarity with, or a desire to learn, quantitative analysis techniques (e.g., predictive modeling, machine learning, segmentation, optimization, clustering, regression)
  • Capability to conduct performance analysis, troubleshooting and remediation
Experience and Education:
  • Bachelor’s degree in a quantitative field
  • 4+ years of experience designing reporting data structures for analytics
  • 4+ years of experience with SQL databases and advanced SQL & PostgreSQL coding
