Senior Data Engineer

Location: Boston, MA 02210

*** Mention DataYoshi when applying ***

The Senior Data Engineer at Author will have the chance to influence the decisions of a young organization. Software Engineers on the Author team are trusted to own major parts of the codebase, and we expect you to thrive as a leader in defining and solving problems, with support from your peers.

Data Engineers at Author will help develop cloud-native solutions leveraging Google Cloud Platform APIs and services, coding in languages and frameworks that fit that ecosystem – Python, Go, Node.js, and Java. In addition, our cloud-native automation infrastructure allows you to quickly deploy and iterate on your code.


We believe that both a high-quality team and high-quality code are critical pieces of our mission. You will be responsible for bringing our engaging customer-facing experience and features to life with the APIs and data that power them. You will work collaboratively with architects and other engineers to recommend, prototype, build, and debug data infrastructure on Google Cloud Platform (GCP). You will design data models, discover and ingest data, devise data pipelines, and define the scaling strategy for our data stores.

Working closely with our product, data warehousing, and data engineering teams, you will focus on assembling large, complex data sets, optimizing data loading, and distributing model outputs and metrics that provide insights and deliver engaging experiences in our consumer-facing applications, in an agile work environment governed through SAFe.

Required Qualifications:

  • 5+ years of Data Engineering or Database Development experience with at least one of the following databases: PostgreSQL, MySQL, SQL Server, Oracle, etc.

  • Ability to build and optimize data sets, "big data" pipelines, and related architectures

  • Ability to perform root cause analysis on external and internal processes supporting data movement

  • Experience writing, troubleshooting, and debugging Python and/or Java code

  • Ability to build processes that support data transformation, workload management, data structures, dependencies, and metadata

  • Extensive experience with SQL and its various flavors (PL/SQL, T-SQL, etc.)

  • One year of experience with at least one popular NoSQL data store: MongoDB, Firestore, DynamoDB, Cassandra, etc.

  • Excellent analytical skills associated with working on unstructured datasets

  • Good knowledge of data modeling, including but not limited to tables, constraints, and relationships, as well as efficient storage and retrieval of data

  • Familiarity and experience with source control (Git)

  • At least one year of experience working with any major cloud provider

  • Proven work experience as a Software Engineer and/or Database Developer

  • Hands-on experience building production-grade data solutions (relational and NoSQL)

  • Prior experience owning the end-to-end data-engineering component of a solution

  • Proven experience optimizing existing data pipelines and maintaining all domain-related data pipelines

  • Analytical mind with problem-solving aptitude

  • Strong communication skills with ability to interact with business and customer representatives

  • Passion for growing your skills, tackling interesting work and challenging problems

  • BA/BS in Computer Science, Information Systems, Statistics or Computer Engineering

  • Experience working within an environment with a "startup" culture, using agile, lean, DevOps, and DataOps delivery practices and methodologies

Role Desirables:

  • Cloud certification on any major cloud provider

  • Good understanding of public cloud computing architectures and services; experience using cloud-native technologies, cloud cybersecurity, and implementation patterns to lower costs and improve

  • Experienced in designing, building, and testing complex scalable systems

  • Experience with Google Cloud Platform data stores like BigQuery, Bigtable, Cloud SQL and Firestore

  • Experience with Google Cloud Platform services like Dataflow, Data Fusion, Dataproc, Pubsub and Cloud Storage

  • Experience using big data tools like Kafka and Spark

  • Experience with workflow orchestration and pipeline tools such as Cloud Composer and/or Airflow

  • Experience with Stream-processing systems

  • Experience with two or more object-oriented or functional languages like Scala, C++, Java, or Python
