City of Detroit

Senior Data Engineer

Description

LOCATION - 100% REMOTE - Sr Data Engineer

The City of Detroit is seeking a Senior Data Engineer. This role offers the chance to design and build a data warehouse for the public sector, and the small, collaborative nature of the team means you would participate in every aspect of the project. The ARPA funding coming to Detroit is a once-in-a-generation opportunity for folks interested in public service.

As a Senior Data Engineer, you are responsible for supervising the team's ETL/ELT process. You will manage the data pipelines, written in Python and executed with a tool such as Airflow or Prefect, that extract data from departmental and vendor source systems into the data warehouse. These pipelines then transform the data into models usable by end users or user-facing systems. The Senior Data Engineer is responsible for integrating a wide variety of datasets to improve delivery to internal customers and enable data-driven decision making.
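
As a concrete (if simplified) illustration, here is a minimal sketch of such a pipeline using Prefect, one of the two orchestrators named above. The API endpoint, Postgres connection string, and staging table are hypothetical placeholders, not details from this posting:

    import requests
    import psycopg2
    from prefect import flow, task

    @task(retries=3, retry_delay_seconds=60)
    def extract(url: str) -> list[dict]:
        """Pull records from a departmental/vendor REST API."""
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        return resp.json()  # assumes the API returns a JSON array of records

    @task
    def load(records: list[dict]) -> None:
        """Insert raw records into a staging table in the warehouse."""
        conn = psycopg2.connect("dbname=warehouse user=etl")  # placeholder DSN
        with conn, conn.cursor() as cur:  # transaction commits on success
            cur.executemany(
                "INSERT INTO staging.permits (id, issued_date, status) "
                "VALUES (%(id)s, %(issued_date)s, %(status)s)",
                records,
            )
        conn.close()

    @flow
    def permits_pipeline():
        records = extract("https://example.com/api/permits")  # hypothetical source
        load(records)

    if __name__ == "__main__":
        permits_pipeline()

In practice the loaded data would then be transformed further into user-facing models (for example with dbt, mentioned below) rather than served straight from a staging table.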

You will work with data stewards, typically in a city department or at a vendor, to understand their source systems and help solve the technical and organizational challenges that arise as part of the data integration process. The Senior Data Engineer also keeps their supervisor informed of progress and of potentially controversial matters.

Examples of Duties

What you'll do:

  • Design, build, and maintain an enterprise cloud-based data warehouse. This position will likely require roughly equal amounts of meeting time and heads-down work time.
  • Act as a data engineering team lead, but also contribute directly to pipeline development and database modelling
  • Develop scripts to clean and integrate data
  • Design data schemas and database architecture
  • Write up work issues and delegate them to other team members and contractors
  • Meet with team members and contractors to ensure alignment, buy-in, and technical feasibility
  • Identify and implement ways to track and manage data quality (see the sketch after this list)
  • Develop new data sources with team members or departmental clients
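
For the data quality bullet above, one lightweight approach is a set of SQL assertions that run after each load and report offending row counts. This sketch assumes psycopg2 and the same hypothetical staging.permits table as the pipeline example; none of these names come from the posting:

    import psycopg2

    # Each check's SQL returns a count of offending rows (0 means it passes).
    CHECKS = {
        "no_null_ids": "SELECT COUNT(*) FROM staging.permits WHERE id IS NULL",
        "no_duplicate_ids": (
            "SELECT COUNT(*) FROM ("
            " SELECT id FROM staging.permits GROUP BY id HAVING COUNT(*) > 1"
            ") AS dupes"
        ),
    }

    def run_quality_checks(dsn: str = "dbname=warehouse user=etl") -> dict[str, int]:
        """Run each check; return a mapping of check name -> offending row count."""
        results = {}
        conn = psycopg2.connect(dsn)
        try:
            with conn.cursor() as cur:
                for name, sql in CHECKS.items():
                    cur.execute(sql)
                    results[name] = cur.fetchone()[0]
        finally:
            conn.close()
        return results

    if __name__ == "__main__":
        for name, bad_rows in run_quality_checks().items():
            status = "PASS" if bad_rows == 0 else f"FAIL ({bad_rows} rows)"
            print(f"{name}: {status}")

Checks like these are easy to wire into the orchestrator as a final pipeline task, so a failing assertion surfaces alongside the load that caused it.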

Minimum Qualifications

  • Bachelor's degree in computer science, data science, information management, or database administration
  • 4 years of experience in systems analysis, computer science engineering, programming, information security management, data analysis, information science, or geographical information systems analysis (preferred but not required)
  • 5+ years of experience with data warehousing, including proven relational database design and build work

Minimum Tech Skills

  • Strong skills in both Python and SQL (ideally PostgreSQL)
  • Experience with a pipeline/job scheduling/orchestration framework (Airflow, Prefect)
  • Experience integrating large disparate data sources in a variety of formats
  • Experience working with REST APIs, typically as a source of data (see the pagination sketch after this list)
  • Experience with a cloud data warehouse such as Snowflake or BigQuery in the context of a large organization
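
On the REST API point above, source APIs rarely return a whole dataset in one response, so extraction code typically has to page through results. A small sketch, assuming a hypothetical limit/offset-paginated endpoint:

    import requests

    def fetch_all(url: str, page_size: int = 1000) -> list[dict]:
        """Follow limit/offset pagination until the API returns an empty page."""
        records: list[dict] = []
        offset = 0
        while True:
            resp = requests.get(
                url,
                params={"limit": page_size, "offset": offset},
                timeout=30,
            )
            resp.raise_for_status()
            page = resp.json()  # assumes a JSON array per page
            if not page:
                return records
            records.extend(page)
            offset += page_size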

Would be great if you have:

  • Experience with dbt (data build tool) for data modelling and transformations
  • Experience working with geospatial data, particularly in PostGIS or Esri/ArcGIS Online

Supplemental Information

Appointment term is based on availability of grant funds.
