At Entera, we are on a mission to transform the way investors find and buy properties. Powered by machine learning, Entera's end-to-end residential real estate platform modernizes the real estate buying process. Entera's property source aggregation platform, discovery algorithms, intelligent tools, and expert real estate service team help our clients access and evaluate more properties, make data-driven investment decisions, and win more - 100% online.
As a Data Engineer, you'll contribute to our best-in-class data pipeline and data-driven culture. You'll work on a tight-knit team of multidisciplinary experts with hard-science backgrounds to deliver on our data curation and management efforts. You'll use modern ETL frameworks to prepare data for exposure to both our internal business users and customers via BI tools, internal APIs, and custom-built services. Within our team, you'll further develop your skills and work with experts to deliver major improvements to our data pipeline and associated systems.
Successful candidates will thrive in Entera's unique operating environment and culture: high-growth, innovative, lean, and values-driven. As such, successful candidates must be highly capable in each of the following dimensions (among others): resourcefulness, adaptability, curiosity, analytical thinking and problem-solving, proactivity, collaboration, technological savvy, and operating in a dynamic environment.
What You'll Do:
Use Python and SQL to improve upon a best-in-class data pipeline and develop our workflows
Contribute to cloud-first services that support our analysis, reporting, and metrics collection efforts
Use agile software development processes to iteratively make improvements to our data systems
Support development processes with maintenance of CI/CD pipelines
Deliver on detailed specifications for business intelligence and reporting needs
Work with product and engineering in cross-functional teams to deliver on improvements to our systems
Craft and maintain data workflows in our ETL infrastructure
Construct infrastructure for introducing new datasets into our data warehouse (Snowflake)
Develop software that makes it easier to build such workflows (e.g., adding new datasets in the future)
Synchronize data from our warehouse to other services (internal APIs, third-party SaaS tooling)
Maintain ETL software dependencies in Docker
Manage configuration and access to our data-related cloud resources and data warehouse using Terraform
Contribute to and further develop our data-driven culture
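To give a flavor of the workflow-crafting work above, here is a heavily simplified extract/transform/load sketch in Python. All names, the record schema, and the in-memory list standing in for a Snowflake table are illustrative assumptions, not Entera's actual pipeline:

```python
# Illustrative ETL sketch (hypothetical names and schema; the real datasets
# and warehouse tables are Entera-internal).
from datetime import date
from typing import Any


def extract(source: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Pull raw property records from an upstream feed, dropping unusable rows."""
    return [r for r in source if r.get("address")]  # skip records with no address


def transform(records: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Normalize fields before loading into the warehouse."""
    return [
        {
            "address": r["address"].strip().upper(),
            "list_price": float(r.get("list_price", 0)),
            "ingested_on": date.today().isoformat(),
        }
        for r in records
    ]


def load(rows: list[dict[str, Any]], table: list[dict[str, Any]]) -> int:
    """Append rows to a warehouse table (a list stands in for Snowflake here)."""
    table.extend(rows)
    return len(rows)


source = [
    {"address": " 12 oak st ", "list_price": "250000"},
    {"list_price": "99000"},  # no address: filtered out in extract
]
warehouse_table: list[dict[str, Any]] = []
loaded = load(transform(extract(source)), warehouse_table)
print(loaded)                          # 1
print(warehouse_table[0]["address"])   # 12 OAK ST
```

In production, each step would typically be a task in an orchestrator such as Airflow or Prefect, with `load` writing to Snowflake rather than an in-memory list.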
Who You Are:
MS or PhD in Computer Science, Mathematics, Statistics, Physics, Economics, or a similar hard science
3+ years hands-on experience in Data Engineering at growing product-driven tech companies
Proficiency in AWS cloud services
Advanced capabilities in Python and SQL
Production experience with Airflow, Prefect, or similar workflow orchestration frameworks
Experience with Snowflake
Software development background (familiarity with version control systems, CI/CD, testing, system design)
Strong analytical and problem-solving skills
Nice to have:
Understanding of dbt or similar data transformation frameworks
Understanding of Spark
Entera is proud to be an equal opportunity employer (EEO) that celebrates difference and diversity. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. If there are preparations we can make to help ensure you have a comfortable and positive interview experience, please let us know.