Job description

American Homes 4 Rent

As one of the country’s fastest-growing property management companies of single-family rental homes, American Homes 4 Rent has the exhilarating, fluid culture of a start-up and the permanency of a well-established corporation, rich with diversity. As pioneers in the industry, solidified by our place on Wall Street, American Homes 4 Rent (NYSE: AMH) is currently looking for qualified candidates. With a culture of unprecedented growth, quality, and innovative collaboration, we are seeking individuals who will complement our team.

The Senior Data Engineer will collaborate with Data Architects to oversee the department's day-to-day data integration work. Develops data models and maintains the data warehouse and analytics environment. Writes scripts for data integration and analysis. Collaborates with key stakeholders of the Data Transformation, Data Governance, and Business Intelligence teams to define business requirements and objectives. Mines and analyzes data, integrates data from a variety of sources, and deploys high-quality data pipelines in support of the organization's analytics needs. Creates and delivers data architecture and applications that enable reporting, analytics, data science, and data management, and improves accessibility, efficiency, governance, processing, and quality of data. Designs and builds data provisioning workflows/pipelines, physical data schemas, extracts, data transformations, and data integrations and/or designs using ETL and microservices. Participates in and leads peer development and code reviews with a focus on test-driven development and Continuous Integration and Continuous Deployment (CI/CD). Manages large projects or processes with limited oversight from manager. Coaches, reviews, and delegates work to lower-level professionals. Problems faced are difficult and often complex.

Responsibilities:

  • Designs and implements data management architecture to meet corporate data management needs and business functional requirements. Ensures that solution designs address operational requirements such as scalability, maintainability, extensibility, flexibility, and integrity.
  • Designs and builds data provisioning workflows/pipelines, physical data schemas, extracts, data transformations, and data integrations and/or designs using ETL and microservices.
  • Designs and develops programs and tools to support ingestion, curation, and provisioning of complex enterprise data to achieve analytics, reporting, and data science.
  • Leads peer development and code reviews with a focus on test-driven development and Continuous Integration and Continuous Deployment (CI/CD).
  • Monitors system performance by performing regular tests, troubleshoots issues, and integrates new features.
  • Engages with cross-functional teams on database integration efforts for merging BI platforms with enterprise systems and applications.

Requirements:

  • High School Diploma / GED required.
  • Bachelor's degree in Computer Science, Information Systems, Business, Finance, Management Information Systems, Mathematics, Physics, Engineering, Statistics, Economics, and/or a related field preferred.
  • Minimum 8 years of experience in Database Architecture, Business Intelligence development, Data Engineering, Data Warehousing, and/or a related field.
  • Advanced knowledge of methodologies, designs, and processes in the technical areas of ETL, ELT, and Data Modeling.
  • Advanced experience of consulting or client service delivery on Azure preferred.
  • Advanced experience in developing data ingestion, data processing and analytical pipelines for big data, relational databases, NoSQL, and data warehouse solutions.
  • Advanced hands-on experience implementing data migration and data processing using Azure services: ADLS, Azure Data Factory, Azure Functions, Synapse/DW, Azure SQL DB, Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, HDInsight, Databricks, Azure Data Catalog, Cosmos DB, ML Studio, AI/ML, etc.
  • Advanced experience in Azure and Big Data technologies such as PowerShell, C#, Java, Node.js, Python, SQL, ADLS/Blob, Apache Spark/Spark SQL, Databricks, and Hive, and streaming technologies such as Kafka, Event Hub, NiFi, etc.
  • Intermediate experience in DevOps and CI/CD deployments.
  • Intermediate experience in using Big Data File Formats and compression techniques and working with Developer tools such as Azure DevOps.
  • Advanced experience in RESTful APIs and messaging systems, and AWS or Microsoft Azure.
  • Intermediate experience with BI tools such as Power BI, SSRS, SSAS, and Tableau.

Work where you feel right at home –

If you are a versatile professional who values culture, a collaborative environment, and the potential for exponential growth, we want to work with you! Apply now and someone from our Talent Acquisition team will reach out to you soon!

Information regarding AH4R’s collection and use of your personal information can be found at https://www.ah4r.com/employeeprivacy
