Turing

Python Data Engineer

Job description

A U.S.-based company offering residential real estate investors attractive investment property loan services is looking for a Python Data Engineer. The selected candidate will be responsible for designing and implementing complex data solutions that satisfy business requirements. The company's streamlined application process delivers real estate intelligence services in a way that is easy, dependable, and fast, helping clients secure financing for their next investment property. The company has raised more than $156 million in funding so far.


Job Responsibilities:


  • Design and build data pipelines that transform data into a format usable for downstream consumption
  • Identify opportunities for automation and process improvement in close collaboration with other teams
  • Take part in code reviews to ensure code quality and adherence to best practices
  • Utilize Python and related technologies to work with a range of data types, such as structured, unstructured, and semi-structured data
  • Collaborate with cross-functional teams, such as Data Science and DevOps, to enable effective data integration
  • Ensure that data processing systems run reliably and efficiently by continuously monitoring and improving them
  • Continuously document and communicate data engineering solutions and processes to relevant stakeholders


Job Requirements:


  • Bachelor’s/Master’s degree in Engineering, Computer Science (or equivalent experience)
  • 3+ years of relevant experience as a Data Engineer
  • 3+ years of experience designing and developing data engineering solutions using Python
  • Substantial experience working with big data technologies like Hadoop, Spark, and Hive
  • Extensive experience working with SQL and NoSQL databases
  • Prior experience designing, building, and maintaining data pipelines
  • Demonstrable experience working with Airflow or similar workflow orchestrators
  • Solid understanding of AWS, Azure, or GCP
  • Thorough understanding of data structures, algorithms, and software design principles
  • Familiarity with Databricks and PySpark is desirable
  • Experience with containerization tools like Docker or Kubernetes is a plus
  • Excellent problem-solving abilities and attention to detail
  • Ability to work in a dynamic, team-oriented environment
  • Excellent written and verbal English communication skills
