Black Pen Recruitment

Senior Python Data Engineer (FinTech / GCP/AWS - Unsecured loans)

Job description

Senior Python Data Engineer (FinTech / GCP/AWS - Unsecured loans)

Are you an experienced Senior Python Data Engineer in the FinTech space?

We have the perfect opportunity for you!

Our client, an AI-driven lending platform, is looking to hire a talented Senior Python Data Engineer to join their team. The team consists of innovators and builders with domain expertise in the lending space, who share a determination to keep leveraging data-driven approaches to problem solving by building products the market needs and API integrations that drive value across the board.


Job Type:
Remote | Full-Time


Requirements

  • 4+ years of professional software or data engineering experience, or a PhD in a relevant subject

  • Strong computer science background

  • Data-oriented engineer, attentive to detail

  • Extensive experience in Python

  • Experience with creating unit and integration tests

  • Experience with PySpark and working with big data

  • Experience with GCP and/or AWS

  • Experience with containerisation (e.g. Docker) and container orchestration (e.g. Kubernetes, ECS)

  • Thorough knowledge of the Python data science/engineering ecosystem (e.g. Pandas, NumPy)

  • Experience with API development, ideally FastAPI or Starlette

  • Experience with setting up CI/CD pipelines, ideally in an ML environment

  • Experience optimising code initially developed by data scientists and preparing it for production

  • Working experience with data visualisation tools


Responsibilities:


  • Developing automated CI/CD pipelines for Machine Learning products

  • Productionising and scaling code created by Data Scientists

  • Developing fast and reliable APIs

  • Developing and deploying Docker containers in GCP

  • Building and configuring monitoring tools and dashboards

  • Championing software best practices, including mentoring junior engineers and data scientists

  • Ensuring data quality and consistency

  • Developing the automated reports teams need to perform their work, and acting as the go-to person for any data question


Does this spark your interest? Send us your CV today!

We are looking forward to hearing from you!

