Data Engineer with GCP, Telemedicine Technology So...

Job description

    New York

Big team (20+ people)

We help you grow in your profession


The client is a telemedicine company. A comprehensive digital solution developed by their technicians enables new models of patient care. The basic principle of operation is patient-doctor communication via secure video channels. The company provides emergency teleconferencing assistance to patients in more than 40 states, and powers telehealth solutions for over 240 health systems comprising 2,000 hospitals and 55 health plan partners.
We invite you to our company, not a project


DataArt is assembling a team for a new project with a modern stack and microservices architecture. The service runs on both web and mobile applications. The client is looking to architect and build a complex platform to support multiple telehealth solutions. This is a rare situation where the modernization effort is practically a greenfield, with an opportunity to design things right.
We create a restriction-free environment
We let you concentrate on your job


We are going to launch several Scrum teams consisting of backend developers (Java, Node.js), frontend developers (the latest version of Angular), QA automation engineers (Java, JavaScript), DevOps engineers (AWS), product owners, and UX/UI designers. Overall team expertise is 90% senior level. The plan is to build a highly professional, high-performing team that can add value to the product and create a state-of-the-art telehealth solution.
By the way, do you speak English?


Develop and support data analytics platforms used internally by different healthcare facilities. Adopt new features of the serverless data lake framework, and expand existing ones, using cutting-edge GCP technologies. Data lake consumers are departments across the company that use the shared pool of data to make predictions and decisions that improve care delivery, care statistics, sales, and more. Also extend and support the ETL ingestion pipelines implemented with Informatica.


Python, SQL, GCP (Pub/Sub, Cloud Storage, BigQuery, Airflow), Looker, Terraform, GitLab.
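As a hypothetical illustration of how a stack like this fits together (the dataset name, field names, and path layout below are assumptions for the sketch, not the client's actual schema), here is a minimal Python example of routing an incoming Pub/Sub-style event into a date-partitioned data-lake path of the kind BigQuery external tables and Airflow sensors commonly consume:

```python
import json
from datetime import datetime


def partition_path(record: dict, dataset: str = "telehealth_events") -> str:
    """Build a Hive-style partitioned object path for a data-lake landing zone.

    Assumes `record` carries an ISO-8601 `event_time` field and a unique
    `event_id`; the resulting layout (<dataset>/dt=YYYY-MM-DD/<id>.json)
    is one common convention for partitioned Cloud Storage buckets.
    """
    ts = datetime.fromisoformat(record["event_time"])
    return f"{dataset}/dt={ts:%Y-%m-%d}/{record['event_id']}.json"


# Example: a Pub/Sub-style JSON payload routed to its partition.
payload = json.loads(
    '{"event_id": "abc123", "event_time": "2024-05-01T14:30:00+00:00"}'
)
path = partition_path(payload)
# path == "telehealth_events/dt=2024-05-01/abc123.json"
```

In practice this kind of routing would typically run inside an Airflow task or a Pub/Sub subscriber, with Terraform provisioning the bucket and BigQuery dataset.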


  • Develop and support the core data analytics platform (Python)
  • Thoroughly analyze the client's business logic
  • Collaborate with multiple teams
  • Understand the overall enterprise data architecture and deliver a next-generation data warehouse
  • Work with the infrastructure team on the release process and production troubleshooting


  • Knowledge of the general principles of Data Warehousing and Data Lakes
  • Willingness to research business logic
  • Hands-on experience with SQL
  • Knowledge of Python
  • Experience with GCP or other cloud providers
  • Experience working with CI/CD pipelines and releases
  • Experience in troubleshooting
