Senior BI Data Engineer

Company: GenesisCare
Location: Remote

*** Mention DataYoshi when applying ***

About this opportunity:

As the Senior Business Intelligence Data Engineer, you will build the infrastructure required for optimal extraction, loading, and transformation of data from a wide variety of data sources using Microsoft Azure cloud services.

This position can be performed remotely from anywhere in the U.S.


Your key responsibilities:

  • Work closely with stakeholders and IT development teams to build a modern, highly scalable cloud data platform that enables data ingestion, storage, transformation, and preparation of massive datasets for data analytics and machine learning models.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
  • Design and develop optimal data pipeline architecture and infrastructure for data movement and orchestration, aligned with current industry design patterns and best practices.
  • Assemble large, complex datasets that meet functional and non-functional business requirements.
  • Perform proofs of concept to drive innovation, continuously enhance the capabilities of the cloud business intelligence platform, and scale them out for production use.
  • Contribute to ongoing monitoring, including cloud resource capacity management, performance monitoring, troubleshooting, and resolution of technical issues.
  • Take ownership of the data platform to ensure compliance with regional data and information governance policies and information security management requirements. Keep company data secure across national boundaries.
  • Collaborate with business and technical teams to ensure active support and resolution of risks and incidents.
  • Take ownership of the code artifacts, continuous integration, and continuous deployment processes of the platform.
  • Identify, design, and implement internal process improvements by automating manual processes and re-designing infrastructure for scalability.


Minimum Qualifications:

  • Bachelor’s degree in Computer Science, Computer Engineering, or a relevant field
  • 5+ years of experience in building and maintaining optimized and highly available data pipelines
  • Experience designing and developing data movement and orchestration pipelines for batch and real-time data integration using Microsoft Azure data services (Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Synapse Analytics, Databricks, Key Vault, Event Hubs)
  • Experience working with the Microsoft business intelligence suite (SSMS, SSIS)
  • Experience working in a DevOps environment: CI/CD, Azure DevOps, test automation, Docker containers
  • Experience automating Azure resource provisioning and scripting using Azure CLI, ARM templates, Terraform, Bicep, and PowerShell
  • Excellent working knowledge of writing scripts in Python
  • Strong understanding of data warehouse modeling concepts
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • Experience with Agile methodology, including writing user stories, creating work breakdown structures, estimating story points, and delivering in sprints.
  • Excellent communication skills, both written and verbal.
  • Ability to interact in a professional manner and build relationships with a broad range of people.


Preferred Qualifications:

  • Ability to work without direct supervision
  • Experience with data modeling and delivering data warehouses
  • Proven ability to analyze data and present insights visually
  • Ability to write, collate, and present reports
  • Experience automating Azure resources using Azure CLI, ARM templates, and PowerShell
  • Demonstrated experience with SQL Server (SSMS/SSIS)
  • Experience working with MPP data warehouses (Azure Synapse Analytics)
  • Experience working with distributed computing and lakehouse frameworks (Databricks, Delta Lake)
  • Experience working with data replication tools and technologies (e.g., SQL Server replication)
  • Excellent understanding of IT infrastructure and networking on the Azure platform
  • Excellent knowledge of message queuing, stream processing, and real-time data ingestion services (Azure Event Hubs)


About GenesisCare:

Across the world, GenesisCare has more than 440 centers offering the latest treatments and technologies that have been proven to help patients achieve the best possible outcomes. That includes 300 centers in the U.S. as well as 14 centers in the U.K., 21 in Spain, and 36 in Australia. We also offer urology and pulmonology care in the U.S. through our integrated medical offices. Every year, our team of more than 5,000 employees sees more than 400,000 people globally.

Our purpose is to design care experiences that get the best possible life outcomes. Our goal is to deliver exceptional treatment and care in a way that enhances every aspect of a person’s cancer journey.

Joining the GenesisCare team means a commitment to seeing and doing things differently. People centricity is at the heart of what we do—whether that person is a patient, a referring doctor, a partner or someone in our team. We aim to build a culture of ‘care’ that is patient focused and performance driven.

GenesisCare is an Equal Opportunity Employer that is committed to diversity and inclusion.

#LI-REMOTE
