Throughout our 125-year history, Roche has grown into one of the world’s largest biotech companies and a global supplier of transformative, innovative solutions across major disease areas.
At Roche, we believe every employee makes a difference. We are passionate about transforming patients’ lives. Confident in both decision and action, we believe that good business means a better world.
We are looking for an IT specialist to join one of our teams in the Roche Informatics division.
In Roche Informatics we focus on delivering technology that evolves the practice of medicine and helps patients live longer, better lives.
As a Data Engineer you will be accountable for developing data products within a decentralized data architecture. You will be able to apply your data warehousing experience as well as learn modern and efficient ways of delivering data to business partners.
You will be a part of a global team working with multiple cutting-edge technologies (such as AWS, Snowflake, DataOps, dbt, and Talend) to develop solutions for Data Mesh.
Your key responsibilities:
You will be developing data products in Snowflake deployed on the AWS cloud platform
You will be using DataOps.live to develop, manage, and deploy data vaults, and building transformation workflows with dbt
You will ensure data product characteristics are met using QA tests, the observability tool Monte Carlo (SLOs, SLIs), the data catalog/governance tool Collibra, and the security tool IMMUTA
You will be responsible for releasing new versions of data products, including writing release notes and following (and automating) the change request process
You will work on tasks in Jira following agile processes of estimation, sprint planning, and sprint execution, and document the results of your work in Confluence
You will take part in learning and knowledge-sharing activities organized by the data product, platform, and infrastructure teams, where you can learn modern technologies and methods and share your experience
Your qualifications and experience:
Experience working in a data engineering team responsible for building data warehouses and ETL pipelines
Experience in Snowflake or similar technologies (building data warehouses using MS SQL Server, Oracle, AWS Redshift etc.) and knowledge of SQL
Experience in DataOps.live and dbt or similar technologies (SSIS, Informatica, Apache Airflow etc.)
Knowledge of data modeling methodologies - data vault, data products, and data mesh preferred; data warehouse experience with the Kimball methodology is also valued
Basic knowledge of Git and modern ways of working with code
Nice to have - experience with Collibra, Monte Carlo, IMMUTA, and Python
Who we are
At Roche, more than 100,000 people across 100 countries are pushing back the frontiers of healthcare. Working together, we’ve become one of the world’s leading research-focused healthcare groups. Our success is built on innovation, curiosity and diversity.