LEO Pharma

Senior Data Engineer

Job description

Role Description

Senior Data Engineer, R&D Data Science & AI, LEO Pharma, Ballerup, Denmark

LEO Pharma is on a journey towards next-generation innovation. Our vehicle for change is machine learning and artificial intelligence powered insights, based on data. This is where we need the help of a Senior Data Engineer.

The application of data science in life science is becoming increasingly critical for delivering new and transformative business insights and, ultimately, faster and better medicines for patients. To deliver on those promises and become truly data-driven, the data need to be available: unlocking data silos, establishing seamless data flows, and providing reusable and curated data points will be some of the critical tasks for our future colleague.

The ideal profile brings experience with ETL processes. Persisting different data formats into different storage engines, and the ability to differentiate between low-latency and high-throughput requirements, are examples of the competences we look for in a candidate.

The Senior Data Engineer will help develop the data engineering backend for new and existing solutions and continuously evolve it to ensure a sustainable and scalable technical solution framework, enabling faster delivery of new data-driven business insights. The role works in close collaboration with our data scientists and our team of software developers and cloud/data engineers building transformative and robust digital products for our Global R&D organization.

Do you have experience with the newest and upcoming technologies, platforms, and trends that can be exploited to create new business models or radically change current processes? Are you looking for new challenges, and are you motivated by:

  • Playing a key role in a startup-type working environment in a data science development team, working in tandem with data scientists and software/data engineers to identify and implement the optimal cloud-based solutions.
  • Taking primary ownership of the data engineering technology framework and continuously optimizing its performance and maintainability.
  • Designing and developing operational data processing pipelines.
  • Ensuring application health and delivery KPIs are visible at all times.
  • Maintaining a high standard of code quality based on robust design standards.
  • Communicating findings and practices clearly and in a timely manner, both verbally and in writing.
  • Participating in all aspects of the software development life cycle in accordance with the Agile methodology.

Then you are the Senior Data Engineer we are looking for!

Who are we?

You will join a company that has embarked on a very ambitious journey to become the world's preferred dermatology care partner. LEO Pharma is always looking for new and innovative solutions to meet our very ambitious 2030 aspirations, and that is why we need you.

Challenges ahead!

At LEO Pharma we have recently formed a new unit in Global Research & Development to bring data-driven processes into our critical business paths, building data systems and applications to support decision making in our current industrial processes and drive new business opportunities. We serve a two-fold purpose: modernizing current drug development processes by focusing on more data-driven approaches, and seeking out entirely new business opportunities in dermatology using platform technologies that are data-driven at their core.

As our primary Senior Data Engineer, you will work in a cross-functional team composed of subject matter experts and technical experts. The team will be responsible for building and delivering use cases that support a data-driven R&D. As part of the unit, you and the team are responsible for building smart and sustainable data products in close collaboration with the end users. Interaction with other departments is strong, with both shared projects and actual job rotations used to foster cooperation and people development.

In projects you will both develop the data engineering backend and act in an advisory role, and you are expected to positively challenge the projects to continuously improve our solutions and ways of working. You should be able to synthesize cross-disciplinary ideas and strive for simplicity in solutions. You will become part of an open-minded, high-velocity team in an informal, pleasant, flexible, and social work environment. To be successful in the role, it is critical that you have the right knowledge and mindset to act and think entrepreneurially, and you will be supported in this by a very professional and highly skilled technical team.

We can offer you a job with plenty of responsibility and influence, and more of both if you have the talent and the ambition.

Who are you?

You are an end-to-end problem solver who prefers to use the right tool for the job, in close collaboration with your technical peers. You find it both motivating and interesting to optimize technical performance and maintainability using good-practice methodologies, automation, and open-source toolkits; to deal with multiple data sources; and to supply end users with systems and applications running in production. You are familiar with different databases and know your way around exporting data from legacy source systems. You will work in a focused and agile manner on a project-by-project basis, and you will bring a can-do attitude and contribute to strengthening our team spirit.

Preferred Qualifications & Experience:

  • Experience with a cloud vendor (AWS, Azure, GCP)
  • Experience with batch workloads
  • Experience with data pipelines
  • Good understanding of storage formats
  • Experience with the (Py)Spark API
  • Solid software engineering practices
  • Experience with deployment pipelines
  • Proficiency in Linux administration

The requirements listed above are considered necessary to be successful in this position. The following qualifications and experience would be a plus:

  • Experience with Databricks and/or Snowflake
  • Experience with Azure Data Factory
  • Proficiency in Python
  • Experience with code instrumentation
  • Experience with data quality tools and practices
  • Interest in contributing to cloud/data science/backend part of the software products

Do you want to know more?

For further information, please contact Alex Schuleit at +45 3148 2519.

Applications will be evaluated on a rolling basis and candidates will be invited for interviews on an ongoing basis, so please apply as soon as possible, and no later than June 17th, 2022.

Let's pioneer together
At LEO Pharma, we help people with skin diseases live fulfilling lives by advancing dermatology beyond the skin. We drive dermatology with our knowledge, collaboration, and curiosity, and we are at the forefront of science in developing new medicines. For us, pioneering together is about constantly improving and extending what's possible for each other, our company, and our patients.

At LEO Pharma, we welcome and consider applications from all qualified candidates because we believe that our different perspectives, backgrounds, and attitudes are what enable us to make the best decisions for LEO Pharma and meet the needs of the wonderfully diverse marketplace we operate in.

For certain positions, LEO Pharma might complete a background check conducted by a third party.
