Job description

Be Part of Our Success Story

We are looking for a passionate and talented MLOps / Data Engineer (m/f/d) to support our machine learning team and drive the development of next-generation ML security products. As part of a young, motivated and fast-growing team of data scientists and research engineers, you will take machine learning projects from the research stage to production and deploy data science solutions into the company’s flagship products.

Your future role
Lead design and implementation of automated pipelines that run, monitor and retrain AI/ML models
Enhance and improve the lifecycle management of AI models (e.g. new releases, monitoring, retraining and troubleshooting)
Work with data scientists to turn research-stage prototypes into production-ready ML services
Build a set of reusable tools to support the machine learning research and development process
Create data pipelines that automate data processing and data labeling
Deploy machine learning models using container technologies on a cloud infrastructure
Monitor and maintain the portfolio of operational machine learning models and components
Support technology adoption and integration of data science solutions into the company’s flagship products
You offer
Completed studies in natural sciences or engineering, preferably with a background in computer science
Ability to write robust code in Python, preferably with 3 years of experience as a Python developer
Profound knowledge of best practices in software development
Practical experience with Big Data platforms like Databricks/Spark and container technologies (e.g. Docker, Docker Swarm)
Experience using MLOps methodologies and deployment tools (e.g. MLflow, BentoML, Kubeflow, Seldon Core, …)
Knowledge and experience with DevOps practices (e.g. test automation, deployment automation) and CI/CD tools (e.g. GitLab)
Willingness to familiarize oneself with new subject areas
Ability to work in a team
Fluent English skills; German is an advantage
Nice to have
Familiarity with Java, JavaScript, Ruby or Go
Knowledge of AWS cloud technologies and experience with the SageMaker machine learning platform
Familiarity with machine learning, deep learning and data mining
Practical experience with machine learning frameworks (like TensorFlow or PyTorch) and libraries (like scikit-learn)
Familiarity with monitoring tools (e.g. Elastic Stack, Grafana, Prometheus, Graylog, …)
Knowledge in network security
What we offer
A dynamic and international working environment
Pleasant working atmosphere in a young and highly motivated team
Flexible working hours
Modern office environment with a central location in Vienna
Appropriate, performance-related remuneration: a monthly gross salary of at least € 3,501.00, according to the collective agreement for employees of companies in the field of automatic data processing and information technology services (ST1 activity family). Overpayment is possible depending on experience and qualifications.

If you think we should get to know you, please send your CV, including a letter of motivation and references, to the e-mail address below:
By sending your application you accept our Privacy Policy.
