Hays

Machine Learning Engineer

Job description

Your new company

Welcome to Hapag-Lloyd, a leading global logistics company. As the fifth largest container liner shipping company in the world, we are here to make sure that the flow of goods never stops. We are an international team of 12,800 employees working across 400 offices in 128 countries.

The Knowledge Center, located in Gdańsk, will function as a hub for innovation and develop state-of-the-art business and technology solutions to help us navigate the future. And we want to do that together with you.

Our Mission - Your Chance

We are on a mission to build a world-class AI team capable of supporting a world-class shipping company like Hapag-Lloyd in staying best in class with intelligent, customer-centric services. Are you passionate about Big Data, AI and Machine Learning? Then come on board, because we have tons of real business cases and data waiting for you to bring them to life.

Your new role

For our location in Gdańsk we are looking for a Machine Learning Engineer - Knowledge Center.

Responsibilities and Tasks:

INNOVATION

  • You design data- and model-driven solutions for tasks that currently need human involvement or are too complex for standard engineering approaches.
  • You orchestrate AI technologies, analysis methods and statistics to uncover the missing links in current processes and solutions, and to create innovative solutions that can evolve from a simple prototype into a full-blown intelligent enterprise product.
  • You analyze large amounts of structured and unstructured data, augment it and identify patterns by applying state-of-the-art data mining methodologies, tools and libraries.
  • You curiously evaluate and test new papers, libraries and third-party solutions, and actively participate in AI communities such as meetups, conferences, hackathons or Kaggle competitions, always looking for emerging opportunities with high innovation potential for Hapag-Lloyd.


AI PRODUCT DEVELOPMENT


  • You are a key player whenever new business case specific AI modules are developed. In this role, you:
  • analyze and understand the business case, the processes and the available data structures
  • consult the product teams on the required training data
  • support the data engineers with data cleaning
  • lead the feature engineering process
  • support the data engineers in building the ETL pipeline
  • develop the model experiments and provide the production-ready model
  • operationalize the model
  • participate in implementing the learning loop and the automatic deployment evaluation of retrained models
  • participate in the design and implementation of model performance monitoring
  • support the product teams in handling AI module related incidents in a fast, solution-oriented manner


AI PLATFORM DEVELOPMENT


  • You are a key player in building the generic, standardized and highly reusable Hapag-Lloyd Data Science Development and Analysis Stack, a platform of tools, services and modules that enables the AI team as well as business and system analysts to continuously improve the time to market and cost efficiency of AI solutions.
  • You organize training and information sessions for IT and business departments, as well as for public community events, on various AI topics, spreading knowledge and awareness of the possibilities, limits and future of AI in logistics and IT.



What you'll need to succeed

A bachelor’s or master’s degree in computer science, business administration, mathematics, physics or another scientific area is preferred, but not required. Much more important are your experience and your attitude. Two to three years of relevant experience in enterprise-level IT that has equipped you to communicate effectively with diverse stakeholders at corporate level is a good starting point.

TECHNICAL EXPERIENCE

  • You’ll need 1–2 years of hands-on development experience, preferably with backend involvement. Experience with batch, web service or test-driven development is a plus.
  • SQL, Python and the related development stack, including Git, common IDEs and ML frameworks, are important assets for your work. Java, JavaScript or C++ would be helpful but are optional.
  • Hands-on experience with at least one relational DBMS such as DB2, SQLite or PostgreSQL is important, while NoSQL or distributed database systems are initially optional.
  • You should be confident working with various data formats, from tabular data such as CSV to markup formats such as HTML and common transfer formats such as XML and JSON. Hands-on experience with at least some MS Office, image, audio or video formats is also important.
  • Cloud experience such as AWS or Azure and related concepts and protocols of distributed computing would be helpful but is not a must.


DATA SCIENCE EXPERIENCE

  • You should be confident applying linear algebra (vector and matrix operations), statistics and multivariate calculus (integration, derivatives, gradients, optimization), and you should be able to devise numerically stable algorithms.
  • You’ll need relevant hands-on experience in building ETL pipelines.
  • You should therefore have comprehensive experience in loading files and extracting data from databases. Being able to consume data streams or use REST APIs would be interesting but is initially optional.
  • For ETL you’ll need to be an expert in data cleaning, evaluating basic data statistics, discretization, imputation, encoding categorical data and other transformation methods.
  • You can confidently analyze data to confirm that model assumptions are met, for various model types.
  • You should have expert knowledge of every stage of an ML pipeline: data preparation, adequate visualization, model types, model testing and evaluation, as well as unsupervised learning methods.
  • You should bring expert know-how in at least two of the following disciplines: NLP, CNNs, RNNs, GANs and the like, plus related ML frameworks such as scikit-learn, Keras, PyTorch or TensorFlow. More is better.
  • Being able to present and explain data with proper diagrams, as well as experience with current data science platforms such as Anaconda, Dataiku or RapidMiner, is a must.

What you'll get in return

  • In a highly motivated team of AI experts, and thanks to Hapag-Lloyd’s high-impact, enterprise-level business cases, you have the chance to make a quantum leap in your personal expertise and professional maturity.
  • Together we’ll take care of the right balance between room for focus, creative innovation, getting things done, personal development, enough recreation and a relaxed, collaborative atmosphere.
  • As one of the five biggest carriers in the world, Hapag-Lloyd is literally moving the world. Every optimization and automation not only saves monetary costs but also reduces the ecological footprint, so you’ll have a real impact on making the world a better place to live.
  • With us, your ideas, personality and skills have the freedom to evolve and make a difference. Hapag-Lloyd offers many different and challenging business areas, such as customer service, operational container steering, dangerous goods and maritime IT, and supports you whenever you strive for new frontiers, even at an international level.



What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.

Please let the company know that you found this position on this Job Board as a way to support us, so we can keep posting cool jobs.