Job description

Responsibilities

  • Utilize machine learning and AI techniques to analyze logistics data, focusing on warehouse and transportation metrics.
  • Develop algorithms and models to parse and process large volumes of live incoming field data.
  • Implement backend ingestion systems that pull data from MongoDB, Amazon Timestream, and/or Redis (a minimal ingestion sketch follows this list).
  • Design and execute machine learning strategies, including Deep Q-learning, to improve process efficiency in logistics operations (a Deep Q-learning sketch also follows this list).
  • Collaborate with cross-functional teams to identify opportunities for process improvement and implement data-driven solutions.
  • Stay updated with the latest advancements in machine learning, AI, and related fields to enhance analytical capabilities.
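
To make the ingestion bullet above concrete, here is a minimal sketch (in Python, the first language listed under Requirements) of pulling field data from the three stores named in the posting. The connection defaults and every database, collection, table, and key name ("logistics", "shipments", "sensor_readings", "field_data_queue") are hypothetical placeholders rather than details of the actual system, and the standard pymongo, boto3, and redis clients are assumed to be installed.

    # Minimal backend-ingestion sketch. All names and queries below are
    # hypothetical placeholders, not details taken from this posting.
    import boto3          # Amazon Timestream query client
    import pymongo        # MongoDB driver
    import redis          # Redis client


    def ingest_mongo(uri="mongodb://localhost:27017"):
        """Pull recent shipment documents from a hypothetical MongoDB collection."""
        client = pymongo.MongoClient(uri)
        collection = client["logistics"]["shipments"]
        return list(collection.find({"status": "in_transit"}).limit(1000))


    def ingest_timestream(database="logistics", table="sensor_readings"):
        """Query a hypothetical Amazon Timestream table for the last hour of data."""
        ts = boto3.client("timestream-query")
        query = f'SELECT * FROM "{database}"."{table}" WHERE time > ago(1h)'
        return ts.query(QueryString=query)["Rows"]


    def ingest_redis(host="localhost", port=6379):
        """Drain a hypothetical Redis list used as a lightweight field-data queue."""
        r = redis.Redis(host=host, port=port, decode_responses=True)
        return r.lrange("field_data_queue", 0, -1)


    if __name__ == "__main__":
        records = ingest_mongo() + ingest_timestream() + ingest_redis()
        print(f"ingested {len(records)} raw records")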
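
The Deep Q-learning bullet can likewise be read as the standard DQN loop: a neural network approximates action values, actions are chosen epsilon-greedily, transitions are stored in a replay buffer, and a periodically synced target network stabilizes the bootstrapped targets. The sketch below shows that loop on a made-up carrier-dispatch environment; PyTorch, the environment, the reward, and the hyperparameters are all illustrative assumptions, not part of the posting.

    # Minimal Deep Q-learning (DQN) sketch on a toy dispatch problem.
    # The environment and reward are made-up stand-ins for a real logistics simulator.
    import random
    from collections import deque

    import torch
    import torch.nn as nn
    import torch.optim as optim


    class ToyDispatchEnv:
        """Hypothetical environment: pick one of 4 carriers for each order."""

        def __init__(self):
            self.n_actions = 4
            self.state_dim = 6

        def reset(self):
            # State: 6 made-up features (order size, distance, 4 carrier loads).
            self.state = torch.rand(self.state_dim)
            return self.state

        def step(self, action):
            # Made-up reward: prefer the carrier with the lowest "load" feature.
            loads = self.state[2:6]
            reward = float(loads.min() - loads[action])
            self.state = torch.rand(self.state_dim)
            return self.state, reward, False  # toy episodes never terminate


    def build_net(state_dim, n_actions):
        return nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                             nn.Linear(64, n_actions))


    def train(episodes=20, steps_per_episode=50, gamma=0.95,
              epsilon=0.1, batch_size=32, target_sync=100):
        env = ToyDispatchEnv()
        q_net = build_net(env.state_dim, env.n_actions)
        target_net = build_net(env.state_dim, env.n_actions)
        target_net.load_state_dict(q_net.state_dict())
        optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
        replay = deque(maxlen=10_000)
        step_count = 0

        for _ in range(episodes):
            state = env.reset()
            for _ in range(steps_per_episode):
                # Epsilon-greedy action selection.
                if random.random() < epsilon:
                    action = random.randrange(env.n_actions)
                else:
                    with torch.no_grad():
                        action = int(q_net(state).argmax())
                next_state, reward, _done = env.step(action)
                replay.append((state, action, reward, next_state))
                state = next_state
                step_count += 1

                if len(replay) >= batch_size:
                    batch = random.sample(replay, batch_size)
                    states = torch.stack([b[0] for b in batch])
                    actions = torch.tensor([b[1] for b in batch])
                    rewards = torch.tensor([b[2] for b in batch])
                    next_states = torch.stack([b[3] for b in batch])

                    # DQN target: r + gamma * max_a' Q_target(s', a').
                    q_values = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
                    with torch.no_grad():
                        targets = rewards + gamma * target_net(next_states).max(1).values
                    loss = nn.functional.mse_loss(q_values, targets)

                    optimizer.zero_grad()
                    loss.backward()
                    optimizer.step()

                # Periodically sync the target network with the online network.
                if step_count % target_sync == 0:
                    target_net.load_state_dict(q_net.state_dict())

        return q_net


    if __name__ == "__main__":
        train()  # short run just to show the loop executes

In a real deployment the toy environment would be replaced by a simulator or a historical replay built from actual warehouse and transportation metrics.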

Requirements

  • Bachelor's or Master's degree in Computer Science, Engineering, Mathematics, or a related field.
  • Strong proficiency in algorithms, data structures, and machine learning techniques.
  • Prior experience in performing data analysis and machine learning on logistics data is highly desirable.
  • Proficiency in programming languages such as Python, Java, or C++.
  • Experience with data storage systems such as MongoDB, Amazon Timestream, and Redis.
  • Excellent problem-solving skills and ability to work with large datasets.
  • Strong communication and collaboration skills to work effectively in a team environment.

Additional Skills (Preferred)

  • Familiarity with Deep Q-learning and other advanced machine learning strategies.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Knowledge of software development methodologies and practices.

This job was posted by Alisha from Refactor Academy.