Hays

Big Data Engineer

Job description

Be a key driver on BASF’s path to digitalisation by supporting existing products and initiatives, as well as by innovating additional digital solutions that support BASF’s global businesses

What you can expect
Our unit “Data Enablement - Big Data Solutions” uses highly innovative technologies to develop advanced analytics prototypes, builds big data storage solutions and analytics platforms for global deployment, and organizes an overarching data lake, the BASF Enterprise Data Lake.

Tasks:
  • Run Data Ingestion and Advanced Data Screening
  • Design and build Data Flows within Big Data Architectures
  • Develop and optimize Data Models and pipelines for performance and scalability, making them reusable and cataloguing them in libraries for future use
  • Support industrialization of Analytics Solutions
  • Enable meaningful and insightful reports for Data Analysis and Monitoring
  • Ensure systematic quality assurance for the validation of accurate Data Processing
  • Build reusable code and libraries for future use
  • Optimize applications for maximum speed and scalability
  • Implement security and data protection
  • Translate stakeholder requirements into concrete specifications for data warehouses, BI solutions and self-service solutions

Requirements:
  • A Bachelor’s or Master’s degree in a relevant Business/IT field, with at least 5 years of experience in a similar role
  • Business Consulting and Technical Consulting skills
  • A flexible, agile way of working (Scrum knowledge appreciated) with a DevOps mindset
  • An entrepreneurial spirit and the ability to foster a positive and energized culture
  • A growth mindset with a curiosity to learn and improve
  • Team player with strong interpersonal, written and verbal communication skills
  • You can demonstrate fluent communication skills in English (spoken and written)
  • You are experienced with data visualization tools, like Tableau or Power BI
  • At least 2 years of experience in the field of Data Engineering, Big Data and Distributed Computing
  • Experience with Big Data technologies such as Hadoop, Spark, Hive, Kafka
  • Experience in Cloud Big Data technologies and architectures within Azure, Google Cloud or AWS
  • Experience in tools like Apache NiFi, Kylo or StreamSets
  • Experience in Java, Scala, Python, MySQL

We offer:
A challenging area of responsibility with a high degree of personal ownership. You will get the opportunity to work with cutting-edge technologies on exciting digitalisation projects in the area of big data. You will be trained on the job in a dedicated, competent team.

