Data and Analytics is foundational to our Petcare OGSM and will drive our transformation into a business powered by data.
To deliver on the ambition set by this OGSM, we require the very highest level of technical and engineering expertise within Global Petcare Data & Analytics.
We need to expand this high-performing team of creative, skilled individuals to help build out new capabilities and bring fresh ideas to the table.
For this role, we are looking for the following skills to round out our team:
An inspirational enthusiasm for tech and innovation
A solid grounding in software development practices
Knowledge and experience with containerization and cloud microservices architecture
Knowledge and experience in using orchestration tools (e.g. Airflow/Kubernetes)
Proven ability to mobilise others and lead technical change
Strong knowledge and experience in processing big data using Spark
Strong coding expertise in PySpark, or in Scala combined with Python experience
Experience building Spark applications
Experience working in a cloud environment (Azure, AWS, GCP, Yandex)
Experience building data pipeline/ETL/ELT solutions (an illustrative sketch follows this list)
Ability and strong desire to research and learn new technologies and languages
Highly knowledgeable in the contemporary data engineering and analytics landscape, including current tools and the latest approaches
Experienced in DevOps and CI/CD practices within a data platform
Capacity and enthusiasm for mentoring and coaching data engineers of varying experience
Able to communicate effectively with all levels of the business, presenting new ideas and recommending new technologies
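To give a flavour of the day-to-day work, here is a minimal PySpark batch-pipeline sketch of the kind of extract-transform-load job described above; the storage paths, column names and aggregation are hypothetical placeholders, not our actual platform code.

# Illustrative only: a minimal PySpark batch ETL job; paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: read raw events from cloud object storage (placeholder path)
raw = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/events/")

# Transform: de-duplicate and build a daily aggregate
daily = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_date", F.to_date("event_timestamp"))
       .groupBy("event_date", "product_id")
       .agg(F.count("*").alias("event_count"))
)

# Load: write a partitioned, query-ready dataset (Delta Lake would be a natural fit)
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "abfss://curated@example.dfs.core.windows.net/daily_events/"
)

spark.stop()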
Nice to Have
Full-stack web app development
Microsoft Azure cloud technologies e.g. Blob Storage, ADLS, Azure DevOps, Azure Logic Apps, Azure Functions
Delta Lake storage layer
Data visualisation tools e.g. Power BI, Tableau, ThoughtSpot, D3.js
Key Responsibilities
Lead the technical design and implementation of data engineering solutions, applying software development practices on a growing platform that is adopting containerization technologies such as Docker and Kubernetes.
Lead and facilitate discussions and planning on how to transition existing notebook-based projects into robust, cloud-agnostic, deployable products (an illustrative orchestration sketch follows this list).
Be a key contributor to decisions on the direction of core data engineering projects and pipelines, serving the growing and changing demands of the data platform across business divisions.
Collaborate on and contribute to the design and build of Spark applications on cloud container infrastructure.
Promote DevOps and CI/CD methodologies for collaboration on a growing platform
Partner with product managers to drive the provision of business value through technical excellence
Partner with scrum masters to efficiently deliver agile technical solutions
Communicate clearly and effectively when collaborating with members of the data solutions team
Enthusiastically evolve your technical skill set, engage in training, and learn new technologies and techniques
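As an illustration of the orchestration work referenced above (moving notebook code into scheduled, deployable products), here is a minimal Airflow 2.x DAG sketch; the DAG id, schedule, cluster endpoint and spark-submit arguments are hypothetical, and in practice a KubernetesPodOperator or SparkSubmitOperator might replace the BashOperator shown here.

# Illustrative only: a minimal Airflow DAG that schedules a containerised Spark job.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_pipeline",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # placeholder schedule (Airflow 2.4+ syntax)
    catchup=False,
) as dag:

    # Submit the refactored (formerly notebook-based) job to Spark on Kubernetes;
    # the application path and cluster endpoint below are placeholders.
    run_spark_etl = BashOperator(
        task_id="run_spark_etl",
        bash_command=(
            "spark-submit --master k8s://https://kubernetes.default.svc "
            "--deploy-mode cluster local:///opt/app/etl_job.py"
        ),
    )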