As a senior technologist, you will be part of the strategic drive, working alongside multi-discipline teams through the full delivery lifecycle of complex data products and pipelines. With a clear understanding of large-scale data engineering and data science solutions, you will deliver use cases for one of our customers, developing practical solutions and implementing them to give this customer a competitive edge within the enterprise.
It is essential that you showcase a deep understanding of, and extensive hands-on experience in, architecture design and building solutions using a broad range of related tools, including Hadoop and real-time streaming technologies. It is also essential that you can demonstrate previous commercial experience with Azure cloud services and with distributed relational and NoSQL/graph databases.
You will have breadth of experience in both engineering and architecture across technology disciplines and the unique challenges of each, including software development, automated testing and quality assurance, and data and integration.
You will deliver high-quality, innovative data-driven solutions to our client's advanced analytical needs. Working within a team on the client's site, this will include: understanding the client's pain points, designing an analytical approach, implementing a solution, ensuring it is of high quality, and leading and mentoring multi-discipline technology teams.
Our ideal candidate
Minimum 10 years' experience
- Proven, strong data processing skillset, with experience in Azure and Hadoop tools and techniques, for example (not exhaustive):
o Azure ADF, Databricks, Azure Functions
o Good knowledge of real-time streaming applications, preferably with experience of Apache NiFi/Kafka real-time messaging or Azure Stream Analytics/Event Hubs
o HBase modelling and development
o Spark processing and performance tuning
o File formats and partitioning, e.g. Parquet
o Unix Shell Scripting
o Solid experience using source code control software (e.g. Git, Subversion)
o Multi-threaded Programming
- Strong experience architecting and delivering data solutions on cloud platforms
- Knowledge of and experience with Azure-based big data services
- Experience of synchronous and asynchronous interface approaches
- Experience of developing and deploying solutions that use the Elasticsearch stack
- Experience of designing and developing systems using microservices architectural patterns
- Experience implementing development, testing, release and deployment processes using DevOps practices
- Consultative approach to identifying client issues, and a commitment to delivering high-quality solutions
- Strong architectural design and implementation experience on projects using TDD, CI/CD, Kubernetes and Docker
- Fluent in English
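As a flavour of the multi-threaded programming mentioned above, here is a minimal sketch of concurrent batch processing with a thread pool. All names (process_record, process_batch, the sample records) are hypothetical and purely illustrative, not part of the role's actual codebase.

```python
# Illustrative sketch only: concurrent processing of a record batch with a
# thread pool. process_record and process_batch are hypothetical names.
from concurrent.futures import ThreadPoolExecutor

def process_record(record: dict) -> dict:
    # Placeholder transform (stand-in for real work, e.g. an I/O-bound
    # enrichment call): normalise the "name" field.
    return {**record, "name": record["name"].strip().lower()}

def process_batch(records: list, workers: int = 4) -> list:
    # map() runs records concurrently but returns results in input order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_record, records))

if __name__ == "__main__":
    batch = [{"name": "  Alice "}, {"name": "BOB"}]
    print(process_batch(batch))
```

A thread pool suits I/O-bound work like this; CPU-bound transforms at scale would instead be distributed, e.g. via Spark, as the requirements above note.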
Nice to Have
- Azure cloud or big data certifications
- Good knowledge of (if not hands-on experience in) at least one of Java, Scala or Python, with awareness of the others
- Experience delivering big data solutions using a leading Hadoop distribution such as Hortonworks or Cloudera