Curo Services

Senior Data Engineer - SC Cleared

Job description

Senior Data Engineer SC Cleared (RL6770)

Location - Bristol/WFH (Hybrid)


Salary - £70K - £80K per annum + Bonus

Benefits - Bonus, flexible working hours, career opportunities, private medical, excellent pension, and social benefits

The Client - Curo are collaborating with a global edge-to-cloud company advancing the way people live and work. They help companies connect, protect, analyse, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world.

The Candidate - This is a fantastic opportunity for a bright, driven, and customer-focused Senior Data Engineer. The ideal candidate will have any/all of the following: RHCE certification, Bash or Python scripting, automation using Ansible, Kubernetes administration, and basic knowledge of Spark.

Please note this role requires eligibility for SC Clearance.

The Role - As a Senior Data Engineer you will be responsible for delivering a variety of engineering services to customers worldwide. Assignments will vary based on the successful candidate's skills and experience. Typical assignments may involve cluster installation, ETL, solution design and application development, and platform services. The growing client base is made up of Fortune 50 companies, and the assignments will be challenging and immensely rewarding. Reporting to the Global Delivery Manager, this role offers the opportunity to learn and apply big data technologies and solve related complex problems.


  • Mastering the Data Fabric and Container Platform, including MapR-FS, MapR-DB Binary and JSON Tables, MapR-Streams, Kubernetes, and ecosystem products, maintaining proficiency and currency as the technology evolves and advances.
  • Achieving proficiency with cluster and framework sizing, installation, debugging, performance optimization, migration, security and automation.
  • Working effectively with MapR-DB Binary and JSON Tables: sizing, performance tuning, and multi-master replication.
  • Event Stream sizing, performance tuning, and multi-master replication.
  • Ensuring Professional Services engagements are delivered to the highest standards, on time and on budget.
  • Acting as a technical interface between the customer (Data Science/Data Analysts) and the delivery team, point of escalation between Customer and Product Engineering.
  • Providing best practice in exploiting the software to meet the Customer Use Cases.
  • Providing technical thought-leadership and advisory on technologies and processes at the core of the data domain, as well as data domain adjacent technologies.
  • Engaging and collaborating with both internal and external teams, as a confident participant as well as a leader.
  • Assisting with solution improvement.
  • Being a technical voice to customers and the community via blogs, User Groups (UGs), and participation at leading industry conferences.
  • Staying current in best practices, tools, and applications used in the delivery of professional service engagements.


  • 5+ years of experience administering any flavour of Linux
  • 3+ years of experience in architecting, administering, configuring, installing, and maintaining open-source big data applications, with focused experience on the MapR or CDP distribution.
  • 3+ years of hands-on experience supporting Hadoop ecosystem technologies.
  • Expertise in administration of MapR-DB/Hive/HBase/Spark/Oozie/Kafka.
  • Strong Scripting skills (Bash or Python preferred).
  • Familiarity with commercial IT infrastructures including storage, networking, security, virtualization, and systems management.
  • Good understanding of High Availability.
  • Able to implement Hadoop Data Security using Groups/Roles.
  • Ability to implement and manage Cluster security.
  • Ability to troubleshoot problems and quickly resolve issues.
  • Cluster maintenance, including creation and removal of nodes.
  • Performance tuning of Hadoop clusters and Hadoop MapReduce routines.
  • Screening cluster job performance and capacity planning.
  • Monitoring cluster connectivity and security.
  • Collaborating with application teams to install operating system and MapR updates, patches, and version upgrades when required.
  • Integration with other Hadoop platforms.
  • Familiarity with Ansible, Puppet, or Chef.
  • Familiarity with Kubernetes.
  • Proficiency in basic Java or Scala programming (preferred but not required).
  • Bachelor's degree in CS or equivalent experience.
  • Strong verbal and written communication skills are required.

To apply for this Senior Data Engineer permanent job, please click the button below and submit your latest CV.

Curo Services endeavours to respond to all applications, however this may not always be possible during periods of high volume. Thank you for your patience.

Curo Services is a trading name of Curo Resourcing Ltd and acts as an Employment Business for contract and temporary recruitment as well as an Employment Agency in relation to permanent vacancies.
