Octo

Data Engineer

Job description

You…

As a Data Engineer, you will join the team deploying and delivering a cloud-based, multi-domain Common Data Fabric (CDF), which provides data sharing services to the entire DoD Intelligence Community (IC). The CDF connects all IC data providers and consumers. It uses fully automated, policy-based access controls to create a machine-to-machine data brokerage service that enables the transition away from legacy point-to-point solutions across the IC enterprise.

Us…

We were founded as a fresh alternative in the government consulting community and are dedicated to the belief that results are a product of analytical thinking and agile design principles, and that solutions are built in collaboration with, not for, our customers. This mantra drives us to succeed and to act as true partners in advancing our clients' missions.

Program Mission…

The CDF program is an evolution in the way DoD programs, services, and combat support agencies access data, providing data consumers (e.g., systems and app developers) with a "one-stop shop" for obtaining intelligence, surveillance, and reconnaissance (ISR) data. The CDF significantly increases the Defense Intelligence Information Enterprise's (DI2E's) ability to meet the ISR needs of joint and combined task force commanders by providing enterprise data at scale. It serves as the scalable, modular, open architecture that enables interoperability for the collection, processing, exploitation, dissemination, and archiving of all forms and formats of intelligence data. Through the CDF, programs can easily share data and access new sources using their existing architecture. The CDF is a network- and end-user-agnostic capability that enables enterprise intelligence data sharing from sensor tasking to product dissemination.

Responsibilities…

Deploy best-of-breed commercial cloud and data sharing technologies to modernize the way DoD services and agencies share and analyze intelligence information across the enterprise. In this role you will:

  • Configure, tailor, and maintain the ingest and batch data management components in the cloud platform, such as Apache Kafka, NiFi, and HBase.
  • Coordinate with data owners to set up the CDF streaming and batch components, including their configuration parameters, and verify that they are working.
  • Document standard operating procedures (SOPs) for streaming configuration, batch configuration, or API management, depending on role requirements.

What we’d like to see…

  • DoD 8570 IAT Level II Certification
  • Demonstrated CentOS command-line proficiency and a working knowledge of REST
  • Understanding of foundational ETL concepts

Desired Skills:

  • Experience or expertise using, managing, and/or testing API gateway tools and REST APIs
  • Experience or expertise configuring an LDAP client to connect to IPA
  • Advanced organizational skills with the ability to handle multiple assignments
  • Strong written and oral communication skills

Years of Experience: 5+ years

Education: Bachelor's degree in systems engineering, computer engineering, or a related technical field

Location: Chantilly, VA

Clearance: US Citizen with the ability to obtain TS/SCI with CI Polygraph
