Barbaricum supports government clients with Integrated Communications, Mission Support, and Cyber Security/Intelligence. Our passion is innovation, and we are committed to client value and effective technical solutions. We are an ISO 9001:2015-certified, CMMI Level 3-appraised, Service-Disabled Veteran-Owned Small Business (SDVOSB) located in Washington, DC. Our mission is to transform the way the U.S. Government approaches problem sets of increasing complexity by delivering innovative solutions, particularly in support of National Security missions. Barbaricum is one of the fastest-growing companies in our market, routinely recognized by institutions such as Inc. Magazine, GovCon, AMEC, PRSA, and SmartCEO for corporate growth, capabilities, and award-winning client work. Our dynamic, agile team provides global support to current missions across five continents. We also focus on developing and maintaining a vibrant corporate culture, and were recently named a Best Workplace for 2017 by Inc. Magazine.


Springfield, VA

Senior Data Analytics Engineer

Barbaricum is seeking an experienced Analytics and Data Engineer to support an established development team embedded with a customer at the nexus of the Special Operations and Intelligence Communities. The team has built a hyper-responsive capability for countering unconventional cyber and kinetic threats: it can receive, store, process, and make sense of large-scale mission data at the speed of offensive and defensive operations. This capability enhances critical national security assets' ability to respond quickly to the Warfighter's counter-proliferation requirements, delivering technology solutions and services that incorporate innovative approaches and the fusion of diverse data sources and types.


Responsibilities

  • Collaborate with a team of software and data engineers to solve technical challenges, proactively identifying novel approaches and implementing solutions that leverage customer knowledge, project-based needs, and an understanding of the mission
  • Apply experience with PySpark and Elasticsearch to process, analyze, and search petabytes of data, developing analytics tools to better understand and operationalize data pertinent to intelligence challenges
  • Write algorithms that apply geo-temporal analytics across trillions of records to discover patterns
  • Provide direct support to a Tier-One SOF customer focused on achieving real-world outcomes every day, with the influence to remove roadblocks to effective coding in a classified environment


Qualifications

  • Active DoD TS/SCI Clearance (CI Polygraph preferred)
  • 7+ years of experience in computer science and/or software development
  • Bachelor's Degree in Computer Science or a closely related field of study, such as Statistics, Mathematics, or Engineering, provided the program of study used programming to solve problems
  • Experience with PySpark ingest, transforms, and aggregations
  • Experience with Elasticsearch queries and aggregations
  • Advanced coding skills in Python, with demonstrated ability to develop custom code solutions on-the-fly (this will be part of the interview process)
  • Exposure to Scala and/or C++ desired but not required
  • Experience planning and managing new application development architecture and engineering
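For context on the Elasticsearch query-and-aggregation work listed above, the sketch below builds a representative geo-temporal request body in plain Python: a bool query filtering on a geo-point field and a time range, with a date-histogram aggregation. The index mapping, field names (`location`, `@timestamp`), and coordinates are illustrative assumptions, not details from this posting.

```python
import json

# Sketch of a geo-temporal Elasticsearch request body.
# Field names ("location", "@timestamp") and the bounding box are
# hypothetical; a real index mapping would define its own fields.
query = {
    "query": {
        "bool": {
            "filter": [
                {"geo_bounding_box": {
                    "location": {
                        "top_left": {"lat": 39.0, "lon": -77.5},
                        "bottom_right": {"lat": 38.5, "lon": -77.0},
                    }
                }},
                {"range": {"@timestamp": {"gte": "now-7d/d", "lt": "now/d"}}},
            ]
        }
    },
    "aggs": {
        "events_per_day": {
            "date_histogram": {"field": "@timestamp", "calendar_interval": "day"}
        }
    },
    "size": 0,  # aggregation-only: skip returning individual hits
}

body = json.dumps(query)
```

A request body like this would be sent to the `_search` endpoint of an index; setting `"size": 0` keeps the response to aggregation buckets only, which matters at petabyte scale.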

Experience That Will Set You Up For Success

  • Current or previous mission-focused work experience supporting Special Operations and/or Intelligence Community customers
  • Experience with geospatial data and the development of geospatial-based analytical models
  • Experience with CSV, JSONL ingest to Delta Lake
  • Experience with NiFi for data transfers
  • Experience with DevOps tooling (especially GitLab and Ansible)
  • Experience with Neo4j or graph products
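As a small illustration of the CSV/JSONL ingest pattern mentioned above: in practice this would be PySpark reads (`spark.read.csv`, `spark.read.json`) written out as Delta tables, but the stdlib-only sketch below shows the two record shapes being normalized into a common form. The sample data and field names are hypothetical.

```python
import csv
import io
import json

# Hypothetical sample inputs; a production pipeline would read these
# from files with PySpark and write the result to a Delta Lake table.
csv_text = "id,lat,lon\n1,38.78,-77.18\n2,38.80,-77.20\n"
jsonl_text = (
    '{"id": 3, "lat": 38.79, "lon": -77.19}\n'
    '{"id": 4, "lat": 38.81, "lon": -77.21}\n'
)

def ingest_csv(text):
    # Parse CSV rows into dicts, coercing numeric fields.
    rows = csv.DictReader(io.StringIO(text))
    return [
        {"id": int(r["id"]), "lat": float(r["lat"]), "lon": float(r["lon"])}
        for r in rows
    ]

def ingest_jsonl(text):
    # JSON Lines: one JSON object per line.
    return [json.loads(line) for line in text.splitlines() if line.strip()]

records = ingest_csv(csv_text) + ingest_jsonl(jsonl_text)
```

Normalizing both formats into one schema up front is what makes the downstream transforms and aggregations format-agnostic.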