• Cloud Software Engineer

    Job Location US-MD-Annapolis Junction
  • Overview

    ASRC Federal - Vistronix is a national security solutions provider specializing in transforming big and complex data sets into mission-critical intelligence. Ingesting, processing, and exploiting Big Data is at the core of everything we do: Cyber & SIGINT Operations, C4ISR & Multi-INT Processing, and Enterprise & Open Source Analysis. As a national security middleweight, we have a passion for our customers' mission and value ingenuity, agility, speed, and the ability to think and deliver at scale. For more information, visit www.vistronix.com.


    ASRC Federal - Vistronix is currently seeking an experienced Cloud Software Engineer for a role on one of our subcontracts. This full-time position affords the successful candidate the opportunity to work in an engaging, high-tech development environment with an excellent team and customer base.

    Candidate must meet the following labor category qualifications:

    • Shall have at least 5 years of general experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution. (Note: A bachelor’s degree in computer science, engineering, mathematics or a related discipline may be substituted for 4 years of general experience.)
    • Shall have at least 3 years of experience developing software with high-level languages such as Java, C, or C++.
    • Shall have at least 2 years of experience with a distributed, scalable Big Data store (NoSQL) such as HBase, CloudBase/Accumulo, Big Table, etc., as well as 2 years of experience with the MapReduce programming model, the Hadoop Distributed File System (HDFS), and technologies such as Hadoop, Hive, Pig, etc.
    • Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
    • Shall have demonstrated work experience developing RESTful services and working with the Ruby on Rails framework.
    • Shall have at least 3 years of experience developing software in UNIX/Linux (Red Hat versions 3-5+) operating systems.
    • Shall have demonstrated work experience in the design and development of at least one Object Oriented System.
    • Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
    • Shall have at least 3 years of experience in software integration and software testing, to include developing and implementing test plans and test scripts.
    • Shall have demonstrated technical writing skills and shall have generated technical documents in support of software development projects.
    • Shall have demonstrated work experience with Source Code Management tools (e.g., Git, Stash, or Subversion).
    • In addition, the candidate shall have demonstrated experience (through work or college-level courses) in at least 2 of the desired qualifications listed below.

    Labor Category Desired Qualifications:

    • Experience deploying applications in a cloud environment.
    • Understanding of Cloud Scalability.
    • Hadoop/Cloud Certification.
    • Experience designing and developing automated analytic software, techniques, and algorithms.
    • Experience with taxonomy construction for analytic disciplines, knowledge areas and skills.
    • Experience developing and deploying: data driven analytics, event driven analytics, sets of analytics orchestrated through rules engines.
    • Experience with linguistics (grammar, morphology, concepts).
    • Experience developing and deploying analytics that discover and exploit social networks.
    • Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application program interfaces and other technical specifications.
    • Experience developing and deploying analytics within a heterogeneous schema environment.
    • Experience with LDAP protocol configuration management and cluster performance management (e.g., Nagios).

    Job Specific Preferred Qualifications:

    • Strong Java skills are required
    • Experience developing in Linux environments is required
    • Experience writing Hadoop / Map Reduce Analytics
    • Experience with Hbase or Accumulo databases
    • Configuring Apache Niagarafiles / Creating Apache Niagarafiles components
    • Tomcat
    • Maven
    • Experience with version control (Git/Subversion)
    • Development experience using Spring
    • Experience with customer’s GM and MASH architectures is a plus
    • Must be able to work in a team environment

    This position requires an active Security Clearance.


    Positions require a Top Secret security clearance, based on a current background investigation (SBI), as well as favorable completion of a polygraph. Clearance and polygraph processing will be completed by the U.S. Government.


    ASRC Federal and its Subsidiaries are Equal Opportunity / Affirmative Action employers.  All qualified applicants will receive consideration for employment without regard to race, gender, color, age, sexual orientation, gender identification, national origin, religion, marital status, ancestry, citizenship, disability, protected veteran status, or any other factor prohibited by applicable law.
