Rethinking computing for next-level problems
With current computer systems struggling to keep pace with ever more complex workloads, intelligence and defense agencies are looking for new approaches to solving data-intensive problems.
With the volume, variety, velocity and complexity of data threatening to overmatch the systems that must sift through it, intelligence and defense agencies are looking for new approaches to solving data-intensive problems.
The data the intelligence community focuses on is increasingly sparse, random and heterogeneous, creating data-intensive problems that today’s computers were not designed to solve, the Intelligence Advanced Research Projects Activity (IARPA) said in a Dec. 11 broad agency announcement.
IARPA’s Advanced Graphic Intelligence Logical Computing Environment (AGILE) program seeks to transform “massive, random, heterogeneous data streams and structures into actionable knowledge.” That task will require “system-level intelligent mechanisms for moving, accessing and storing large, random, time-varying data streams and structures that allow for the scalable and efficient execution of dynamic graph analytics workflows,” the agency said in the BAA.
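To make the challenge concrete, consider a toy version of a dynamic graph analytics workload, the kind of computation AGILE targets. The sketch below is purely illustrative and is not drawn from the BAA: edges arrive as a random, time-varying stream, and an analytic (here, a running triangle count) must be kept current incrementally, producing the sparse, pointer-chasing memory accesses that conventional, cache-oriented systems handle poorly.

```python
# Hypothetical sketch of a dynamic graph analytics workload: a random,
# time-varying edge stream with an analytic updated incrementally.
# The data and query are illustrative only, not from the IARPA BAA.
from collections import defaultdict
import random

adjacency = defaultdict(set)   # sparse, irregular graph structure
triangles = 0                  # analytic maintained incrementally

def insert_edge(u, v):
    """Add edge (u, v) and update the running triangle count."""
    global triangles
    if u == v or v in adjacency[u]:
        return
    # Each new edge closes one triangle per common neighbor -- a scattered,
    # pointer-chasing access pattern that grows harder as the graph grows.
    triangles += len(adjacency[u] & adjacency[v])
    adjacency[u].add(v)
    adjacency[v].add(u)

# Simulate a random, heterogeneous edge stream arriving over time.
random.seed(0)
for _ in range(100_000):
    insert_edge(random.randrange(10_000), random.randrange(10_000))

print(f"vertices: {len(adjacency):,}  triangles so far: {triangles:,}")
```

Even this toy version spends most of its time on scattered set lookups rather than arithmetic, which is why IARPA is asking proposers to rethink memory, interconnect and storage rather than bolt faster processors onto existing designs.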
Rather than enhancing system components like memory or processing, AGILE is looking for fresh approaches that fundamentally reimagine computer systems for current and future data-intensive operations. In fact, IARPA said that “proposed designs should not be constrained by existing component interfaces and protocols, legacy architectures, or current practices.”
AGILE will require new memory and interconnection architectures, massive data throughput, rapidly accessible high-density storage and advanced microelectronics. Architectural designs should demonstrate that they meet scaling and efficiency requirements, optimize the full integrated system and provide a technology pathway for future performance gains, the BAA said.
A proposers’ day will be held Dec. 22.
The Pentagon, meanwhile, is taking a more evolutionary approach.
The Defense Department’s High Performance Computing Modernization Program (HPCMP), which operates world-class supercomputing centers, the high-bandwidth Defense Research and Engineering Network and sophisticated software and security infrastructure, is looking at moving some of its high-performance computing and data analysis to the commercial cloud.
Traditionally, HPCMP upgraded its capabilities by bringing new supercomputers online and by improving security, bandwidth, coverage and performance through software and networking updates.
Now, after a decade of watching how cloud computing has been optimized for artificial intelligence, machine learning, big data analytics and digital engineering, HPCMP “recognizes a convergence of forces that suggest the time is right to consider augmenting its on-premise ecosystem with an investment in commercial cloud-based infrastructure,” it said in a Dec. 8 request for information.
Because commercial cloud vendors invest more in their systems, offer flexible capacity and pricing, provide environments for experimentation and can deliver access to new architectures faster than HPCMP can, commercial cloud makes sense, the RFI said, especially if moving some unclassified workloads to the cloud would free capacity for inherently governmental workloads.
Additionally, HPCMP said it may be called on to support new missions, such as AI, digital transformation and digital engineering, areas currently outside its mandate where the cloud could be beneficial.
HPCMP said it is looking for a cloud-agnostic, high-end computing ecosystem to augment its on-premise environment and solve its research engineering, test, evaluation and acquisition engineering problem sets. Ultimately, it said it wants a commercial cloud that can “interface with the HPCMP ecosystem in a way that complements/augments on-premise assets and improves overall operating efficiencies and costs.”
Responses to the DOD solicitation are due Feb. 15.