(extended abstract)
Accepted for Publication in Air and Space Power Journal
Samuel D. Bass, Maj, USAF, Air Force Institute of Technology
Rusty O. Baldwin, PhD, Air Force Institute of Technology
The Department of Defense is in the midst of transforming its vast collection of information technology systems into an interconnected Global Information Grid (GIG). The GIG will ultimately connect sensors to weapons systems, enable personnel to share information at will, and provide unprecedented levels of situational awareness to commanders at all levels. However, if the GIG is not implemented with a proper level of restriction on the flow of information, warfighters risk being overwhelmed not only by too much information, but also by information presented at the wrong time, at the wrong level of detail, and without proper analysis and interpretation. In this article, we propose a model to prevent this by directing the flow of information based on its classification level, its integrity, and its relevance to the end user.
The Global Information Grid
In response to increasing difficulties sharing information between various platforms and information systems operating in the joint environment, the Department of Defense created the concept of the GIG. DOD policy defines the GIG as “a globally interconnected, end-to-end set of information capabilities, associated processes and personnel for collecting, processing, storing, disseminating and managing information on demand to warfighters, policy makers, and support personnel.” Established GIG policies also implement key components of the 1996 Clinger-Cohen Information Technology Management Reform Act, including information security, revised acquisition strategies, and best practices for handling data at all levels of the DOD. While many of the efforts in developing the GIG might simply be the application of DOD acquisition best practices to the still-maturing field of information technology, the goal of achieving Information Superiority remains the primary objective of the overall GIG effort. Connecting personnel and equipment with advanced information sharing tools will likely revolutionize our capabilities, but the quality and volume of information presented to the warfighters of tomorrow must be carefully managed.
The Sand Table
Military commanders have used various models to understand the battlespace for centuries. In the seventeenth century, campaign planners used intricate craftsmen-built scale models of fortifications to analyze points of vulnerability and routes of attack. In the field, leaders have long used sticks and stones in the sand to rehearse maneuvers and depict unit locations and terrain. Aircraft and anti-aircraft technology increased the complexity of the sand table by adding important air components to the planning process. New technology used in Desert Storm provided commanders and bomb-damage analysts a live view from the cockpit and, in many cases, from the weapons as they flew into targets. Today, command centers at all levels are equipped with large data walls, where interesting computer or video feeds provide a constant flow of data. Live video feeds from remotely piloted Predator aircraft are fed into Air Operations Centers (AOC), providing commanders and intelligence analysts with what some call “Predator Crack” or “Kill TV” because of the display’s ability to divert the viewer’s full attention away from their primary duties. What is shown on the displays and who is responsible for the content are frequently asked questions, and they raise an even broader and more important question for the future GIG-enabled command center: how will we manage all of the data that will be available on all of the interconnected platforms?
The Problem of Inverted Perspectives
As prescribed in Joint Doctrine, operations are planned to follow the principles of war, which include among others: surprise, simplicity, security and unity of command. Numerous historical examples illustrate how friendly or hostile knowledge of certain components of plans drastically altered the results of those plans. Still others demonstrate that an ill-considered reaction, or a failure to respond, to evolving circumstances has a drastic impact on the operation and on the effectiveness of the leadership involved. Rather than exploring the successes and failures of operations with respect to the principles of war, consider the implications of operating a GIG-enhanced command center of the future.
For example, a suite of sensors programmed to detect personnel and vehicle movement could collect and report status that might be displayed on a command center data wall, indicating a maneuver by an unknown unit. If this maneuver were that of a friendly special operations mission planned and executed in secrecy, access to this sensor data should be restricted to the classification level of the mission and not automatically displayed on a data wall where uncleared personnel might be present. Conversely, if a similar sensor suite detected the footsteps of a single individual in a restricted area, the data collected by this sensor should be presented only to appropriate security personnel, and would likely not be displayed on the same data wall. If the commander’s attention were directed to an unprocessed data point like this, it could force the commander into an inverted perspective, in which focus on the broader picture is diverted by a single piece of potentially irrelevant data. Similar scenarios could be developed to illustrate how a tactical unit on the ground might see data intended only for a strategic view; any changes to the actions of that tactical unit might eliminate a key component of a strategic plan. We assert that the inverted perspective condition is a very real hazard of the information that could be available in a GIG-enhanced battlefield.
In an ideal environment, we would deploy thousands if not millions of sensors across the battlespace to collect climate, audio, video and electromagnetic signal data. Additionally, airborne command and control assets would compose an integrated picture of the battlespace. Current processes and tools such as Air Tasking Orders help deconflict the airspace, but some operations conducted on the ground or on the sea might not be coordinated with all components. A robust sensor net would provide a bridge between these dissimilar components of the battlespace to help prevent incidents of friendly fire, but the composite picture would likely not be relevant to some warfighters. In total, the amount of information collected will be immense and the details of the battlespace available for display will prove tempting to warfighters and leaders at all levels. GIG-enhanced aircraft will have access to a vast amount of information. However, with this comes the potential for unprocessed sensor data to make its way into the cockpit where pilots with increased sensitivity to collateral damage and escalation could be forced to change tactics, select alternate targets, or abort the engagement.
Any number of examples could be presented to demonstrate that data in the GIG should have limits placed on its exposure to avoid the inverted perspectives condition, while still more examples could illustrate that any restrictions on information flow could reduce flexibility. Considering both sides of this argument, we assert that limits should be placed on where the data is automatically transmitted as well as who is authorized to access it. We must also consider that some platforms, as then-Lt. Gen. William T. Hobbins indicated during an interview with Airman magazine, will produce data at different rates while operators in varying roles will consume data feeds at different rates, adding more considerations for a potential solution. Clearly, this paints an amazingly complex picture with fuzzy and continuously evolving operational requirements.
Current Information Flow Management
We are all familiar with the national security classification levels used throughout the US government. Data protected at a higher classification level such as “Secret” can be read only by users with a need to know who hold Secret or higher clearances. Similarly, readers holding a high-level clearance can normally read any material at or below that level, assuming they need to know. In a conceptual GIG-enabled virtual command center, information specific to a sensitive operation could be classified at a high enough level to prevent those who hold lower-level clearances from reading the data. Furthermore, display of data relevant to those classified operations could be reserved for those with the required need to know. Additionally, data on command center displays must be limited to the lowest clearance level of personnel with access to the displays.
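As a minimal sketch of these two constraints, the read-down rule and the display ceiling can be expressed with an ordered enumeration of the levels named in this article (the function names and linear ordering are illustrative assumptions, not taken from any fielded system):

```python
from enum import IntEnum

class Level(IntEnum):
    """Classification levels, ordered so numeric comparison expresses dominance."""
    UNCLASSIFIED = 0
    FOUO = 1          # For Official Use Only
    SECRET = 2
    TOP_SECRET = 3

def can_read(clearance: Level, need_to_know: bool, object_level: Level) -> bool:
    """Read-down rule: a subject may read an object only if the subject's
    clearance dominates the object's level and a need to know exists."""
    return need_to_know and clearance >= object_level

def display_ceiling(personnel_clearances: list) -> Level:
    """A shared display may show nothing above the lowest clearance held
    by anyone with access to it."""
    return min(personnel_clearances)
```

Under this sketch, a data wall visible to both Top Secret- and Secret-cleared personnel is capped at Secret, so higher-classified sensor feeds would be withheld from it automatically.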
Using a well-disciplined approach, data from all sources could be properly secured or sanitized enough to prevent users from seeing information not cleared for their consumption. But thus far, we have only addressed the proper treatment of data with respect to confidentiality. The integrity or trustworthiness of the data is also of prime importance, particularly in urban areas where the need for very accurate and timely data is great, and therefore, so is the need to rapidly evaluate raw data and prepare it for presentation to leadership. Normal data classification techniques do not classify information based on its integrity, so we need to explore a method to help categorize data that could cause an inverted perspective hazard in a GIG-enhanced picture of the battlefield, whether that data is unprocessed remote sensor data or imagery that has not yet been evaluated by intelligence personnel.
A Proposed Flow Management Scheme
Our information sharing mechanism must enable meaningful and adaptive information sharing capabilities within a command center. Consider a command center staffed with personnel of varying clearances and areas of functional expertise, similar to other command centers such as wing command posts (WCPs), expeditionary operations centers (EOCs), or AOCs. As in Biba’s model, both personnel and systems can create and consume data and are referred to as subjects, while the documents or virtual products produced are referred to as objects. Our information-sharing mechanism assigns three ratings to every subject and object: classification, relevance, and integrity.
Suppose the classification levels for subjects and objects are Unclassified, For Official Use Only, Secret, or Top Secret. For simplicity, our model will not address clearance caveats or clearances for personnel from other countries, but they could be readily incorporated. The relevance and integrity levels of subjects and objects will be Low, Medium or High. Personnel classification levels normally do not change over time, but personnel can induce and experience changes in integrity levels and will produce objects of varying relevance levels. Similarly, documents and processing systems often carry the same ratings as their content or inputs. For our command center, we propose a set of rules, based on those proposed by Lipner, with which all information sharing transactions must comply; their implications are discussed in the following section.
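As a minimal sketch of how such three-axis labels might be checked in software (assuming, for illustration, a Bell-LaPadula-style read-down rule for classification, a Biba-style integrity floor, and a simple relevance threshold; the exact rule set may differ), consider:

```python
from dataclasses import dataclass
from enum import IntEnum

class Classification(IntEnum):
    UNCLASSIFIED = 0
    FOUO = 1
    SECRET = 2
    TOP_SECRET = 3

class Rating(IntEnum):
    LOW = 0
    MEDIUM = 1
    HIGH = 2

@dataclass
class Object:
    """A document or virtual product, labeled on all three axes."""
    classification: Classification
    integrity: Rating
    relevance: Rating

@dataclass
class Subject:
    """A person or system: a clearance plus thresholds for what it will consume."""
    clearance: Classification
    min_integrity: Rating   # integrity floor: do not consume less-trusted data
    min_relevance: Rating   # relevance filter: do not surface marginal feeds

def may_consume(s: Subject, o: Object) -> bool:
    return (s.clearance >= o.classification      # confidentiality: read down only
            and o.integrity >= s.min_integrity   # integrity floor
            and o.relevance >= s.min_relevance)  # relevance threshold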
Back in the Command Center
To implement these rules in a command center, some processes will be completely automated, others will be handled exclusively by personnel in various career fields or leadership positions, and several rules will require both systems and personnel. Once objects are transferred to paper form, traditional processes such as classification controls and need-to-know restrictions are personnel responsibilities, while digital information flow can be restricted using various mechanisms. Rules three and five, however, require humans to interpret data and make changes to integrity and relevance levels based on that interpretation. Intelligence and operations personnel will normally be in the best position to change integrity and relevance levels, depending on the specifics of the situation. To enforce both rules, the processes must be well understood and the mechanisms used to effect integrity and relevance changes must be properly restricted.
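One way such a restriction might be enforced in software is to gate every integrity or relevance change behind an explicit authorization check and record it for review (the role names and audit mechanism here are illustrative assumptions, not doctrine):

```python
# Illustrative sketch: only designated roles may relabel, and every change
# is logged, since relabeling mechanisms must be restricted and reviewable.
AUTHORIZED_RELABEL_ROLES = {"intelligence", "operations"}  # assumed role names

audit_log = []

def relabel(actor_role: str, obj: dict, field: str, new_value: str) -> None:
    """Change an object's 'integrity' or 'relevance' label, if authorized."""
    if field not in ("integrity", "relevance"):
        raise ValueError(f"not a relabelable field: {field}")
    if actor_role not in AUTHORIZED_RELABEL_ROLES:
        raise PermissionError(f"role '{actor_role}' may not change {field}")
    audit_log.append((actor_role, field, obj[field], new_value))
    obj[field] = new_value
```

An intelligence analyst could thus promote a raw sensor track from low to medium integrity after evaluating it, while a request from an unauthorized role would be rejected and leave the label untouched.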
Conclusion
It is clear that we operate in a politically complex environment, and many operations are conducted in the focal point of a 24-hour news cycle. Missed opportunities to engage high-value targets and incidents of collateral damage are equally likely to become headlines, and both can raise questions about our military effectiveness. As a result, commanders' appetite for information will continue to grow, as will demands that future systems be interconnected via the GIG. Our efficiency and our ability to rapidly fuse, analyze and convert raw data into actionable intelligence will depend on the capabilities of future systems and the processes that govern their implementation. We believe that the Classification, Integrity and Relevance rules described above will help guide the development of systems that maximize data fusion while avoiding the pitfalls of conditions such as inverted perspectives. Significant development of these rules using a simulated command center and information processing systems is needed, but we believe the examples in this article demonstrate the benefits of using them.