CISO Perspectives: Deep packet inspection
Commentary: Harnessing the power to interrogate inbound traffic for sensitive federal networks is the right thing to do. But why then has it taken federal CISOs so long to make the DPI plunge?
Deep packet inspection (DPI) is a technical process for
examining and interrogating the header information as well as the
payload of network traffic as it traverses network demarcation
points. It has become an essential tool for identifying the
presence of viruses, spam, rootkits, malware and other forms of
malicious logic or non-protocol-complying traffic--and deciding
whether to allow, deny or redirect selected traffic.
DPI merges intrusion detection system (IDS) and firewall functionality into a single security engine that sits 'in-line' on one device. The DPI engine inspects each packet as it traverses the firewall and allows or rejects it based on predefined rules, which typically combine signatures, heuristics, and statistical or anomaly-based logic.
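To make that decision flow concrete, here is a minimal illustrative sketch in Python of how such a rule engine might behave. The rule and packet structures, the field names and the 'EVIL_BEACON' signature are invented for illustration; real DPI appliances implement this matching in optimized hardware and firmware, not in a script like this.

    # Illustrative only: a toy first-match rule engine, not a real DPI implementation.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Rule:
        name: str
        action: str                                 # "allow", "deny" or "redirect"
        protocol: Optional[str] = None              # e.g. "tcp", "udp"
        dst_port: Optional[int] = None
        payload_signature: Optional[bytes] = None   # byte pattern to look for in the payload

    @dataclass
    class Packet:
        protocol: str
        src_ip: str
        dst_ip: str
        dst_port: int
        payload: bytes

    def inspect(packet: Packet, rules: list[Rule], default: str = "allow") -> str:
        """Return the action of the first rule whose header and payload criteria all match."""
        for rule in rules:
            if rule.protocol is not None and rule.protocol != packet.protocol:
                continue
            if rule.dst_port is not None and rule.dst_port != packet.dst_port:
                continue
            if rule.payload_signature is not None and rule.payload_signature not in packet.payload:
                continue
            return rule.action                      # first matching rule decides the verdict
        return default                              # nothing matched; apply the default policy

    # Example: deny outbound HTTP packets carrying a hypothetical malware beacon string.
    rules = [Rule(name="example-beacon", action="deny", protocol="tcp",
                  dst_port=80, payload_signature=b"EVIL_BEACON")]
    pkt = Packet("tcp", "10.0.0.5", "203.0.113.7", 80, b"GET / HTTP/1.1\r\nEVIL_BEACON")
    print(inspect(pkt, rules))                      # -> "deny"

Production engines layer heuristic and anomaly-based scoring on top of signature matches of this kind, and they must do it at line rate, which is why the capability is delivered as an in-line device rather than software of this sort.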
Harnessing the power of DPI technology to interrogate inbound
traffic destined for sensitive federal networks and systems is
undoubtedly the right thing to do. Why then has it taken federal
CISOs (chief information security officers) so long to make the DPI
plunge? Unfortunately, the answers are not trivial, and they
are often the same ones that plague technology implementations
across the federal government.
A robust DPI implementation presupposes a number of existing
conditions, some of which, unfortunately, continue to be major
challenges for chief information officers as well as CISOs. These
include:
- Precise enumeration of all protocols and application traffic
traversing federal networks
- Centralized control and management of all network and security
operations across the various operational units within large
government agencies
- A 'tipping of the scales' in the traditional compliance-vs.-risk model employed by most federal CISOs in favor of risk over compliance
- A willingness to allocate sufficient budgetary resources to security, despite a pervasive reluctance to do so.
Looking at each of these challenges more specifically:
Enterprise Enumeration
Ideally, there should be little uncertainty about precisely which protocols and application traffic traverse federal networks. In reality, however, the prevailing stove-piped approaches to acquiring and deploying federal systems and applications force federal CIOs and CISOs into a constant reactive state, due in part to their inability to quickly and accurately gauge the traffic flowing across their networks. Among
the many downstream effects of this: CISOs have to be overly
cautious to ensure the proper degrees of due diligence before
enabling the 'Active Mode' configuration of their DPI devices.
Their fear: the possibility of self-inflicted denial of service,
whereby otherwise legitimate traffic could get blocked.
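To illustrate why that caution matters, the sketch below (hypothetical Python, with an invented handle_verdict() helper) contrasts a passive, monitor-only deployment, which merely logs what would have been blocked, with an active in-line deployment that actually drops the packet. With an incomplete picture of legitimate traffic, only the active path can turn a misclassification into an outage.

    # Illustrative only: passive (detect-and-log) vs. active (in-line blocking) handling.
    import logging

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("dpi")

    def handle_verdict(verdict: str, packet_summary: str, active_mode: bool) -> bool:
        """Return True if the packet is forwarded, False if it is dropped."""
        if verdict == "deny":
            if active_mode:
                log.warning("BLOCKED %s", packet_summary)   # in-line: traffic is actually dropped
                return False
            # Monitor mode: record what *would* have been blocked but let it through,
            # so an incomplete rule set cannot cause a self-inflicted denial of service.
            log.info("WOULD BLOCK (monitor mode) %s", packet_summary)
        return True

    # A misclassified but legitimate agency application flow is merely logged in monitor mode...
    handle_verdict("deny", "tcp 10.1.2.3:49152 -> 198.51.100.9:8443 (custom agency app)", active_mode=False)
    # ...but is dropped outright once Active Mode is enabled.
    handle_verdict("deny", "tcp 10.1.2.3:49152 -> 198.51.100.9:8443 (custom agency app)", active_mode=True)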
Lack of centralized control/management
Another challenge is that the ownership, control, management and maintenance of federal information technology assets and operations do not typically fall within the direct and immediate responsibility of federal CIOs and CISOs. Large government
agencies are still struggling to meet the spirit and intent of the
decade-old provisions of the Clinger-Cohen Act, and thus, the
governance of IT across the Federal enterprise remains disjointed
and fragmented. As a result, shadow IT organizations outside the
agency CIO organizations, resourced by and embedded within business
and functional organizations, are providing localized IT support.
The consequent lack of centralized control and management over
these disparate IT functions increases the technological and
organizational complexities involved in the deployment of
active-mode technologies such as DPI, among other challenges.
Compliance over Risk
As organizational entities, federal agencies are among the most heavily audited anywhere. An unintended consequence of these excessive audits is that the dominant cyber security philosophy employed by most federal CISOs now favors compliance management over true risk management. When asked to prioritize the allocation of their meager budget dollars, many CISOs are therefore left with little choice but to direct them at the highest compliance 'pain points.'
From the perspective of frontline cyber security practitioners
or incident responders, DPI continues to hold much promise as a
technology that swings the incident response pendulum toward active
detection and response and away from passivity and reaction.
As with most technologies, however, DPI is not without downsides to adoption and deployment. Most of the criticism of DPI so far has centered on potential civil liberties violations, specifically expectations of privacy and net neutrality, i.e., the non-discrimination of packets and open access to networks.
Despite the obvious appeal of DPI in adding a much-needed proactive layer to otherwise passive defense-in-depth security architectures, federal CISOs have been slow to adopt it for the reasons outlined above. Holding off will no longer be an option, however, given the latest requirements coming from the White House in the form of the new 'Cyber Initiative,' created in response to recent surges in increasingly complex and aggressive nation-state-sponsored cyber attacks against U.S. government and civilian network infrastructures.
The most significant publicly disclosed component of the new Cyber Initiative is the reduction of the federal government's approximately 4,300 Internet access points to roughly 100, and the deployment of 'Einstein' sensors on the formally approved federal government Internet access points.
Einstein (Versions I/II) is a Department of Homeland Security (DHS)-sponsored DPI technology that will collect information about federal traffic flows, looking at source, destination and protocol types, as well as payload data. The technical details of how the information derived from these sensors will be consumed, and by whom, are still held closely within the government. The jury is still out, however, on whether Einstein can provide the same degree of instrumentation, scalability and robustness as commercial DPI technology.
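As a generic illustration only (Einstein's actual record formats have not been publicly disclosed, and these field names are invented), a flow record covering the kinds of attributes described above might look something like this:

    # Illustrative only: a generic flow record covering the attributes named above.
    # This is NOT Einstein's actual data model.
    from dataclasses import dataclass

    @dataclass
    class FlowRecord:
        src_ip: str            # source of the traffic flow
        dst_ip: str            # destination of the traffic flow
        protocol: str          # protocol type, e.g. "tcp", "udp", "icmp"
        payload_sample: bytes  # payload data retained for deeper inspection

    record = FlowRecord("192.0.2.10", "198.51.100.20", "tcp", b"GET /index.html HTTP/1.1")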