5 steps to secure your data center
With the advent of cloud computing, rich Internet applications, service-oriented architectures and virtualization, data center operations are becoming more dynamic, with fluid boundaries.
The shift toward a new computing environment adds layers of complexity that have broad implications for how information technology managers secure the components of a data center to protect data from malicious attack or compromise.
Organizations should bake security into the design of the data center, said Henry Sienkiewicz, technical program director of the Defense Information Systems Agency’s Computing Services Directorate.
“The data center is an entire ecosphere, which has to be looked at as individual components but also holistically,” Sienkiewicz said. “If we don’t do that we will miss something.”
IT managers are increasingly looking to ensure completely secure transactions, which means securing everything from the desktop and network to applications and storage, said Jim Smid, data center practice manager for Apptis Technology Solutions. Traditionally, the network support team secured the networks, and the application team would handle data encryption. But the present and emerging environments call for new methods, Smid said.
Organizations need to monitor that all data center operations interact correctly and that each element of the data center is secure, he said.
In addition, federal agencies must manage the policies, people and technologies needed to secure dynamic, fluid data centers. Every security layer is important, and data center security managers and industry experts say it is difficult to single out any one as more critical than another.
Organizations such as the Cloud Security Alliance, a coalition of industry leaders, global associations and security experts, have published guidance to promote best practices and provide education on the use of cloud computing. The consortium has released guidelines that cover 15 security domains, ranging from computing architecture to virtualization, that organizations can apply to data center security.
However, based on conversations with several data center security managers and industry experts, GCN has compiled a list of five things to consider when securing a data center.
Agencies' first step is to consider whether they want to continue to maintain their own data centers or outsource the task, Smid said. Many agencies are carefully evaluating that option, trying to identify an application that might be suitable for outsourcing to test the waters.
During the past six months, transparency has emerged as a major factor that will influence agencies' data center decisions, Smid said. For government agencies to feel comfortable with outsourcing — whether to another agency or an outside company — they will want to know the security level of their data.
1. Get physical — control physical access to the data center
Some data center managers might start with harder tasks, such as controlling access to each system or the network layer. But Corbin Miller, IT security group manager at NASA’s Jet Propulsion Laboratory, prefers to start by locking down physical security to the data center.
“A lot of people will forget the physical because there is so much on the network side,” Miller said.
The Federal Aviation Administration's data center in Oklahoma takes a layered approach to security, said Mike Myers, former enterprise services center IT director at FAA, during a recent Federal Computer Week e-seminar on data center security. At that center, physical security includes a fenced-off campus; badge access to the main building and data center; a guard who escorts visitors; key card admittance to rooms; video surveillance of the data center; and locked cages for servers, depending on the sensitivity of the data they contain.
Miller also is working to establish layers of physical security at JPL’s data center to partition testing, development and production areas.
The center’s manager wants to set up a development laboratory in the data center, but Miller wants to keep it separated from the production area that is home to systems that keep JPL’s operations running.
“I want to keep the production zone at the highest security level,” Miller said, admitting only authorized systems administrators into the area.
“So I envision three zones within that one data center,” he said. One zone would let researchers test and stage equipment; a second would impose more control over development work on applications and systems before they move into production; and the production zone would be open only to core systems administrators.
For the inner layers, you don’t necessarily need badge access, but you should have some type of access control, such as locks on server racks, for production systems, Miller said. “I just don’t want the accidental lab guy that is working on his equipment to say, ‘I need power. Let me plug it in here,’ and he overloads the power circuit for production,” he said.
2. Establish secure zones in the network
After you have physical security processes in place, the hard work begins: securing the network, Miller said.
“I would concentrate on zoning into the network layer,” he said. At JPL, “the first zone is a little bit looser environment because it is a development area. The next one is a test subnet, which is isolated from the random traffic of the development area but looser than the production area,” Miller said.
The third zone, the production or mission support subnetwork, is where systems administrators spend a lot of time and effort. That zone has only approved production equipment — so administrators must deploy new systems to the production network in a controlled manner, Miller said.
At JPL, administrators can physically or virtually deploy systems to subnetworks attached to virtual local-area networks, and they can set strict rules about incoming and outgoing traffic. For example, administrators could deploy mail servers that communicate over Port 25 or Port 80 without affecting approved traffic in that zone.
Data center managers need to consider the types of business that various subnetworks will handle, Miller said. Applications such as e-mail and some database-monitoring activities would use ports that link to the outside world. However, those production machines shouldn’t be going to CNN.com, ESPN.com or Yahoo News. After administrators set those rules, they can better detect anomalous activity, Miller said.
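As a rough sketch of that kind of rule-setting, the Python below encodes a per-zone egress policy as data. The zone names and port lists are hypothetical stand-ins, not JPL's actual configuration.

```python
# Hypothetical per-zone egress policy: each zone lists the outbound
# TCP ports its machines are approved to use. Zone names and port
# choices are illustrative only.
EGRESS_POLICY = {
    "development": {22, 80, 443, 8080},  # looser: development traffic allowed
    "test":        {22, 443},            # isolated test subnet
    "production":  {25, 80},             # approved mail and Web traffic only
}

def egress_allowed(zone: str, dest_port: int) -> bool:
    """Return True if outbound traffic on dest_port is approved for the zone."""
    return dest_port in EGRESS_POLICY.get(zone, set())

print(egress_allowed("production", 80))    # True: approved Web traffic
print(egress_allowed("production", 8080))  # False: not on the production list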
By the time a machine moves onto a production network, you should know what it is running, who has access to the operating system layer and what other systems it is communicating with across the network, Miller said.
“Now you can better understand where to put your security monitors or data leakage prevention monitors,” he said. “If I know only three machines are going to be talking on the Web, it is easy for me to watch traffic and look for specific things.”
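A minimal sketch of that monitoring idea follows, assuming flow records arrive as (host, destination port) pairs; the host names are invented for illustration.

```python
# Only these (hypothetical) production hosts are approved to talk on the Web.
WEB_APPROVED_HOSTS = {"prod-web-01", "prod-web-02", "prod-web-03"}
WEB_PORTS = {80, 443}

def flag_anomalies(flow_records):
    """Yield flows in which a non-approved host talks on a Web port."""
    for host, dest_port in flow_records:
        if dest_port in WEB_PORTS and host not in WEB_APPROVED_HOSTS:
            yield host, dest_port

flows = [("prod-web-01", 443), ("prod-db-07", 80)]
for host, port in flag_anomalies(flows):
    print(f"anomalous flow: {host} on port {port}")  # flags prod-db-07
```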
Although wireless networks are popular, wireless access points are not necessary in a data center, Miller said; they are difficult to control even with RADIUS and two-factor authentication.
DISA takes a three-pronged approach to data center security, Sienkiewicz said.
The first part is NetOps, the operational framework that ensures that the Defense Department’s Global Information Grid has availability, protection and integrity. The second is technical protection, and the final piece is accreditation and certification of applications, he said.
NetOps consists of GIG Enterprise Management, GIG Network Assurance and GIG Content Management. DISA has devoted specific people, policies, processes and business support functions to operating NetOps, Sienkiewicz said.
On the technical side, the DOD demilitarized zone is a focal point. All of the Defense Enterprise Computing Centers’ traffic funnels through DOD and DISA demilitarized zones.
As a result, Internet connections to DOD Web servers are inspected and managed from the Internet access point all the way to the host machine. There also is physical separation between Internet-accessible Web infrastructure and other DECC infrastructures and logical separation between users and server types.
With that setup, DISA can limit access points, manage command and control, and provide centralized security and load balancing across the environment.
Additionally, “we run an out-of-band network, so production traffic does not cascade into the way we manage the infrastructure,” Sienkiewicz said. Through virtual private network connections, users can manage their own environments. The VPN connections provide paths for production hosts to send and receive enterprise systems management traffic.
3. Lock down servers and hosts
At the FAA facility in Oklahoma, all servers are registered in a database that contains contact information and details about whether the servers contain privacy information. Most of the database is manually maintained, but the process could be improved by automation, Myers said. Problem areas have been change and configuration management — some of those processes are automated, and some are manual. FAA is working with Remedy software to improve automation.
Server security is standardized and subjected to Statement on Auditing Standards No. 70 (SAS 70) and annual inspector general audits. FAA also has standardized on National Institute of Standards and Technology security checklists that are available on the agency's Web site. In addition, FAA is implementing patching programs and tracking vulnerabilities on servers by scanning them at least monthly.
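A registry of the kind Myers describes can be sketched as a simple data structure. The fields and the 30-day threshold below are assumptions drawn from the details in the article (contact information, privacy flags, at-least-monthly scans).

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ServerRecord:
    hostname: str
    contact: str              # responsible administrator
    holds_privacy_data: bool  # drives extra handling requirements
    last_scanned: date        # date of the last vulnerability scan

def overdue_for_scan(servers, today=None, max_age_days=30):
    """Return servers whose last vulnerability scan is older than max_age_days."""
    today = today or date.today()
    cutoff = today - timedelta(days=max_age_days)
    return [s for s in servers if s.last_scanned < cutoff]

fleet = [ServerRecord("app-01", "j.doe", True, date(2009, 5, 1))]
print(overdue_for_scan(fleet, today=date(2009, 7, 1)))  # app-01 is overdue
```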
FAA handles data security separately from server security. The agency is doing more appliance-based encryption than software encryption, which it has found too restrictive and prone to system compatibility problems. The agency set up firewalls to separate private data from government data. FAA also uses scanning technology to monitor data in motion for potential privacy leaks. The scanning technology seeks to ensure that data goes to the right recipients and is properly encrypted, Myers said.
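A toy version of such a data-in-motion check might pattern-match outbound payloads for markers of privacy data. Real products are far more sophisticated; the Social Security number pattern here is only one illustrative marker.

```python
import re

# Illustrative marker of privacy data: a U.S. Social Security number pattern.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def contains_privacy_data(payload: str) -> bool:
    """Flag outbound payloads that appear to carry privacy data in the clear."""
    return bool(SSN_PATTERN.search(payload))

print(contains_privacy_data("invoice 42, total $310.00"))   # False
print(contains_privacy_data("applicant SSN: 123-45-6789"))  # True: block or encrypt
```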
At DISA, data center managers are working to address security concerns caused by server virtualization. “When we look at virtualization, that notion of how do we increase server virtualization has brought on new security issues,” Sienkiewicz said.
DISA security managers must answer questions such as how to ensure that hypervisors are locked down and how to make sure that additions, deletions and moves are properly protected. VMware has been a good partner in helping DISA work through the security attributes of virtualization, he said.
“One of the big things we have used for virtualization is separation and isolation,” Sienkiewicz said. “We do try to separate applications, Web services, application services, database services into physical separate racks, so that there is no possibility of data linkage or spillage or something else happening in the environment.”
“We are also hardening other parts of the environment,” he said.
As with FAA, DISA requires hosts to be registered on a white list. The agency also has installed host-based security systems, which monitor and detect malicious activity.
“Are we completely there yet? No. Do we have an aggressive plan? Yes,” Sienkiewicz said. DISA's strategy is broad in scope but strives to protect each individual host.
Lastly, DISA is using the public-key infrastructure initiative it runs for DOD to manage physical security. Users must log on to systems with a Common Access Card, which provides two-factor authentication.
“That is giving us digital data signing — the ability to encrypt e-mail traffic — and it is a real-time certification program,” Sienkiewicz said.
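The digital data signing Sienkiewicz mentions can be illustrated with the open-source cryptography package for Python. An actual CAC keeps the private key on the card itself, so the in-memory key pair below is purely a stand-in.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Stand-in key pair; with a CAC, the private key never leaves the card.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

message = b"From: admin@example.mil\nSubject: status"
signature = private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# verify() raises InvalidSignature if the message or signature was altered.
private_key.public_key().verify(
    signature,
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("signature verified")
```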
4. Scan for application vulnerabilities
Application-scanning and code-scanning tools are important, NASA’s Miller said.
At JPL, if someone wants to deploy an application, it must undergo a scan before administrators release it in the production environment. Miller uses IBM Rational AppScan to look at Web applications. AppScan tests for vulnerabilities that hackers can easily exploit and provides remediation capabilities, security metrics and dashboards, and key compliance reporting.
On the other hand, developers who write their own code must run it through a code scanner, which could be a Perl script that looks for specific functions or a commercial product that scans source code for buffer overflows or other vulnerabilities that crept into the code, Miller said.
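Miller notes the scanner can be as simple as a script that looks for specific functions. Here is an equivalent sketch in Python rather than Perl, with an illustrative deny list of C functions often implicated in buffer overflows.

```python
import re
import sys

# Illustrative deny list: C functions commonly implicated in buffer overflows.
RISKY_CALLS = re.compile(r"\b(gets|strcpy|strcat|sprintf)\s*\(")

def scan_source(path: str):
    """Print each line of a source file that calls a function on the deny list."""
    with open(path, encoding="utf-8", errors="replace") as f:
        for lineno, line in enumerate(f, start=1):
            if RISKY_CALLS.search(line):
                print(f"{path}:{lineno}: {line.strip()}")

if __name__ == "__main__":
    for source_file in sys.argv[1:]:
        scan_source(source_file)
```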
5. Coordinate communication between security devices for visibility into data flows
With cloud computing, agencies need to change their whole approach to securing the data center, said Tim LeMaster, director of systems engineering at Juniper Networks.
“In cloud computing, it is about securing the data flows between data centers, client systems and data center, and between virtual machines within the data center,” LeMaster said. Therefore, application visibility becomes important.
“You have to have visibility into these flows to validate that the traffic is legitimate and is not malware [because] a lot of malicious traffic tries to mask itself as something else.” A lot of that traffic uses Port 80 or tunnels with Secure Sockets Layer encryption. Network administrators must have the knowledge and application identification tools to understand what that traffic is, he said.
Juniper has developed application identification technology that looks beyond port protocols to the context of the data and tries to apply signatures that help determine if an application is really a shareware program or peer-to-peer program.
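In spirit, application identification classifies a flow by its payload rather than its port. The sketch below is a drastically simplified illustration: the byte signatures are assumptions, and real engines combine much richer signatures with behavioral context.

```python
# Simplified payload signatures; real application-identification engines
# use far richer signatures plus behavioral context.
SIGNATURES = {
    b"\x16\x03": "tls",                   # TLS handshake record header
    b"GET ": "http",
    b"BitTorrent protocol": "bittorrent",
}

def identify_application(payload: bytes) -> str:
    """Classify a flow by payload content, ignoring the port it arrived on."""
    for magic, app in SIGNATURES.items():
        if payload.startswith(magic) or magic in payload[:64]:
            return app
    return "unknown"

# Peer-to-peer traffic tunneled over Port 80 still classifies as bittorrent.
print(identify_application(b"\x13BitTorrent protocol..."))
```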
Juniper's technology also focuses on application denial-of-service attacks. Such attacks are not new, but the traditional countermeasure was to “black hole” the traffic. That approach helps a denial-of-service attack accomplish exactly what it intended, denying service, because a network administrator must remove all traffic from the server that is under attack.
Application denial-of-service prevention software provides a profiling capability for administrators to determine if traffic is legitimate or not. With such tools, administrators can look at other data flows, such as client-to-server traffic, and compare them to flows that exist in other data centers or between servers — or between virtual machines within the virtualized data center.
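A crude version of that profiling capability baselines per-client request counts and flags sharp deviations; the median baseline and the factor of 10 are arbitrary choices for illustration.

```python
from collections import Counter
from statistics import median

def flag_suspect_clients(requests, factor=10):
    """Flag clients whose request count exceeds factor times the median count."""
    counts = Counter(requests)
    baseline = median(counts.values())
    return [client for client, n in counts.items() if n > factor * baseline]

# One client hammering the server stands out against normal traffic.
traffic = ["10.0.0.5"] * 500 + ["10.0.0.7", "10.0.0.8"] * 10
print(flag_suspect_clients(traffic))  # ['10.0.0.5']
```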
“You can have traffic between the virtual machines that can escape the normal security appliances or services you offer,” LeMaster said. Juniper has partnered with a company to offer an inter-virtual-machine firewall capability, he said.
“I would like my intrusion prevention to see that malicious worm [and] not just drop it but to talk with the SSL device and eliminate only that bad session,” he said.
The concept of coordinating networking devices, firewalls, SSL devices, and intrusion prevention solutions becomes useful in a cloud computing infrastructure. Juniper is working with the Trusted Network Connect Work Group, a consortium of users and service providers that published standards that will allow security components made by different companies to share information about a device, LeMaster said.