
Virtual reality: Adoption of virtualization

Adoption of virtualization is continuing – with caveats, says Guy Buzzelli, CIO of Delray Beach, Fla. Dan Kaplan reports.

Please forgive Guy Buzzelli if he sounds a little too matter-of-fact when discussing virtualization. The CIO of the city of Delray Beach, Fla., a 65,000-person beach town about an hour's drive north of Miami, says the process of migrating from a traditional IT environment to one powered by virtual machines (VMs) has gone about as smoothly as possible.

In other words, there were no frantic screams emanating from Buzzelli's 11-member department, which oversees IT for Delray Beach's 750 public employees, when it consolidated 42 physical servers down to five hosts running 45 VMs.

“It's just like we're putting up another server,” he says. “Some people may be scared to virtualize, but it's more fear of the unknown. We don't treat the virtualization servers any different than the physical servers when it comes to security. We treat them the same. Security is security.”

Most organizations, large and small, have accepted virtualization as the future of their data centers. Early pilots were largely confined to testing and development scenarios. Now, many organizations leverage virtualization – known for its speed, efficiency and agility in production environments – to deploy mission-critical workloads.

But it is also a technology nascent enough that its security ramifications have taken a backseat, in many organizations, to the immediate and significant cost savings, disaster recovery benefits and resource utilization gains that server consolidation delivers.

There also is an issue of buy-in and governance. According to an April report from Prism Microsystems that studied the state of virtualization and security, 85 percent of respondents said they have adopted the technology “to some degree.” However, the 302 IT managers who responded cited a lack of budget and staff expertise as the major inhibitors to safeguarding their virtual environments.

Budget decision-makers certainly like the cost benefits and productivity gains that virtualization presents, but they are not necessarily thinking about the need to properly secure it. “If you think about how humans behave, they sort of implement now and worry about it later,” says Steve Lafferty, vice president of marketing at Prism Microsystems. “That's kind of what's happening with virtualization. Nobody is saying that you have to put this much money aside for security.”

Complicating the picture even further, vendors have been slow to develop solutions that allow organizations to effectively secure and manage these unique environments.

“Research shows that customers' number one concern [around virtualization] is security,” says Frank Gens, senior vice president and chief analyst of IDC, in a December research note. “As IT resources are shared in a virtualized or cloud environment, customers worry about whether their applications and data will be more vulnerable to tampering, theft or loss.”

But nothing about virtualization makes it inherently riskier than the traditional IT layer, experts say. Even so, a recent Gartner report predicts that through 2012, 60 percent of virtualized servers will be less secure than the physical servers they replace, because companies will fail to involve the IT security team in deployment projects.

Most security experts cite an exploit known as “VM escape,” in which malicious code breaks out of a guest machine and infects the underlying host, as the dreaded worst-case scenario for virtualization. While there are few real-world reports of such attacks, white-hat researchers have presented proof-of-concept scenarios. The possibility gets even scarier when virtualization is running in the cloud, and applications belonging to multiple organizations can be victimized by a single compromise.

“All I need is one unpatched system,” says Dipto Chakravarty, vice president of engineering at Novell. “If I get around one machine that is unpatched in your data center, that's good enough to get in. Then, in seven keystrokes, you can bring a data center to its knees.”

Necessitating a culture shift

Realistically, however, poor management of VMs – involving disciplines such as configuration, auditing and training – poses the greatest risk, experts say.

Chris Hoff, director of cloud and virtualization solutions at Cisco, says a major drawback to virtualization is the very thing that makes it so appealing – its ease of use. Because virtual images can be spun up and torn down so quickly, security teams often are left in the dark. This can lead to an overpopulation of VMs, known as VM sprawl, which carries a multitude of negative consequences. For example, a VM built today with all the latest updates could be quickly turned off and forgotten, only to be switched back on months later, missing every patch issued in the interim.

“If I provision a whole bunch of virtual machines, I can turn them off and on in a matter of seconds,” Hoff says. “With virtualization, you can do all of that without involving the networking and security team.”
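
To make the sprawl problem concrete, here is a minimal sketch, assuming VMware's open-source pyVmomi Python SDK and hypothetical vCenter connection details (neither appears in this article), of an inventory check that lists powered-off VMs so their patch levels can be reviewed before anyone switches them back on.

    # Minimal sketch: flag dormant (powered-off) VMs in a vCenter inventory.
    # Assumes the pyVmomi SDK; host, user and password below are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    def list_powered_off_vms(host, user, pwd):
        """Return the names of all VMs that are currently powered off."""
        ctx = ssl._create_unverified_context()  # lab use only; verify certificates in production
        si = SmartConnect(host=host, user=user, pwd=pwd, sslContext=ctx)
        try:
            content = si.RetrieveContent()
            view = content.viewManager.CreateContainerView(
                content.rootFolder, [vim.VirtualMachine], True)
            # Powered-off images are the ones most likely to come back online
            # months later with stale patches.
            return [vm.name for vm in view.view
                    if vm.runtime.powerState == vim.VirtualMachinePowerState.poweredOff]
        finally:
            Disconnect(si)

    if __name__ == "__main__":
        for name in list_powered_off_vms("vcenter.example.local", "audit-ro", "secret"):
            print("Dormant VM (check patch level before power-on):", name)

A real deployment would feed that list into patch management or change control rather than printing it, but the point stands: the virtual inventory can be audited programmatically, just as the physical one is.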

Delray Beach's Buzzelli oversees an 11-member IT department, so there is little concern on his watch about a lack of collective security consciousness around virtualization. If someone installs a VM, chances are, everyone else is going to know about it. After all, a network engineer in the city doubles as its security officer.

Still, Delray Beach is taking the principle of least privilege seriously. “We gave partial administrator rights as needed to some employees to access virtual servers,” Buzzelli explains. “These administrator rights gave them the ability to work with a virtual server just as if it was a physical server, but no VMware administrative capabilities were given. We have a small number of employees in the IT division that have full administrator rights to the entire VMware environment.”
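
For readers curious what auditing that kind of delegation could look like, below is a minimal sketch, again assuming pyVmomi and placeholder credentials rather than anything Delray Beach actually runs, that lists custom vCenter roles carrying broad privileges so they can be reviewed against the principle of least privilege.

    # Minimal sketch: flag custom vCenter roles that carry broad privileges.
    # Assumes the pyVmomi SDK; the "broad" prefixes and credentials are illustrative only.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect

    BROAD_PRIVILEGE_PREFIXES = ("Global.", "Authorization.", "Host.Config.")

    def audit_roles(host, user, pwd):
        ctx = ssl._create_unverified_context()  # lab use only
        si = SmartConnect(host=host, user=user, pwd=pwd, sslContext=ctx)
        try:
            auth = si.RetrieveContent().authorizationManager
            for role in auth.roleList:
                broad = [p for p in role.privilege if p.startswith(BROAD_PRIVILEGE_PREFIXES)]
                if broad and not role.system:
                    # Custom roles with administrator-style privileges deserve a second look.
                    print("Review role '%s': %d broad privileges" % (role.name, len(broad)))
        finally:
            Disconnect(si)

    if __name__ == "__main__":
        audit_roles("vcenter.example.local", "audit-ro", "secret")  # hypothetical values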

The transition to virtualization has saved Delray Beach 40 percent on combined air conditioning and electrical costs, Buzzelli says. When all is said and done, all applications will be virtualized. The city already has migrated over a number of production applications, including email. Financial applications and the municipality's 911 call center are expected to be moved over in the next 12 to 24 months.

To protect these precious assets, the city is using virtual-aware security tools, such as anti-virus. But one of its key deployments is TriGeo Network Security for security information and event management (SIEM) of its virtual server environment.

“We use the product…to collect, respond if needed, and report unauthorized access attempts,” Buzzelli says. “This solution encompasses both the physical and virtual environments from firewalls to anti-virus, basically the entire network. TriGeo Security Information Management combines real-time log management with sophisticated active response rules that are proactive and have the ability to block, quarantine and control services, accounts and network privileges 24/7.” 
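
The product name aside, the underlying idea is easy to picture. The toy sketch below, which is not TriGeo's code and assumes a hypothetical log path, pattern and threshold, shows the kind of rule a SIEM automates at scale: count failed logins per source and raise an alert when a threshold is crossed.

    # Toy illustration of a SIEM-style rule: alert on repeated failed logins.
    # The log path, regular expression and threshold are all hypothetical.
    import re
    from collections import Counter

    FAILED_LOGIN = re.compile(r"Failed password for .* from (?P<ip>\d+\.\d+\.\d+\.\d+)")
    THRESHOLD = 5  # failed attempts from a single source before alerting

    def scan_auth_log(path="/var/log/auth.log"):
        failures = Counter()
        with open(path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = FAILED_LOGIN.search(line)
                if match:
                    failures[match.group("ip")] += 1
        return {ip: count for ip, count in failures.items() if count >= THRESHOLD}

    if __name__ == "__main__":
        for ip, count in scan_auth_log().items():
            # A production SIEM would correlate events from firewalls, hosts and VMs,
            # and could respond automatically (block, quarantine); here we only report.
            print("ALERT: %d failed logins from %s" % (count, ip))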

But while VMs are easier to manage than physical servers, they are saddled with risk because many internal policies have not been customized to meet the demands of the new technology, says Michael Maloof, CTO of Post Falls, Idaho-based TriGeo. Many times, organizations will fail to catch network deficiencies because virtualization falls “outside the scope of normal auditing mechanisms,” he says.

“We've identified a number of customer sites where they've had to put policies in place for starting, stopping, moving, copying and internal auditing [of VMs],” Maloof says. “It was entirely different when you were talking about having to pop out drives and move boxes around physically. When you can do something remotely, it has a lot more risk.”

Compliance, though, could change this mindset. The next version of the Payment Card Industry Data Security Standard (PCI DSS), due out in the fall, is expected to include guidelines around protecting cardholder data in virtualized environments, such as in the DMZ – an area that sits between the internet and an organization's internal network that commonly runs applications that need both public exposure and database connectivity.

As it stands now, the PCI rules make no mention of virtualization. But change is going to be necessary. For example, the guidelines state that only one primary function should be implemented per server, and that networks should be physically separated between these functions, Hoff says. This runs counter to the common virtualization framework, in which each host server runs many virtual operating systems – often without transparent separation and isolation.

And with desktop virtualization serving as the next wave of mainstream adoption – migrations to Windows 7 should accelerate the trend – compliance requirements, IT policies and vendor tools all appear to have some catching up to do.

Security offerings evolve

From a tools standpoint, help is on the way. Virtualization platform-makers, such as VMware, as well as security vendors are racing to push out technology to satisfy both compliance and risk management initiatives, in addition to the scalability, resiliency and performance requirements of next-generation data centers, Hoff says.

Organizations, of course, should protect their virtual assets much in the same way they do their traditional IT assets, say experts. That means running tried-and-true security solutions, such as anti-virus, firewalls and intrusion detection and prevention.

The problem with such an approach is that, because security is such a compute-intensive process, organizations risk losing the cost benefit of virtualization if they run separate security offerings for each VM, says Punit Minocha, senior vice president of data center security and cloud computing at Trend Micro. That is why the security company worked with virtualization-platform vendors to construct a single VM that provides all of the necessary security for the other VMs residing on a particular host machine, he says.

Shifting to more specific controls, experts recommend running solutions that manage the unique complexities of virtualization. These include sprawl analytics to keep track of VMs, tools to secure and manage virtual workloads and, on the governance front, offerings that allow for the separation of duties and that provide greater visibility into IT resources.

Of course, the biggest prize to protect is the mighty hypervisor, the core software layer that creates, controls and monitors the multiple VMs running on a host machine. Many security providers have created virtual-aware tools by leveraging VMsafe, a set of application programming interfaces that lets these vendors build products that integrate directly with VMware's hypervisor.

Back to school

Academic research also is underway. A North Carolina State University professor and his Ph.D. student have developed a prototype, known as HyperSafe, which prevents malicious code from being injected into the hypervisor, except by an administrator.

“HyperSafe is designed to block [attack code], even with the assumption that exploitable bugs are present in the hypervisor software,” says Xuxian Jiang, an assistant professor of computer science, who presented his research in May at a conference in Oakland, Calif.

Do not expect the prototype, which was funded by the U.S. Army Research Office and the National Science Foundation, to hit the shelves anytime soon. The software cannot be enabled without the source code of the hypervisor being altered by its maker.

“[However] our prototyping experience indicates that this process is relatively easy to accomplish,” Jiang says.

But as with any security risk, software cannot solve everything. Employing the combined framework of people, processes and technology leads to the most desired result. Delray Beach is no stranger to this mantra.

“We're very proactive here,” Buzzelli says. “It goes back to the organization. If you're lax on security, you're going to have issues. You have to do best practices.”


[sidebar]

VIRTUALIZING: STEPS TO TAKE

  • Configure virtual hosting platforms, guests (VMs) and networks properly.
  • Separate duties and deploy least privilege controls.
  • Integrate into lifecycle management.
  • Educate other groups, such as risk management and compliance.
  • Integrate with existing tools and policies.
  • Gain visibility across the enterprise.
  • Work with an “open ecosystem.”
  • Coordinate policies between VMs and network connections.
  • Consider the costs of virtualization, such as management and training.
  • Plan for user-installed VMs, a.k.a. desktop virtualization.

Source: SANS whitepaper, August 2009
