Jul 26 2007

Server Virtualization Success Hinges on Managing Mixed Environments

Balancing physical and virtual environments requires new tools and management techniques.

Server virtualization — the popular technique of running multiple operating system and application combinations on a single piece of hardware — can leave IT managers in a technology nether world.

Virtualization works fine for some applications, such as office automation programs that go for long periods without straining a central processing unit, and thus can share resources with similarly low-demand programs.

But other, transaction-intense applications, such as database servers that constantly require significant processing power, perform better when they have CPUs and memory reserved for themselves on a single, physical machine.

The result: IT departments often cope with two types of environments — the virtual and the physical, as well as the managerial problems this duality creates. “Dual environments are the norm for most organizations that have turned to some degree to virtualization,” says Andi Mann, senior analyst with Enterprise Management Associates. “No doubt about it, this makes management more difficult.”

In addition to making decisions about which applications are right for virtualization and which ones aren’t, IT managers also must grapple with load-balancing both types of servers, a task that often requires separate if somewhat redundant management utilities. Securing the two platforms also requires slight variations and new skills for the IT staff.

In the end, virtualization may solve more problems than it creates, but agencies shouldn’t be blind to the new challenges. “When people think of virtualization, they think there’s some kind of smoke and mirrors involved,” says Steve Ingersoll, systems administrator with the Nevada State Health Division (NSHD). “But whether you have 100 gigabytes’ worth of an operating system and applications on a hard drive or whether it’s on a virtualized server partition, you still have 100 GB that needs some kind of management.”

Navigation Issues

In the three years that the NSHD has been virtualizing some of its servers, Ingersoll has become well versed in the challenges of navigating the virtual and physical worlds.

Ingersoll’s staff once disagreed with the vendor of an electronic death registry application when the NSHD wanted to break up the program between virtual and physical servers.

“The vendor said, ‘That won’t work. We have one environment, and that’s the way you need to do it,’ ” Ingersoll recalls.

But after getting validation from an outside consultant, the NSHD proceeded with its dual-environment plan and has been pleased with the results. “We saved a considerable amount of server space and resources,” Ingersoll says. “And in the 12 months or so the application has been live, it’s worked extremely well in the virtualized environment.”

Software vendors aren’t the only skeptics. The NSHD received similar resistance from the Centers for Disease Control when the division sought to virtualize a CDC public health preparedness application that normally requires five separate devices for the application server, database management system, portal, and other components. The CDC balked at certifying the implementation, but Ingersoll says virtualization not only works, the application actually performs faster than it did on five physical servers. He says examples like these show that states and other public-sector organizations “are still trying to get on board when it comes to the virtual-server merry-go-round.”

Top Concerns

A recent study by Enterprise Management Associates found two main stress points. Security management requires strategies for protecting against vulnerabilities unique to virtualization. In addition to using intrusion detection systems, firewalls, and other traditional tactics to harden physical servers, managers of virtual servers need to protect against unauthorized access to the hypervisor, the software that coordinates interactions between the operating systems and hardware resources in virtual environments.

“Hypervisor runs above the hardware and underneath the operating system,” Mann warns. “If someone gains access to the hypervisor, then conceivably they have access to all the virtualized servers as well as to the network and to storage devices.”

One answer is sHype, originally developed by IBM researchers and later contributed to the open source Xen virtualization platform. The security tool uses mandatory access controls to enforce rules for sharing resources in a virtual environment.
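
In concept, mandatory access control works by labeling every virtual machine and every shared resource and permitting a pairing only when policy allows it. The short Python sketch below illustrates that rule-checking idea in the abstract; the labels, policy and function names are hypothetical and do not reflect sHype’s actual policy format or enforcement, which lives inside the hypervisor itself.

    # Illustrative-only sketch of mandatory access control over VM resource
    # sharing. Labels and rules are hypothetical; sHype/Xen enforce their own
    # policies inside the hypervisor, not in guest-level code like this.
    VM_LABELS = {"payroll-vm": "restricted", "webfarm-vm": "public"}
    RESOURCE_LABELS = {"shared-disk-01": "restricted", "vlan-10": "public"}

    # Which resource labels each VM label may share.
    ALLOWED = {"restricted": {"restricted"}, "public": {"public"}}

    def may_attach(vm: str, resource: str) -> bool:
        """Return True only if policy lets this VM attach to this resource."""
        return RESOURCE_LABELS[resource] in ALLOWED[VM_LABELS[vm]]

    print(may_attach("payroll-vm", "vlan-10"))         # False: labels conflict
    print(may_attach("payroll-vm", "shared-disk-01"))  # True: labels match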

A second management challenge for virtualization projects is resource utilization. Usage profiles help organizations answer basic questions about what applications are best for shared implementations. “How many servers should I virtualize? How many can I put on one blade in a blade system? Where am I running below my CPU peak load? The only way to understand your utilization patterns is to go out and study them,” Ingersoll says. “But most states don’t spend very much in the way of resources and time trying to understand what kind of loads they have on their servers — unless they have a problem.”
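
As a rough illustration of what such a study involves, the hypothetical Python sketch below samples CPU and memory on a server over a window and reports average and peak figures, so consistently low-demand machines can be flagged as consolidation candidates. The psutil library, the sampling window and the 30 percent threshold are assumptions made for the example, not tools or rules cited by Ingersoll.

    # Hypothetical utilization-profiling sketch: sample CPU and memory over a
    # window, then report average and peak so lightly loaded servers can be
    # flagged as virtualization candidates.
    import psutil

    def profile(samples: int = 60, interval: float = 1.0) -> dict:
        cpu, mem = [], []
        for _ in range(samples):
            cpu.append(psutil.cpu_percent(interval=interval))  # % busy across CPUs
            mem.append(psutil.virtual_memory().percent)        # % of RAM in use
        return {"cpu_avg": sum(cpu) / len(cpu), "cpu_peak": max(cpu),
                "mem_avg": sum(mem) / len(mem), "mem_peak": max(mem)}

    stats = profile()
    if stats["cpu_peak"] < 30:  # illustrative rule of thumb only
        print("Likely consolidation candidate:", stats)
    else:
        print("Keep dedicated, or keep studying:", stats)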

To perform this task, agencies may need a mix of tools for measuring CPU load, memory utilization and data throughput among their virtual and physical servers to assure accurate results, Mann says. “If you’ve got 100 percent CPU utilization in a virtual machine, that doesn’t always translate to 100 percent in a physical machine, so sometimes you can get misleading data on performance if you haven’t got the right tools,” he says.
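
The arithmetic behind that caution can be sketched simply: a guest pegged at 100 percent is only consuming the slice of the host it has been allocated. The calculation below is a simplified, hypothetical conversion; real hypervisor schedulers also account for overcommitment, reservations and overhead.

    # Hypothetical conversion of guest-reported CPU utilization into a share
    # of the physical host. Real schedulers are more complicated, so treat
    # this strictly as illustration.
    def host_share(guest_cpu_pct: float, vcpus: int, host_cores: int) -> float:
        """Approximate percentage of the physical host consumed by one guest."""
        return guest_cpu_pct * vcpus / host_cores

    # A 2-vCPU guest running flat out on an 8-core host uses roughly 25% of the box.
    print(host_share(100.0, vcpus=2, host_cores=8))  # 25.0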

One approach is to combine traditional management applications for physical hardware with corresponding utilities from virtualization technology providers such as VMware and Microsoft. Mann hopes the x86 virtualization market will eventually offer tools that integrate physical and virtual management controls within a single management interface.

In the meantime, the city of Fontana, Calif., will continue to run overlapping tools as it progresses with its plan to consolidate servers using virtualization. “Our virtual servers will have Simple Network Management Protocol loaded on them for some management capabilities, but we will likely use a package from either our hardware or virtualization software vendor that’s specifically designed to monitor virtual environments,” says Chris Beck, network administrator.
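
As a rough sketch of what SNMP polling of those servers could look like, the Python fragment below shells out to the net-snmp snmpget command to read the standard HOST-RESOURCES-MIB per-processor load value. The host name, community string and device index are placeholders; Fontana’s actual monitoring stack will differ.

    # Hypothetical SNMP poll using net-snmp's snmpget and the standard
    # HOST-RESOURCES-MIB hrProcessorLoad column. Host, community string and
    # index are placeholders, not values from the article.
    import subprocess

    HOST = "vm-host-01.example.gov"
    COMMUNITY = "public"
    HR_PROCESSOR_LOAD = "1.3.6.1.2.1.25.3.3.1.2"  # hrProcessorLoad, one row per CPU

    def poll_processor_load(index: int = 1) -> str:
        result = subprocess.run(
            ["snmpget", "-v2c", "-c", COMMUNITY, "-Ovq",
             HOST, f"{HR_PROCESSOR_LOAD}.{index}"],
            capture_output=True, text=True, check=True)
        return result.stdout.strip()  # percent non-idle over the last minute

    print("CPU load:", poll_processor_load())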

Central Virtual Control

NSHD’s Ingersoll manages the division’s virtual environment with a virtualization interface (VI) from VMware. “If you are going to run an enterprise-class shop — I don’t care if you have two servers or you have 2,000 servers — you need a virtualization interface to administer all of those,” he says.

He gives high marks to his current VI, citing easy-to-use tools for analyzing virtual-server performance loads and utilization rates from within a central management console. “We can look at all of our virtual servers and make any necessary changes on each of them no matter where they sit.” The centralized management capabilities are an improvement over previous versions of the management tool that required manual intervention when problems arose. Ingersoll didn’t consider third-party tools for virtualization management because of what he feels are the reliability and economic benefits of relying on a single vendor. “Why pay for a separate product?” he reasons. “And what if [the third-party tool] and the virtualization software aren’t interoperable?”

For his physical servers, Ingersoll uses the Linux management program that comes with his Red Hat Linux operating system or Microsoft’s Windows Management Server for his Windows servers.

Ingersoll offers one other piece of managerial advice. “People need to understand that virtualization is not hard and scary. …Our experience is it’s a good choice if you are trying to save money, resources or administrative time,” he says.

Virtualization on the Rise

More and more public-sector agencies will be turning to server virtualization in the months ahead, according to industry research. A buying-trend survey at a recent public sector trade show found that 70 percent of the respondents expected to launch virtualization projects in the next year. Efficiency and cutting costs were cited as the main catalysts.

Researchers such as International Data Corporation say virtualization can obviate the problem of having expensive computing capacity sit idle for long periods between demand peaks. IDC estimates that servers in most large organizations typically operate at only 10 percent or less of their rated CPU capacity.

Consolidating applications and operating systems into single boxes or blade servers reduces this waste and can help organizations achieve utilization rates of 40 percent or higher to get more from their hardware investments, according to some studies. An added cost benefit results, because fewer physical servers mean reduced energy costs and less real estate needed for data centers.
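
A back-of-the-envelope calculation shows how that math works out; the server count below is hypothetical, and the utilization figures simply reuse the 10 percent and 40 percent numbers cited above.

    # Hypothetical consolidation arithmetic using the utilization figures above.
    import math

    physical_servers = 100      # existing one-application-per-box servers
    avg_utilization = 0.10      # ~10% of rated CPU capacity (IDC estimate)
    target_utilization = 0.40   # post-consolidation goal cited in studies

    # Total demand, expressed in "fully busy server" equivalents.
    demand = physical_servers * avg_utilization            # 10 servers' worth of work
    hosts_needed = math.ceil(demand / target_utilization)  # 25 consolidated hosts

    print(f"{physical_servers} lightly loaded servers -> roughly {hosts_needed} "
          f"hosts running near {target_utilization:.0%} utilization")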

Virtual Servers Yield Tangible Benefits

When managed correctly, virtual servers can benefit organizations in the following ways:

  • Increase in hardware utilization rates: 50 to 70 percent
  • Decrease in capital costs for hardware and software: 40 percent
  • Decrease in operations costs: 50 to 70 percent
  • Increase in server-to-administrator ratios: from 10:1 (physical servers) to 30:1 (virtual servers)