Oct 31 2006

Tech Focus

It's no longer an arcane mainframe strategy, but this resource-pooling strategy still requires care to succeed.

Virtualization Goes Mainstream
This resource-pooling strategy can lower costs and increase efficiency for government technology departments. However, virtualization requires careful handling in order to achieve success.

The state of Michigan is nothing if not ambitious. The state is revamping IT systems so large and complex that any one of them could keep an IT shop busy for a year. But Michigan is simultaneously tackling six of them, ranging from a rewrite of the Health and Human Services management system to revisions of the applications for Medicaid eligibility, tax collection and Department of Motor Vehicles registrations.

“There’s a plethora of very large-scale, high-dollar, high-impact projects, each affecting mission-critical systems,” says Patrick Hale, director of data center and technical services. With so much change, it’s essential for the state to create what Hale calls “highly dynamic” development and test environments to support the efforts.

And that’s where Michigan has a secret weapon. Hale’s team uses development and test servers created with virtualization software, a rapidly growing strategy borrowed from mainframes. It spurns the classic architecture of pairing one server with one operating system and instead divides up and dynamically apportions a machine’s computing resources, including the central processing unit (CPU) and memory.

Done right, a single “virtualized” server can run multiple operating systems and applications side by side, so that users may believe they have an entire computer dedicated to their applications when, in fact, they may be sharing the box’s capabilities with many other people.
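To make the resource-pooling idea concrete, here is a toy sketch in Python. The guest names and resource figures are invented for illustration, and real hypervisors schedule far more elaborately; this models only the basic arithmetic of carving one box into several apparent machines.

from dataclasses import dataclass

@dataclass
class Guest:
    name: str
    cpu_shares: int   # relative weight, not a fixed core count
    memory_mb: int    # hard memory reservation

HOST_CPUS = 8             # illustrative host
HOST_MEMORY_MB = 16_384

guests = [
    Guest("mail-test", cpu_shares=2, memory_mb=4_096),
    Guest("medicaid-dev", cpu_shares=4, memory_mb=8_192),
    Guest("dmv-dev", cpu_shares=2, memory_mb=2_048),
]

# Memory is reserved outright; CPU is apportioned dynamically by weight,
# so an idle guest's share flows to the busy ones.
assert sum(g.memory_mb for g in guests) <= HOST_MEMORY_MB
total_shares = sum(g.cpu_shares for g in guests)
for g in guests:
    print(f"{g.name}: up to {HOST_CPUS * g.cpu_shares / total_shares:.1f} "
          f"CPUs under full contention, {g.memory_mb} MB reserved")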

Virtualization allows Michigan to create a test center that mimics its production environment in size and complexity. “We can see how the rewritten systems will perform and be assured that the lights aren’t going to dim when we put them into production,” Hale says.

ADOPTION SPIKE

Virtualization is no longer a niche tool just for developers and mainframe jockeys. A recent survey by Yankee Group, a Boston-based research firm, concluded that virtualization has become “a mainstream technology,” with three out of four businesses surveyed saying they’re using or planning to implement the strategy. “It’s become abundantly clear why users have embraced this technology — it’s easy to deploy and [organizations] can see immediate business and cost-saving benefits,” says Laura DiDio, a Yankee research fellow and co-author of the study.

Sales figures bear this out. The virtualization market grew nearly 70 percent last year, to more than $550 million, according to IDC, a market researcher in Framingham, Mass.

The technology is gaining in popularity because it can lower costs and increase IT efficiency in a number of areas. Delaware is using virtualization as part of a server consolidation effort that moved 30 departmental applications formerly run on individual servers into a new computer center built with compact blade servers.

Virtualization allows the state to run everything within the single blade center, which required an up-front investment of about $250,000. Dedicating three full-blown servers to each application would have cost about $360,000, or 44 percent more, estimates William Hickox, Delaware’s chief operating officer.

Virtualization can also help organizations get the most from their existing hardware investments. IDC estimates that many Linux and Windows servers run at an average of less than 15 percent of their CPU capacities, except during peak demand periods. That leaves most of the processor capacity idle much of the time. “Virtualizing” multiple applications and operating systems onto single servers may boost utilization rates to 30 percent, 40 percent or more, says John Humphreys, IDC’s program director for enterprise virtualization software.
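The arithmetic behind those figures is easy to sketch. Below, a back-of-the-envelope Python calculation using an assumed 12 percent average per-server utilization, in line with IDC’s sub-15 percent finding; real sizing must also account for peaks, memory and I/O.

AVG_UTIL = 0.12   # assumed per-server average, consistent with IDC's <15% figure

# Stacking k lightly used workloads on one comparable host multiplies the
# host's average utilization accordingly.
for k in range(1, 6):
    print(f"{k} workload(s) on one host: ~{k * AVG_UTIL:.0%} average CPU utilization")

Three such workloads land at roughly 36 percent and five at 60 percent, consistent with the 30-to-40-percent-or-more range Humphreys describes.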

Having more-compact data centers is another virtualization plus. Kane County, Ill., now uses four virtualized servers to handle what used to run on 45 separate machines. In the process, it shrank its 2,000-square-foot data center to 400 square feet. This allowed the county to replace two 80-ton air conditioners with one 40-ton unit. “Our AC reduction alone saved us almost a half a million bucks,” says David Siles, chief technical officer (CTO).

Nevada is using a virtualization pilot project to help it gauge potential reductions in IT administration demands. “The long-term goal is to move environments like the DNS [Domain Name System] and Active Directory server to a virtualized environment because it will save us from having to worry about hardware over the long term,” says Shawn Curby, manager of Internet services and servers for the state. “Down the road when we need to replace the hardware, we’ll set up the virtual instance on the new equipment, and we won’t have to rebuild the operating system or do any reconfiguration,” he explains.

Curby anticipates additional savings from needing fewer Microsoft Windows licenses, since Microsoft’s terms allow up to five operating system instances per virtualized environment.

Added together, these financial benefits can turn into fast returns on investment. “We’ve seen folks who’ve dramatically lowered operational costs, so that a $300,000 investment returns $3 million to the organization over a three-year period,” IDC’s Humphreys says.

RED FLAGS

Despite its potential benefits, there’s no guarantee of virtualization success. IT officials preach prudence before implementing a wholesale virtualization effort.

First, IT managers need to determine which servers and applications are good virtualization candidates. Chris Marroquin, IT manager for the Michigan Department of Information Technology, calls servers with low CPU utilization “low-hanging fruit.” However, he adds that large, resource-hungry applications, such as a Microsoft Exchange server with 10,000 users, need too much dedicated power to run in a shared virtualization environment.

Utilization-monitoring tools provided by virtualization software vendors can help with CPU evaluations, but they’re not essential. Kane County’s Siles says he monitored his servers for a couple of months using Simple Network Management Protocol (SNMP). “We spent a good amount of time baselining our environment,” he says. “You want to look for a candidate that’s using its processor between 5 percent and 25 percent during an average load and an application that’s generating moderate to low network traffic and disk [input and output].”
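A minimal sketch of that baselining approach in Python, assuming the net-snmp snmpget command-line tool is installed and each candidate server runs an SNMP agent exposing the UCD-SNMP MIB. The hostnames, community string and one-hour sampling window are placeholders; a real baseline would run for weeks, per Siles, and also track disk I/O and network traffic.

import subprocess
import time

SS_CPU_IDLE = ".1.3.6.1.4.1.2021.11.11.0"    # UCD-SNMP ssCpuIdle: percent CPU idle
HOSTS = ["app-server-1", "app-server-2"]     # placeholder hostnames
COMMUNITY = "public"                         # placeholder community string
SAMPLES, INTERVAL_SEC = 12, 300              # one hour at five-minute intervals

def cpu_busy_percent(host):
    # -Oqv prints just the value, so the output parses cleanly
    out = subprocess.check_output(
        ["snmpget", "-v2c", "-c", COMMUNITY, "-Oqv", host, SS_CPU_IDLE],
        text=True)
    return 100.0 - float(out.strip())

readings = {h: [] for h in HOSTS}
for _ in range(SAMPLES):
    for h in HOSTS:
        readings[h].append(cpu_busy_percent(h))
    time.sleep(INTERVAL_SEC)

for h, vals in readings.items():
    avg, peak = sum(vals) / len(vals), max(vals)
    verdict = "candidate" if 5 <= avg <= 25 else "look closer"
    print(f"{h}: avg {avg:.0f}%, peak {peak:.0f}% -> {verdict}")

The 5-to-25-percent test mirrors the average-load range Siles cites.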

IDC’s Humphreys warns IT managers to be on guard for “server huggers,” department heads who insist on having their own physical hardware to host their applications. Efficiency can win over these skeptics. “They find that with virtual machines, they can get their needs met much quicker,” he says. “So instead of it taking a month to get that physical host up and running for them, in a lot of cases, the IT department can do it in an afternoon.”

The flipside of server consolidation is increased dependency on a smaller number of boxes. Uptime for those machines becomes critical. “We are acutely aware of that fact after a major migration of our messaging systems onto a single enterprise environment,” says Michigan’s Hale. Before, he adds, “If we lost the mail server, 200 or 300 users would be affected. Today, if we lose one of our centralized mail servers, thousands of people are affected.”

System redundancy becomes an essential safety valve. “The more critical the functions, the more redundancy we need,” Hale says. Michigan doesn’t think in terms of single virtualization hosts, but instead envisions virtualization farms. “For every function that we add to our virtualization efforts, we ask ourselves, ‘What’s the disaster recovery plan?’”

Finally, virtualization forces IT managers to rethink some of their practices. “[Virtualization] adds a layer of complexity to your environment that you have to be prepared to support and maintain,” Hale says.

That’s why Kane County moved slowly, first with pilot projects and then with “rolling upgrades,” says CTO Siles. As his department moved an application to a virtual server, the existing server and application combination remained available. “We’d bring the application over to the virtual environment and then route the traffic to the virtual server,” he explains. “If it didn’t work, we always had that physical server. Instead of trying to do everything in a weekend, we did it over the course of about six months to make sure our staff could administer it.”

IT TAKEAWAYS

VIRTUALIZATION BENEFITS

Fewer servers needed to run many core applications.

Higher CPU utilization rates of servers, from typical levels of about 10 percent to 40 percent or more of rated capacity.

Physical consolidation of data centers, reducing air-conditioning bills and freeing up office space.

Lower maintenance burdens when moving existing applications to replacement servers.

Fewer licenses for software sold on a per-CPU basis.

VIRTUALIZATION TRAPS

Run performance baselines to determine which server CPUs are underutilized, making them good candidates for virtualization. Servers running demanding applications that need heavy CPU and memory resources aren’t likely to run well in a shared environment.

Prepare to tangle with “server huggers” who need to be persuaded that virtualized servers can be more efficient than traditional, dedicated servers.

Budget for system redundancy, since running multiple systems on a small number of servers boosts uptime requirements.

Implement virtualization slowly to give IT staff time to adjust.

Virtualization Isn’t Just for Servers

Dynamically segmenting computing resources to boost efficiency and flexibility isn’t a technique that applies only to servers. IT managers in Kane County, Ill., are applying a virtualization approach to servers, storage systems and desktop applications.

Virtualized storage creates one large pool of available resources so IT managers can quickly allocate space based on the changing needs of end users. Similarly, a virtualized desktop strategy networks stripped-down, diskless computers to virtualized servers in the data center, which handle the actual processing and storage. David Siles, chief technical officer, estimates he can run as many as 35 of these desktop devices from one four-processor server.
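The storage half of that strategy amounts to one shared pool that hands out and reclaims capacity on demand. A toy Python model of the idea follows; it is purely illustrative (real storage virtualization lives in the array or SAN software, and the department names and sizes here are invented).

class StoragePool:
    """Capacity lives in one pool instead of being locked to individual machines."""

    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.allocations = {}   # user -> GB currently assigned

    @property
    def free_gb(self):
        return self.capacity_gb - sum(self.allocations.values())

    def allocate(self, user, gb):
        if gb > self.free_gb:
            raise RuntimeError(f"only {self.free_gb} GB free")
        self.allocations[user] = self.allocations.get(user, 0) + gb

    def release(self, user):
        self.allocations.pop(user, None)   # space returns to the pool

pool = StoragePool(capacity_gb=2_000)
pool.allocate("health-dept", 500)
pool.allocate("assessor", 300)
pool.release("assessor")                   # reclaimed space is immediately reusable
print(f"{pool.free_gb} GB free")           # -> 1500 GB free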

“I always thought that if we could run desktops in a virtualization environment, the cost savings would compound pretty quickly,” he says. “To me it’s no different than doing the server side of it. The only difference is when people are accessing an application or a service on the server, they are using terminal emulation.”

So far, the approach is working. About 200 desktop systems are scheduled for a tech refresh next year, but Siles plans to replace fewer than half of them. The remaining users will receive thin clients costing only a couple of hundred dollars each. The economy units consist of a monitor with a small convection-cooled, diskless computer strapped to the back. “There are no moving parts, and the lifecycle goes from two to three years down to seven to 10 years,” he says. “They take three watts of power, so just in terms of energy, I’m exponentially saving all that kilowatt power, compared to desktops with 400-watt power supplies.”
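The energy arithmetic behind that quote is simple to check. The wattages are the ones Siles cites; the seat count, around-the-clock duty cycle and electricity rate below are illustrative assumptions.

DESKTOP_W, THIN_CLIENT_W = 400, 3     # figures cited by Siles
SEATS = 100                           # illustrative thin-client share of the refresh
HOURS_PER_YEAR = 24 * 365             # assumes machines are never powered off
RATE_PER_KWH = 0.08                   # illustrative electricity rate, dollars

saved_kwh = SEATS * (DESKTOP_W - THIN_CLIENT_W) * HOURS_PER_YEAR / 1000
print(f"~{saved_kwh:,.0f} kWh saved per year, "
      f"about ${saved_kwh * RATE_PER_KWH:,.0f} at ${RATE_PER_KWH}/kWh")

The savings are linear rather than literally exponential, but at better than 130-to-1 per machine they add up quickly.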

The virtual desktops will also help the county meet regulatory requirements, especially the privacy rules of the Health Insurance Portability and Accountability Act (HIPAA). With the old decentralized system, some health-care data resided on central servers, while other files sat on desktop hard drives. The multiple storage locations made it difficult for managers to ensure backups happened according to HIPAA rules. Eliminating local storage eases that burden, Siles believes. “I’m not only backing up all the data, I’m also implementing access control for privacy,” he says. “Now I can say, ‘It’s 5 o’clock, and this desktop can no longer access the data on this server.’”

Alan Joch is a technology writer based in New Hampshire.

You can find this issue’s stories, plus past StateTech editorial content, at STATETECHMAG.com.
