Jul 07 2008
Networking

Circling Back to Centralized Computing


It wasn’t so long ago that the mainframe ruled IT. Most of our computing work was done by big iron hooked to dumb terminals. Centralized computing was the standard because networks capable of sharing processing between personal computers and the mainframe had yet to evolve.

As the microprocessor and networks matured, distributed client/server computing became the de facto standard to support and empower a greater number of end users. Client/server systems relied heavily on the network to communicate requests and also made it possible to run a different operating system on the PC than on the server.

The client/server model has dominated for the past 15 years, but today the pendulum is swinging back to centralized computing and consolidation, for three reasons: cost, security and manageability.

Cost is a perennial concern, and security issues have probably never been more critical. The trend toward centralization, however, is being driven most by system manageability.

Consider this: Centralized data stores are much easier to manage, which in turn makes it simpler to share data across the enterprise and with other organizations. That, combined with cost and security benefits, is the true driver in the move toward greater centralization.

“When things are going well, nobody’s necessarily thinking about how to do things better,” says Alan Shark, executive director of the Public Technology Institute in Washington, D.C. “But we now have cities and counties (and even some states) trying to figure out if it would make sense to share their systems with somebody else or set up sort of a ‘consortium of equals’ in a central data facility.”

The trend is widespread and growing. One advantage of the centralized approach over a client/server model is stronger security: Data remains at the central facility and is not out in the field where it could be more easily accessed and compromised.

But it’s easy management, combined with lower cost, that really makes centralization so attractive. Hosting applications from a central location reduces management complexity and lowers the total cost of ownership.

The city of Manor, Texas, is rolling out Wyse Technology thin clients in the city’s police cars to gain data-management advantages that can be realized only through central computing. “When our officers enter information on the thin clients, they’re actually entering it directly onto our network — onto our server — in real time,” says CIO Dustin Haisler. “That’s just a great benefit.”

Centralization isn’t a fad. PCs have traditionally put plenty of power on the desktop, but the downside of that prolific deployment is the creation of data silos.

IDC forecasts that the worldwide thin client market will grow from 3.6 million units shipped in 2008 to just over 7 million units shipped in 2012, a compound annual growth rate of 18.3 percent.
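As a quick sanity check on that projection, the compound annual growth rate can be verified directly; this is a minimal sketch that assumes IDC’s “just over 7 million” translates to roughly 7.05 million units in 2012.

```python
# Quick check of the IDC thin client forecast cited above.
# Assumption: "just over 7 million" units in 2012 is taken as ~7.05 million.
start_units = 3.6e6   # units shipped in 2008
end_units = 7.05e6    # assumed units shipped in 2012
years = 2012 - 2008   # four growth periods

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end_units / start_units) ** (1 / years) - 1
print(f"Compound annual growth rate: {cagr:.1%}")  # prints ~18.3%
```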

“There are IT standards we’ve always had, in any industry,” says Steve Jennings, CIO for Harris County, Texas. “No. 1 is cost. No. 2 is management. No. 3 is security. No. 4 is availability. And I don’t care whether you’re wired or wireless: A centralized approach will give you, probably, the best in those four areas. Overall, we can end up with the best of both worlds, control and security, but with the user retaining a full feeling of independence and flexibility.”

Centralized IT management makes sense from a number of perspectives. As we go forward, we can anticipate technological developments that make thin clients and central computing more and more attractive to the multifaceted organizational needs of the 21st century.

Jim Shanks is executive vice president and former CIO of CDW.
