As school administrators consider ways to make technology more accessible to more students–and as broadband networks continue to play a larger role in the delivery of everyday instruction–information technology (IT) experts contend a new “game-changing” technology is poised to alter the paradigm of software administration and IT management in schools, shaking up a market notorious for its resistance to change.
At its most basic level, the concept, known as “software virtualization,” allows network administrators to run software applications previously dependent upon a specific hardware platform–such as Microsoft Windows or Apple Computer’s Macintosh operating system (OS)–on network-connected devices that do not support the native OS. In other words, a Windows-based machine might be able to support a software application designed specifically for use on an Apple computer or a Linux-based machine, and vice versa.
By “virtualizing” these software applications, network technicians also are able to beam them out across the entire network, cutting down on the time it would take to install copies of these software programs individually and giving technicians a means to conduct maintenance and troubleshooting from a single, centralized location.
Advocates of the movement say software virtualization solutions, or SVS, as they are commonly called, will increase the longevity of aging hardware devices, put an end to platform-specific software applications, and potentially shift the focus of the entire software industry from a licensing model to a more subscription-based approach–all of which would greatly simplify IT management.
Though the ability to beam individual software applications from a centralized server to an individual desktop is nothing new–companies such as Sun Microsystems have been pushing the benefits of these types of solutions in schools for years–network administrators say the process of software virtualization is revolutionary because it represents the first time users have been able to create fully functional “virtual” copies of both software applications and the operating systems on which these programs run.
The concept, though still in its infancy, has some technology engineers envisioning a day in the not-too-distant future when computers–desktop and laptop machines alike–will serve as simple conduits, or blank slates, with the indiscriminate ability to run any type of software application on any operating system, regardless of manufacturer.
“There’s really a battle royal brewing,” said Jeffrey Hibbard, vice president of marketing for Ardence Inc., a Massachusetts-based maker of software virtualization solutions. Hibbard says the idea that schools, corporations, and other users of enterprise-size computer networks eventually could send different software applications out across their systems to remote devices–regardless of platform, manufacturer, or operating system–will revolutionize the relationship between hardware and software applications.
Thanks to software virtualization, Hibbard says, the day will come when complex hardware devices will be seen more as simple household appliances–like television sets or stereo systems–with no software stored on the machine itself. Instead, all of the applications will be streamed on demand to the machine from a central location, or server farm, connected to the network.
“This is really such a game-changing technology, the internet is really going to kick into high gear,” said Hibbard. “This is kind of like the industrial revolution brought on by the steam engine.”
So what, exactly, is software virtualization? It is defined broadly by Brian Grammage of IT consulting firm Gartner Inc. as “the decoupling of different layers of the [software] stack so they are no longer dependent on the configuration specifics of the layer below.”
The term “stack” refers to the layered set of protocols, software, and hardware that together make a machine function: each layer sits atop the one below and depends on that lower layer in order to work properly. Decoupling the layers means a change to one no longer breaks everything built on top of it.
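The decoupling Grammage describes can be sketched in a few lines of code. This is a toy model, not any vendor's actual product: the class and function names are invented for illustration. Traditionally, an application is bound to the operating system it was built for; a virtualization layer removes that binding by supplying the environment the application expects.

```python
# Toy sketch (hypothetical names) of decoupling the software stack:
# an app built for one OS traditionally cannot run on another,
# but a virtualization layer supplies the environment it expects.

class App:
    def __init__(self, name, built_for):
        self.name = name
        self.built_for = built_for  # the OS the app was written against

def run_native(app, host_os):
    """The traditional, tightly coupled model: mismatches fail outright."""
    if app.built_for != host_os:
        raise RuntimeError(f"{app.name} needs {app.built_for}, host is {host_os}")
    return f"{app.name} running natively on {host_os}"

def run_virtualized(app, host_os):
    """The decoupled model: the host OS no longer matters to the app."""
    return f"{app.name} running on {host_os} via a virtual {app.built_for} layer"

word = App("WordProcessor", built_for="Windows")
print(run_virtualized(word, host_os="Linux"))
# prints: WordProcessor running on Linux via a virtual Windows layer
```

The point of the sketch is only that the dependency check disappears from the virtualized path; how the virtual layer is actually implemented is the hard part the vendors compete on.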
Despite the education market’s reputed resistance to change, software executives who spoke with eSchool News contend schools already are showing signs of a trend toward virtualization.
Ardence, for example, now markets a solution that locates multiple operating systems and applications on a centralized server or system of servers and then streams the applications across the network to the end user.
“Rather than embracing the complexity [of the interdependent relationship between software and hardware], we eliminate it,” Hibbard explained.
Traditionally, applications and OS data are run and stored at the level of the individual user. This means most software must be installed by an IT administrator at every single desktop, a task that often consumes countless work hours. The same can be said for troubleshooting. Viruses and malicious programs usually are introduced at the individual user level, too, Grammage said. Once a virus takes hold on one user’s system, he explained, it can infect systems across the entire network, causing serious problems for network administrators.
Even in typical thin-client solutions, where applications are stored and run at the server level and delivered to the desktop over a network, users still need a web browser to access them–which means each client machine still needs an operating system of its own.
In a thin-client model, Hibbard explained, “you build up a server farm at a central site, and distant schools get access to a server farm from a browser. Normally, in order to have that browser, you have to have an operating [system] on that machine. If you have servers to maintain, you still have to visit the desktop to service that device.”
Contrast that to Ardence’s approach, Hibbard said, where the PC running the Ardence solution operates like an appliance–such as a television set–with no other software stored on the machine itself, including the operating system. That means all management, troubleshooting, and upkeep truly can be done from a centralized location–the school system’s technology hub, for instance.
Christopher Fox, network administrator for the Bethel Park School District outside Pittsburgh, Pa., uses Ardence’s server solution.
Fox said it used to take his IT team several hours to reset PC configurations using Symantec Corp.’s Ghost software, which enables technicians to copy a PC’s configuration and install that same configuration on other machines. Technicians then can create a master image and distribute that image across the entire network.
But using Ghost software had its drawbacks, Fox said. For one, it could slow traffic on the network, depending upon bandwidth–imagine copying 2 gigabytes’ worth of information 100 times and sending that information across a network of several hundred users. Plus, the information from the Ghost program is actually stored on the local PC, meaning any problems that arise must be dealt with in person by an IT administrator.
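The bandwidth problem Fox describes is easy to quantify with back-of-the-envelope arithmetic. The image size and client count come from his example; the 100 Mbit/s shared link speed is an assumption added here for illustration, typical of school networks of that era.

```python
# Rough arithmetic for full-image deployment: pushing a disk image to
# every client moves the whole image once per machine, so network load
# grows linearly with the number of PCs.

image_gb = 2       # size of one master image, per Fox's example
clients = 100      # machines receiving the image, per Fox's example
link_mbps = 100    # assumed shared 100 Mbit/s link (hypothetical)

total_gb = image_gb * clients
transfer_hours = (total_gb * 8 * 1000) / link_mbps / 3600

print(f"{total_gb} GB total traffic")          # prints: 200 GB total traffic
print(f"~{transfer_hours:.1f} hours minimum")  # prints: ~4.4 hours minimum
```

Even under these ideal assumptions (no protocol overhead, no competing traffic), the deployment saturates the link for hours, which is why streaming a single shared image on demand is attractive.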
Now, with Ardence, Fox said, the same task can be done in 20 minutes.
“The PCs with Ardence are much more dependable than what we had previously,” he said. “Our trouble tickets are down 80 percent in the areas where we installed Ardence, compared with those same areas a year ago.” (See story: http://eschoolnews.com/news/showStory.cfm?ArticleID=5754.)
Another issue is compatibility. In the current networked environment, for example, one user might be running a Linux OS, and another could be running Mac OS X Tiger. But what happens when either or both of these users wants to run a software application built for the Windows platform? Under the traditional paradigm, where software applications are dependent upon an operating system, running that application across platforms would be difficult, if not impossible.
But proponents of software virtualization contend that’s no longer the case–not in their world. Software virtualization, they say, actually permits the scalable, cross-platform use of applications on a single machine.
One such product offered by Altiris Inc., a provider of IT lifecycle management solutions, reportedly permits an administrator to activate, deactivate, or reset desktop applications instantly, and to completely avoid conflicts between applications without affecting the base OS.
What this means, among other things, is that the Altiris virtualization solution can run applications that traditionally required different operating systems from a single machine simply by layering a version of the required OS on top of them in the software stack.
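One way to picture the layering idea behind application virtualization is as a stack of file-system overlays. The sketch below is a simplified toy model, not Altiris’s actual mechanism, and the file paths and app names are invented: each application lives in its own layer, and an application’s view of the machine is the base OS plus its own active layer, so two apps can each see a different version of the same shared file without conflict.

```python
# Toy model of application layering: each app's files live in a
# separate layer, merged over the base OS view only for that app,
# so conflicting versions of a shared file never collide.

base_os = {"C:/Windows/system.dll": "v1"}

app_layers = {
    "AppA": {"C:/Program Files/AppA/app.exe": "A", "C:/Windows/shared.dll": "v2"},
    "AppB": {"C:/Program Files/AppB/app.exe": "B", "C:/Windows/shared.dll": "v3"},
}
active = {"AppA", "AppB"}  # an admin can activate/deactivate layers instantly

def filesystem_view(app=None):
    """What the named app (or the bare OS, if app is None) sees."""
    view = dict(base_os)  # start from the untouched base OS
    if app in active:
        view.update(app_layers[app])  # merge in only this app's layer
    return view

print(filesystem_view("AppA")["C:/Windows/shared.dll"])  # prints: v2
print(filesystem_view("AppB")["C:/Windows/shared.dll"])  # prints: v3
print(filesystem_view()["C:/Windows/system.dll"])        # prints: v1
```

Deactivating an app is just removing its name from the active set: its files vanish from view instantly, and the base OS is never modified.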
“The solution is good for schools with labs that have a couple dozen computers, but it can also be used to quickly re-image hundreds of thousands of computers on the university level,” said Rhett Glauser, a spokesman for Altiris.
Rich Bentley, also from Altiris, said his company has another solution to protect against the network corruption that can result from an individual user introducing malicious files from his or her own personal machine onto the network. Called Protect, the solution works by virtualizing individual user sessions.
“Say I’ve got a machine in the school lab, a lab with a thousand machines. Different people are logging on and off those machines all the time. Protect virtualizes everything a user does,” explained Bentley. “After a session, the system wipes away what they’ve done and keeps it from getting corrupted. It does not emulate the entire OS, but it protects the network.”
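Session virtualization of the kind Bentley describes resembles a copy-on-write overlay: a user’s changes go to a throwaway layer, and logging out discards the layer, restoring the clean machine. The sketch below is a hypothetical illustration of that idea, not Protect’s actual implementation; the state keys are invented.

```python
# Toy copy-on-write session: user writes land in a disposable overlay,
# never in the base machine state, and logout wipes the overlay.

machine_state = {"hosts_file": "clean", "startup_apps": "none"}

class Session:
    def __init__(self, base):
        self.base = base
        self.overlay = {}          # all session changes accumulate here

    def write(self, key, value):
        self.overlay[key] = value  # the base state is never touched

    def read(self, key):
        # the user sees their own changes layered over the base
        return self.overlay.get(key, self.base[key])

    def logout(self):
        self.overlay.clear()       # the wipe: session changes vanish

s = Session(machine_state)
s.write("hosts_file", "tampered")
print(s.read("hosts_file"))  # prints: tampered (the user sees the change)
s.logout()
print(s.read("hosts_file"))  # prints: clean (next user gets a fresh machine)
```

As Bentley notes, nothing here emulates a full OS; the scheme only isolates and discards per-session changes, which is enough to keep one user’s corruption from persisting.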
Bill Washburn, system administrator for California State University, San Marcos, uses the Altiris system to manage approximately 80 applications running on more than 1,000 computers used by about 7,000 students and faculty. Washburn said his experience with the software has increased security and helped extend his IT budget.
“We’ve rolled out Altiris to help significantly improve application reliability, drive down application support costs, and better serve the constantly evolving software requirements of our faculty and students,” Washburn said.
“We operate on a tight budget and have limited resources to address application management,” he explained. “SVS allows us to immediately fill application requests and be comfortable that we won’t break existing apps or create new problems.”
Perhaps the biggest advantage, he said, is that “SVS allows me to turn applications on and off when I want to.”
The future of licensing
But even as the concept of software virtualization is gathering steam in schools, there remain several unanswered questions that could keep the trend from catching on as quickly as its advocates would like.
One concern, especially for software manufacturers, is the issue of licensing. Software companies such as Microsoft Corp. have made large fortunes from selling individual copies of licensed software applications to schools, businesses, and consumers. But what happens if the concept of virtualization suddenly allows a user to buy one copy of a piece of software and beam it out across the network to hundreds, even thousands, of individual users?
Analysts such as Gartner’s Grammage contend that software makers, including giants such as Microsoft, will have no choice but to adjust their licensing structure to meet the demands of a changing marketplace.
“Licensing software generally is a broken business model at the moment,” said Grammage. “Controlling licensing costs is a thorn in the side of most organizations. Machine-level, sealed, and well-managed footprints [of the given software] can generally be located on a target PC.”
Still, Grammage says, the licensing issue is bound to cause confusion in the short term. However, if software virtualization continues to catch on as expected, he added, a different pricing model will develop over time.
An even larger question for Microsoft, which has built an entire empire around its desktop OS monopoly, is this: What impact will software virtualization–which frees applications from dependence on a specific OS–have on the company’s future?
An indication of how concerned Microsoft is with this question might be the company’s purchase of Virtual PC, a virtualization technology that works at the level of the operating system, from Connectix Corp. nearly three years ago (see story: http://www.eschoolnews.com/news/showStory.cfm?ArticleID=4297). Some view the move as a way for Microsoft to claim some sort of stake in the software virtualization market–before it renders Microsoft’s Windows irrelevant at the user level.
A spokesperson for Microsoft declined to say whether the company has any plans to change its current licensing structure in response to the rise of software virtualization, though the spokesperson did tell an eSchool News reporter the company is committed to its own virtualization product, Virtual PC.
Virtual PC is a client-based software virtualization application that allows customers to run multiple operating systems simultaneously on a single PC. Microsoft Virtual Server permits the same kind of operating-system interchangeability at the server level, allowing users to run multiple operating systems across a network.
“Beyond the release of Microsoft Virtual PC 2004,” the spokesperson said, “it is too early to discuss how Microsoft might deliver future virtualization software solutions.”
In press materials provided by Microsoft to more fully address the subject of virtualization, Brent Callinicos, Microsoft’s corporate vice president for worldwide licensing and pricing, said the company is making a number of changes to bring licensing policies in line with IT administration practices that are the result of virtualization.
“We are licensing by running instance, which is to say the number of images, installations, and/or copies of the original software stored on a local or storage network,” Callinicos wrote. “Instead of licensing every inactive or stored virtual instance of a Windows Server System product, customers can now create and store an unlimited number of instances, including those for back-up and recovery, and only pay for the maximum number of running instances at any given time.”
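The “maximum number of running instances at any given time” metric Callinicos describes amounts to peak concurrency over a set of start/stop times. A worked example with invented numbers: suppose a school stores five virtual instances but runs them on the schedule below; it would license only the peak of three, not all five.

```python
# Worked example (hypothetical schedule) of per-running-instance licensing:
# stored images are free; only the peak simultaneous count is licensed.

# (start_hour, stop_hour) for each of 5 stored virtual instances in one day
instances = [(0, 8), (6, 12), (7, 20), (9, 18), (22, 24)]

def peak_running(spans):
    """Sweep over start/stop events and track the running-instance high-water mark."""
    events = []
    for start, stop in spans:
        events.append((start, +1))   # instance comes up
        events.append((stop, -1))    # instance shuts down
    running = peak = 0
    # sort stops before starts at the same hour, so back-to-back
    # instances are not double-counted
    for _, delta in sorted(events, key=lambda e: (e[0], e[1])):
        running += delta
        peak = max(peak, running)
    return peak

print(peak_running(instances))  # prints: 3 -- licenses for 3, not 5
```

Three instances overlap between hours 7 and 8 (and again mid-morning), so three licenses cover the whole day under this model even though five images sit in storage.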
Callinicos said Microsoft also is looking for ways to provide easier software deployment across servers.
“Customers can now move active instances from one licensed server to another licensed server without limitation, as long as the physical server is licensed for that same product,” he wrote in the document.
“Customers can now stack multiple virtual instances on a machine by licensing for the number of virtual processors being used, rather than for all of the physical processors on the server,” he continued.
Regardless of what Microsoft and other software companies intend to do, Ardence’s Hibbard said he believes the market for virtualization solutions will only grow in coming years.
“The economics change so dramatically when the PC becomes an appliance, it creates new opportunities. We believe there are new competitors on the horizon for our technology,” he said.