What the heck is Virtualization 3.0?

The special report in the May 15, 2008, issue of SD Times is entitled “Virtualization 3.0: Forget server consolidation; virtualization has the buzz and new benefits for developers and QA teams.”

Virtualization 3.0 is a brand-new buzzword that we use at SD Times to indicate, naturally enough, a third wave of virtualization.

The first wave was niche tools for the desktop. As you may recall, one of the first uses of VMware Workstation (which came out in 1999) was to help developers create test-bed environments that contained specific toolsets, or that could be set up to replicate particular clients’ runtime environments. Virtualization 1.0 also appeared in forms like Connectix’s Virtual PC, which let Macs run Windows, albeit slowly. (Microsoft acquired Virtual PC later on, and it was bundled with Office 2004 Professional for the Mac.)

This first wave focused on letting desktop operating systems do things they couldn’t do otherwise. Originally, most virtualization programs of this sort ran super-slowly: they consumed a lot of CPU cycles, sucked up too much memory, and on single-core hardware they were painful to use. However, if you really needed to emulate a specific environment for software testing, or if you really needed to run Outlook on a Mac, it worked.

The second wave of virtualization was driven by servers, with the big payoff being server consolidation. That’s where the big boom came in. Forget improving end-user productivity: here, the goal was to save money on servers and power. The second wave is behind nearly all of today’s popular interest in virtualization by CTOs, CIOs and data center managers, and for good reason: it can save companies a huge amount of money.

What about the third wave? Our special report, written by Andrew Binstock, explored the premise that virtualization is returning to the desktop in a new way. It’s not just for personal productivity or for letting Macs run Windows better (as you can with Parallels, for example). The rise of multi-core machines and improvements in hypervisor technology are turning virtualization into the newest secret weapon for developers and testers. Read the special report, and you’ll see what we mean.

In summary:

Virtualization 1.0: Emulation to let software run on incompatible desktop operating systems, or to create special-purpose test environments. Ran slowly, but got the job done.

Virtualization 2.0: Server consolidation. Saved money by improving utilization of CPUs and memory, and therefore requiring less hardware. Also eased deployment of software in the data center.

Virtualization 3.0: Improving software quality by giving developers and testers access to multiple virtual machines running at high speed, thanks to multi-core processors.
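
To make that concrete, here’s a minimal sketch of what a Virtualization 3.0 workflow might look like for a test team: a short Python script that boots several prebuilt virtual machines in parallel and runs the same test suite against each one. It assumes a VirtualBox-style VBoxManage command-line tool is installed and that the named VM images already exist; the image names and the run-ui-tests harness are placeholders, not real products.

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical, prebuilt VM images -- one per target environment.
    TEST_VMS = ["winxp-ie6", "winxp-ie7", "linux-firefox2"]

    def run_tests(vm_name):
        # Boot the VM with no visible console window.
        subprocess.run(["VBoxManage", "startvm", vm_name, "--type", "headless"],
                       check=True)
        try:
            # Run the (hypothetical) test harness against that environment.
            result = subprocess.run(["run-ui-tests", "--target", vm_name])
            return result.returncode
        finally:
            # Power the VM off so the next run starts from a known state.
            subprocess.run(["VBoxManage", "controlvm", vm_name, "poweroff"])

    # On a multi-core machine, the environments can run side by side.
    with ThreadPoolExecutor(max_workers=len(TEST_VMS)) as pool:
        for vm, status in zip(TEST_VMS, pool.map(run_tests, TEST_VMS)):
            print(vm, "PASS" if status == 0 else "FAIL")

The specific hypervisor doesn’t matter; the point is that on a multi-core desktop, those environments can run concurrently at reasonable speed, instead of being queued up serially on shared lab machines.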

Z Trek Copyright (c) Alan Zeichick
  1. Matt says:

    I, personally, am waiting for an ESX equivalent for the desktop. I’ve got various classes of users. Theoretically, with the proper login scripts, any of my operations users could log in to a developer workstation, and vice versa, but both sets of tools would have to be installed locally, on every machine, or have network installs. It could get messy, license-wise and software-update-wise.

    OR

    They could boot their ESX-type OS, log into it, and automagically have their machine, regardless of where they’re actually located. This is especially useful if you’ve got mixed Linux / Windows users like I do.

    It’s much easier to do backups if you have a single disk image, plus you have the capability of doing server-side snapshots.

    I think something like this may be the next “killer app”.
