
Virtual Machines

Computers Inside Computers: Virtual Computing, Virtual Machines and VMware

            Our world is quickly becoming virtual. We buy goods with virtual currencies, we’re entertained by virtual reality gaming experiences, and much of our work is done by virtual assistants working a block, a country or a continent away.

            Perhaps no “virtual breakthrough” has been as important, however, as the development of virtual computing. It has changed the way that individuals, companies and governments work with, store and use data. In fact, it has made possible many of the 21st century’s other breakthroughs in virtual technology.

            At its simplest level, virtual computing is computers working “inside of” other computers. If that sounds somewhat confusing, here’s an explanation of what virtualization is and why it is so important.

[Image: Virtual machines and networking in a data center]

The History of Virtual Computing

Personal computers have been a backbone of modern society for decades, but some readers may recall how different things were during the 1950s and 60s when mainframes ruled the world. One user at a time could perform their desired task, whether it was a single computation or a batch job. Users would reserve time – sometimes days in advance – to have computer operators feed their stack of punch cards or rolls of punched paper tape into the machine, and then would wait hours or days for their output to arrive.

            The obvious inefficiencies of this approach, along with the increased memory and speed of each new generation of computers, led to the development and implementation of the “time-sharing” concept in the 1960s. Mainframes were programmed to recognize many different users at remote locations, keep track of each user’s status and jobs, and allocate shared computer resources in the most efficient way. Time-sharing originated at universities, but during the 1970s it grew into a major industry serving businesses and government agencies. It was also the first major example of ordinary users interacting directly with the computers they relied on.

            The evolution of commercially-viable personal computers, from the earliest Atari, Sinclair and VIC-20 models through the PC and Mac era, rendered the time-sharing concept largely irrelevant for most users. However, the basic idea behind time-sharing was key to the development of a new and crucial computing concept: virtual computing.

[Image: Commodore VIC-20]

            Traditional time-sharing allowed users to share memory and system resources with a single operating system running the computer. Virtual computing, first made available to the public in the early 70s, took the idea an important step further: each user had their own separate area of the mainframe, run by their own personal operating system with their own memory and storage – a “virtual computer” within the mainframe. If they crashed their applications, only their own “virtual computer” would go down rather than the entire machine. Computer resources could be allocated according to an account’s needs and requirements rather than having all users fighting for the same resources. And security was greatly enhanced, because each user only had access to their own section of the machine.

            “Virtualization” was used extensively by IBM in the 1970s, but primarily in-house; it was available to customers who bought IBM mainframes at that point, but they were actually encouraged to use standard mainframe operating systems such as DOS instead. Meanwhile, the 1980s breakthroughs which made affordable office and home computers widely available as replacements for large mainframes rendered the concept of virtual machines virtually moot throughout the decade. Widespread research and innovation into virtualization also lagged, because there wasn’t yet a simple way to run applications on virtual partitions.

            The landscape began to change in the late 80s and early 90s when a software emulator known as SoftPC was released. It allowed users to run DOS applications on a Unix platform installed on their machine. Later versions added compatibility for Windows applications and made the same capabilities available to Mac owners. Other companies realized the potential and began developing their own solutions, including Connectix, which released Virtual PC for the Mac in 1997 – and two years later the firm which would become the true giant of virtualization, VMware, began selling its virtual workstation products aimed primarily at personal computer users.

            Companies began to realize the vast economies of scale possible when an IT department services one central concentration of computer “servers” running large numbers of virtual machines. And the Internet’s rise in popularity created an entirely new industry: computer “server farms” hosting hundreds of machines for thousands of clients needing to serve website content. At around the same time came the launch of what we now consider true virtual computing: VMware’s release of two landmark virtualization products for computer servers in 2001.

ESX Server ran directly on the server hardware as a “bare-metal” hypervisor, needing no traditional operating system underneath to control the machine, with full support in each virtual partition for non-native applications. GSX Server was groundbreaking in a different way: it allowed companies or individuals to run virtual machines “on top” of a host operating system like Windows which was already installed on a computer. Other firms, small and large, jumped into the market, and virtual computing quickly became an established and important component of today’s computing environment.

Modern Virtual Computing and Virtual Machines

            The previous section looked at the general concept of virtual computing as it exists today. To explain it a bit more rigorously, however, virtual computing is the process of making one computer function as if it were many computers, with each of those virtual computers – or virtual machines – located “inside” the host computer.

            The ability to house virtual machines on a single computer comes from installing software products which enable virtualization; these products are often loosely referred to as “VMware,” although that is actually a brand name and a number of competing products perform the same function. Once the host computer has virtualization enabled, each user is given their own dedicated space on the machine, with a set amount of dedicated resources including processing power, memory and storage.
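
To make the idea of a fixed slice of resources concrete, here is a minimal sketch that drives Oracle’s free VirtualBox (one of the competing products discussed later in this article) from Python. The VM name guest01, the guest OS type and the resource figures are illustrative only, and the sketch skips steps such as attaching the disk and installation media:

    import subprocess

    def vbox(*args):
        # Invoke VirtualBox's VBoxManage command-line tool; raise on failure.
        subprocess.run(["VBoxManage", *args], check=True)

    # Create and register a new, empty virtual machine.
    vbox("createvm", "--name", "guest01", "--ostype", "Ubuntu_64", "--register")

    # Cap the VM at a fixed share of the host: 2 virtual CPUs, 2 GB of RAM.
    vbox("modifyvm", "guest01", "--cpus", "2", "--memory", "2048")

    # Give the VM its own 20 GB virtual disk (the size is given in MB).
    vbox("createhd", "--filename", "guest01.vdi", "--size", "20480")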

Each user simply has to connect to their virtual machine remotely over a wired or wireless network in order to install and run operating systems and software, access storage space and utilities, perform backups and do everything else they could do on a “real” machine sitting right in front of them. The process essentially gives a small user one or more powerful computers working for them, without having to invest in expensive machines to provide all of those resources locally.
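
As one hedged illustration of that remote connection, the VirtualBox guest sketched above can be booted “headless” – with no window on the host at all – and its RDP-compatible VRDE remote display server switched on, after which a user can reach the machine from any ordinary RDP client. The port number is again illustrative:

    import subprocess

    # Enable VirtualBox's VRDE remote-display server on the guest, then
    # start the VM headless so it is reachable only over the network.
    subprocess.run(["VBoxManage", "modifyvm", "guest01",
                    "--vrde", "on", "--vrdeport", "3389"], check=True)
    subprocess.run(["VBoxManage", "startvm", "guest01",
                    "--type", "headless"], check=True)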

Conversely, the process allows large corporations or institutions to have centralized computer installations servicing an enormous number of employees or clients with virtual machines, eliminating the need to install powerful individual machines in numerous, widespread locations. And the use of VMs permits computer resources to be assigned where they are needed, making the most efficient use of each machine.

[Image: A mainframe computer in a data center – a machine whose role can be replaced by virtual machines, emulation and load sharing]

As virtual computing has become commonplace, many other uses for the concept have emerged. In the personal computing world, for example, it’s now easy to virtualize your own home computer as long as it has a fairly recent operating system and a decent amount of resources. Some people use that capability to set up a separate virtual machine running Windows on a Mac, giving them access to the full range of applications available for both platforms.

Others establish different virtual machines for different members of the household, separate VMs for personal and business use, and so on. It’s also become common for computer-savvy users to do their web surfing and email inside a virtual machine. This ensures that if they contract a virus or other malware, only the VM will be infected, while their primary operating system and storage are left intact.
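
One hedged way to get that throwaway-sandbox behavior, using the same illustrative VirtualBox guest as in the earlier sketches, is with snapshots: record a known-clean state, do the risky browsing inside the guest, then roll the guest back:

    import subprocess

    def vbox(*args):
        subprocess.run(["VBoxManage", *args], check=True)

    # Record a known-clean state before going online.
    vbox("snapshot", "guest01", "take", "clean-baseline")

    # ... browse the web and read email inside the guest ...

    # Afterwards, power the guest off and roll back, discarding anything
    # malware may have written inside the virtual machine.
    vbox("controlvm", "guest01", "poweroff")
    vbox("snapshot", "guest01", "restore", "clean-baseline")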

Corporations and other major institutions have found virtual computing to be a godsend, both for operating efficiency and cost effectiveness. The managers of IT departments are able to run any number of virtual machines and applications on a much smaller number of computer servers. They’re able to centralize information and records which can then be accessed by virtual users throughout a building, in offices in remote cities or countries, or even on the road by means of a laptop or tablet – allowing systemwide access to information and processing functions. And they’re able to save enormous amounts of money by reducing the number of powerful computers required throughout their company, as well as by cutting down on power and cooling bills.

Virtual computing has also become important for operators of websites and those who perform 24/7 functions in the online environment. Websites can be hosted on virtual machines located at a central server farm where all maintenance and monitoring functions can be consolidated, while those who have online programs churning 24 hours a day can have them running on a VM without affecting their own computing ability.

VMware and Similar Products

[Image: VMware logo]

            During our look at the history of virtual computing we mentioned that SoftPC and Virtual PC were early entrants in the virtualization software field, but the launch of VMware’s ESX and GSX Server products in 2001 marked a turning point in the use of virtual machines. VMware’s software has naturally been improved and rebranded over the years, and new products have been added, but despite competition VMware remains the dominant force in the industry, with more than half of the current market share.

            The VMware lineup includes:

  • ESX Server: Now branded under the vSphere umbrella, this remains VMware’s flagship product, allowing servers to run virtual machines directly without a traditional operating system controlling the hardware. It has grown much more robust, with supporting products that let data centers easily manage the virtual machines they have installed. ESXi Server is a slimmer version which does away with the console operating system.
     
  • VMware Fusion: Designed for the Mac, this product allows Linux, Windows and other operating systems to run as guests on top of Mac OS X.
     
  • VMware Server: A free product, this is the latest iteration of GSX Server, and it runs on either Linux or Windows machines.
     
  • VMware Player: This is another free product which lets Windows users take a test run of VMware before deciding to use the company’s products.
     
  • VMware Workstation: This product runs on Linux and Windows and is primarily for administrators and technicians, allowing them to run a number of VMs simultaneously on a single machine in order to test software compatibility.
     
  • Virtual Desktop Manager: Designed for “thin clients” (machines that don’t do any of the processing work themselves), this product presents the user with a desktop environment while the actual virtual machine runs on a server.

As you can tell, this wide variety of products can serve anyone in need of software to create or run a virtual computing system.

            Competitors have been slowly eating into VMware’s market dominance, with several poised to become serious players in the near future. They include:

  • Microsoft Hyper-V: The computer giant launched this product in 2008, fearing that VMware would eventually lead customers to abandon Windows platforms. Hyper-V competes primarily with vSphere (ESX Server) but is nowhere near as full-featured. It is, however, less expensive. And Microsoft has leveraged the widespread usage of its other server software to gain market share by giving Hyper-V away for free with those server products, making it the competitor most likely to challenge VMware going forward.
     
  • XenServer: Produced by Citrix, XenServer is free software which doesn’t require a native operating system to create and operate virtual machines on a computer. To date it’s not a big player in the market, but the fact that users don’t have to pay a licensing charge (as they do with vSphere, for example) may give it a boost.
     
  • Other Products: Major industry companies like Oracle (VirtualBox) and Red Hat (RHEV) are also mentioned as potential challengers to VMware, but the fact that their products are aimed primarily at advanced users who can customize code makes it unlikely that they’ll make a major dent in VMware’s market share.

Downsides of Virtual Computing

            Every advancement in technology has its downsides, and virtual computing has several. IT and server-company staff face a greatly increased workload monitoring and managing computers that host many virtual machines, and security is a much bigger issue when so many users access a computer remotely. There are also open questions among software producers about licensing fees, since one computer with fifteen VMs may be running fifteen copies of the same software product – does that require one software licensing fee, fifteen, or something in between?

            Despite these issues and questions, virtual computing is here to stay. In fact, advances such as cloud computing and “grid computing” (tying data centers together into one monster computing grid) clearly show that virtualization will continue to grow in importance, in many different forms.

 

-Patrick Oliphant, 06 June, 2012