Even those techno-phobes who’ve been living with their head under a rock these last twenty years will have heard of virtual reality. It’s a recreation of reality in a virtual environment; one that doesn’t physically exist.
Virtualisation is the current buzzword in information technology. It's all about the software and techniques that let you create and maintain virtual computers: machines that don't physically exist, but run productive services as guests inside a host machine – usually a physical server.
The prevailing myth is that virtualisation is a technology only relevant to large organisations, hosting companies and outsourcing specialists with large data centres and massive computing needs. However, virtualisation is now at a stage where it is of value to smaller businesses. If you can cut through the forest of new jargon and buzzwords, it's easier to see the benefits.
Looking at the obvious, at what point does running one big server cost less than running several? Leaving aside the cost of the rack space and the electricity, you've got the service and maintenance contracts, the support time in patching, updating and maintaining several machines, then the capital outlay on the individual machines… If you could consolidate a dozen servers onto two or three, that's a saving right there.
Let’s go further.
Supposing you can separate functions – email server, database server, fileserver, firewall, backup server, webserver, intranet server – to run on discrete virtual machines; functions which currently either fight each other on a single box or force you to keep buying more and more boxes.
Let's say you can spawn new instances of identically specified virtual machines in a matter of minutes. Better still, spawn new instances of different types of generic virtual machines – appliances – each of a certain spec with a pre-loaded base stack of software on it. Within minutes you can add your top layer of applications software or copy some data onto it. Play around with this virtual appliance, run tests, try out software upgrades, run a specific piece of work – say a segregated database clean-up, a marketing campaign, or a piece of software development – take a snapshot or back up the data, then close it down.
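With a free tool like VirtualBox (mentioned later in this piece), that whole cycle is a handful of commands. A rough sketch – the VM and appliance names here are made up for illustration, and you'd need VirtualBox installed for any of this to run:

```shell
# Import a pre-built appliance (an .ova file with the base software stack).
# "base-appliance.ova" is a hypothetical file name.
VBoxManage import base-appliance.ova

# Take a snapshot before experimenting, so you can roll back later.
VBoxManage snapshot "BaseAppliance" take "before-upgrade-tests"

# Start the machine without a display window, as a background server.
VBoxManage startvm "BaseAppliance" --type headless

# ...run your tests, upgrades or one-off job, then shut it down cleanly.
VBoxManage controlvm "BaseAppliance" acpipowerbutton
```

If the experiment goes wrong, `VBoxManage snapshot "BaseAppliance" restore "before-upgrade-tests"` puts the machine back exactly as it was – which is the point of treating servers as disposable appliances.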
Why not clone a production machine – say for testing against a live environment, or to increase capacity in production? Change the identifiers, clear or load data, promote it to live, and there you go.
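Again taking VirtualBox as a concrete example, cloning and re-identifying a machine looks something like this – the VM names are hypothetical, and changing the hostname and any application-level identifiers inside the guest is still up to you:

```shell
# Clone an existing production VM and register the copy with VirtualBox.
VBoxManage clonevm "ProdServer" --name "ProdServer-Test" --register

# Give the clone its own MAC address so the two don't clash on the network.
VBoxManage modifyvm "ProdServer-Test" --macaddress1 auto

# Bring the clone up as a background server for testing.
VBoxManage startvm "ProdServer-Test" --type headless
```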
As long as you have the capacity in a few large physical servers, there’s nothing to stop you having a dedicated virtual machine for any purpose you want.
If you can separate the processing from the data from the operating system, you just simplified your trouble-shooting and maintenance.
Virtual machines reduce a complex backup process to a few files – large files, perhaps, but few. And now that the task is separated from the hardware, you can get rid of those old boxes running a single legacy application by converting them to virtual machines.
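That conversion is usually a matter of imaging the old machine's disk and turning the image into a virtual disk format. As one hedged example using tools the article mentions later (the file names here are illustrative, and you'd image the legacy disk first with a tool such as `dd`):

```shell
# Convert a raw disk image of the old machine into a VirtualBox disk...
VBoxManage convertfromraw legacy-disk.raw legacy-disk.vdi --format VDI

# ...or into QEMU/KVM's qcow2 format, if that's your hypervisor.
qemu-img convert -f raw -O qcow2 legacy-disk.raw legacy-disk.qcow2
```

Attach the resulting virtual disk to a new VM, and the legacy application carries on none the wiser – and backing it up is now just copying that one file.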
The really neat part of all this? All the virtual machines think they are real! The virtualisation software provides a virtual hardware layer they sit on.
This also means you can create a properly industrial-strength IT environment that would otherwise cost an arm and a leg in dedicated hardware, racking, electricity, air conditioning and so on. You can run proper security around your virtual network. You can have built-in redundancy if any virtual machine breaks down: start a new instance, load up a snapshot, copy in the latest data, off you go.
Take care of your reduced number of physical servers and you have a set-up that's as reliable as, if not more reliable than, the old mish-mash of legacy kit.
How do you think virtual servers work within ISPs and cloud services these days? Virtualisation is the thing that makes the boom in cloud computing a cost-effective reality. Now here's a development: if your internet connection to the outside world is fast enough, you can outsource the physical maintenance of the boxes to a third party and run your virtual machines on those. Cut out a whole chunk of IT, get it off your premises and into the hands of a supplier that has the staff and the training to keep it all running. Look after as much or as little as you can comfortably accommodate.
Until a couple of years ago, virtualisation was dominated by a handful of specialist players, principal of which was VMware with VMware vSphere (with VMware Server, VMware Workstation and VMware Player existing under various free licences) and Citrix with its XenServer. Microsoft entered the space with Hyper-V, running on a familiar Windows interface, and actually brought the cost down. At the desktop level, Microsoft Virtual PC enabled small-scale use of virtual machines by anyone.
Licensing for software on individual virtual machines remains a cost component; just because the instance is virtual doesn't mean a free lunch. You'll still have to pay a licence fee for any commercial software you run on your virtual machines, although licence agreements are being updated to reflect the new way of working, so it may not be as bad (as in, expensive) as a one-to-one charge. Add to that a number of free, open-source virtualisation tools – VirtualBox, QEMU and KVM (take a look at the How-to page) among them.
This has been a quick introduction to what is a rapidly expanding technology, with just a few ideas of the uses and benefits. There are plenty of resources for further reading, not only with the suppliers and project sites mentioned, but dedicated news sites and white papers. I’ll just finish by saying as a working hack, having the ability to fire up virtual machines to test this and that and run the legacy accounts software in isolation is immensely useful. AJS