Gerald Haigh asks what virtualisation is, how it works, and why so many network managers are talking about it.

Taking the ‘what’ question first, it’s a way of drastically reducing the number of servers needed to run a Microsoft-based school network. So a big school might have a couple of dozen servers, each doing a different job. Virtualisation will make it possible to replace them with perhaps nine, or even fewer. How? By replacing many of the physical servers with virtual servers – that is to say, servers that exist as software rather than as big metal boxes. The virtual servers are grouped into clusters, and each cluster lives on a powerful physical server.

And why are so many organisations – not just schools – going down that road? For two main reasons.

First, a virtualised server system, provided it’s properly done, is more efficient and reliable; second, it costs less. In fact it can be spectacularly cheaper both to install and to run. At West Hatch School in Essex, for example, currently moving to a virtualised system, the expectation is that instead of replacing three of the school’s 24 servers each year at £3,000 each – £9,000 annually in all – they will need to spend only £3,000 every three years on replacements in the virtualised environment. With fewer servers running there are also significant energy savings – West Hatch estimates it will cut by a third the £12,000 it currently costs to run the servers.
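For anyone who wants to see the sums behind those figures, here is a quick back-of-the-envelope calculation, sketched in Python purely for illustration. The numbers are the ones quoted above and will of course vary from school to school.

```python
# Back-of-the-envelope savings calculation using the West Hatch figures
# quoted in the article. Illustrative only; every school's numbers differ.

SERVER_COST = 3_000                 # £ per replacement server

# Before: three of 24 physical servers replaced every year
hardware_before = 3 * SERVER_COST   # £9,000 per year

# After: one £3,000 replacement every three years
hardware_after = SERVER_COST / 3    # £1,000 per year

# Energy: the £12,000 annual running cost is expected to fall by a third
energy_saving = 12_000 / 3          # £4,000 per year

total_annual_saving = (hardware_before - hardware_after) + energy_saving
print(f"Estimated annual saving: £{total_annual_saving:,.0f}")  # £12,000
```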

It’s a story that’s being repeated across the country. Among other examples, Lodge Park Technology College in Northamptonshire, reducing from 20 servers to six, will save £6,000 to £10,000 a year on hardware and take a huge chunk off the energy bill. And at Wootton Bassett School, the expected saving is a massive £47,744 in the first year and £23,744 every year after – which, as they’re quick to point out, is the equivalent of an NQT (newly qualified teacher) every year.

But why has all this suddenly come into view? The answer lies in the arrival of Microsoft’s Windows Server 2008 R2 Hyper-V. Without going into the technicalities, let’s just say that although virtualisation has been around for some time, it’s Hyper-V that makes it cheap and relatively easy for schools to do – a facility that network managers are rapidly exploiting, finding their way with it right now and blogging, tweeting and meeting to share their experiences.

What’s very striking is that they’re every bit as interested in the improved service they get from the new system as they are in the cost savings. Alan Richards, information systems manager at West Hatch School, interviewed for a yet-to-be-published Microsoft case study, says: “It’s obviously nice to save that money but the main reason for the change is to ensure reliability and sustainability for the school.”

The virtual system was built only after an overhaul of the network and the introduction of wireless

The background here is that Alan, arriving at West Hatch in 2008, inherited a network that was unreliable and consequently underused. In common with the networks in many schools, it had been added to over the years with switches and hubs spread around the campus, difficult to track and maintain. Since then he has overseen a complete renewal of the network, with a managed wireless solution, and only after that has he built the virtualised server system.

The key to the virtualised system’s improved reliability lies in the way it deals with failure. Any physical server, in even the best of systems, can and will fail occasionally, and usually it takes down with it the applications it provides. In a virtualised system, however, if a physical server fails, all of its virtual servers automatically and seamlessly move to another physical server. Nobody out there in the school even knows it’s happened. In the trade it’s called “failover”, and in a school that’s been beset by network frustrations it’s the killer application that restores faith in the use of ICT.
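To make the idea a little more concrete, here is a toy sketch of the principle in Python. It is not real Hyper-V code – in practice the failover clustering built into Windows Server does this work automatically – and the host names and memory figures are invented for the example.

```python
# Toy model of failover in a virtualised cluster. Purely illustrative –
# in a real deployment Windows Server's failover clustering and Hyper-V
# do this work; the names and numbers here are invented.

class Host:
    def __init__(self, name, capacity_gb):
        self.name = name
        self.capacity_gb = capacity_gb   # RAM available for virtual servers
        self.vms = {}                    # virtual server name -> RAM used (GB)

    def free_gb(self):
        return self.capacity_gb - sum(self.vms.values())

def fail_over(failed, survivors):
    """Move every virtual server from a failed host onto the surviving hosts."""
    for vm, ram in failed.vms.items():
        # Place each virtual server on the survivor with the most headroom.
        target = max(survivors, key=lambda h: h.free_gb())
        if target.free_gb() < ram:
            raise RuntimeError(f"No host has room for {vm} – cluster under-sized")
        target.vms[vm] = ram
        print(f"{vm} moved from {failed.name} to {target.name}")
    failed.vms.clear()

# Three physical hosts, each running a few virtual servers (invented figures)
a = Host("host-a", 64); a.vms = {"sharepoint-1": 16, "dns": 4}
b = Host("host-b", 64); b.vms = {"sharepoint-2": 16, "print": 8}
c = Host("host-c", 64); c.vms = {"file": 24}

fail_over(a, [b, c])   # host-a dies; its virtual servers carry on elsewhere
```

The point the sketch makes is the one in the paragraph above: because the virtual servers are just software, they can be picked up and run somewhere else the moment their physical host disappears.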

So, is it all good news? Or is there a ‘but’ in this story anywhere? Only insofar as there’s a real need to approach virtualisation very carefully and unhurriedly. Alan Richards spent a year planning his project, including running a long-term test with one server and monitoring and measuring existing network use over a considerable period of time.

The basic need is to get the number of physical servers right, together with the way the virtual servers are allocated between them. The twin aims are expandability, to accommodate rapidly growing ICT use, and redundancy, so that there’s room on any server to accommodate failovers when another server goes down. The upside of this is that it provides the opportunity to rethink the whole system, maybe, as at West Hatch, replacing one SharePoint server with two virtual versions so as to accommodate expected future expansion.
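Purely as a sketch rather than a recipe, the ‘redundancy’ part of that sizing exercise boils down to a simple check: if any one physical server goes down, can the others absorb its virtual servers? A hypothetical check in Python might look like this (all figures invented):

```python
# Hypothetical N+1 sizing check: could any single host fail and still have
# its virtual servers absorbed by the rest of the cluster? Figures invented.

hosts = {
    # host name: (RAM capacity in GB, RAM committed to its virtual servers in GB)
    "host-a": (64, 40),
    "host-b": (64, 32),
    "host-c": (64, 24),
}

def survives_single_failure(hosts):
    for failed, (_, load) in hosts.items():
        spare_elsewhere = sum(cap - used
                              for name, (cap, used) in hosts.items()
                              if name != failed)
        if load > spare_elsewhere:
            print(f"Not enough headroom if {failed} fails "
                  f"({load} GB needed, {spare_elsewhere} GB spare)")
            return False
    return True

print("Cluster has failover headroom:", survives_single_failure(hosts))
```

In practice the measuring Alan did over a long period – how much memory, processing and storage each service really uses – is what puts trustworthy numbers into a calculation like this.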

Alan’s work on virtualisation at West Hatch is impressive in its attention to detail and clarity of thought. The result is going to be what he’s aimed at all the way through – a first-class, reliable ICT infrastructure ready for 21st century learning.

More information

Server virtualisation features in numerous posts on Ray Fleming’s Microsoft Schools Blog (search “server virtualisation”)
Alan Richards has several posts on Hyper-V virtualisation on the Learning Gateway blog he shares – if you want the techie stuff, you’ll find it there.
Microsoft web page on "Virtualization with Hyper-V"

Gerald Haigh is an educator, freelance writer and journalist, and an expert on school management systems. He has a regular column focused on school capital projects such as Building Schools for the Future on the National College's Future website.