Business & Strategy: Virtualization, ROI or Opportunity?
This year in September I attended IDC’s Virtualization Forum as a panel expert. The subject of the forum was the “ROI of Virtualization”. Having dealt with this subject on an ongoing basis for the last two years, I saw it as a good opportunity to share some of my experiences and help organisations find the right approach. While many of the questions were of a technical nature – “How do you operate the environment?”, “How do you plan for capacity?” – some questions allowed a focus on how to get there and why to “Virtualize” in the first place.
While there I, of course, attended many of the presentations throughout the day – not only to see how my contribution could add value to the overall content of the event, but also to pick up some new ideas. The major key messages I took away were:

  • Cope with limited budget

  • Build centralized infrastructure

Of course – and that is the idea of such events – the subject of virtualization does not mean much, or the same thing, to everyone. So let’s look into it.

What really is Virtualization?

Per Wikipedia the definition of Virtualization is:
“In computing, virtualization is the process of presenting a logical grouping or subset of computing resources so that they can be accessed in ways that give benefits over the original configuration. This new virtual view of the resources is not restricted by the implementation, geographic location or the physical configuration of underlying resources. Commonly virtualized resources include computing power and data storage.
A new trend in virtualization is the concept of a virtualization engine, which gives an overall holistic view of the entire network infrastructure by using the aggregation technique. Another popular kind of virtualization, and currently very referred to in the media, is the hardware virtualization for running more than one operating system at the same time, via nanokernels or Hardware Abstraction Layers, such as Xen.”
(Wikipedia on: Virtualization)
In simple terms, Hardware Virtualization involves the emulation of hardware components and the ability to make an operating system believe it is running on the REAL THING. However, there are many more types of Virtualization:
  • Storage Virtualization
    A technology to superimpose a (theoretically) unlimited number of storage resources over a shared physical storage infrastructure, i.e. a SAN (Storage Area Network). An example of such an environment is something I deal with quite often these days, namely the provisioning of “virtual” File Servers through NAS (Network Attached Storage) technology, META-LUNs, iSCSI LUN virtualization, etc.

  • Desktop Virtualization
    Also associated with hardware virtualization, whereby desktop operating systems are installed onto emulated hardware.
    However, another form of desktop virtualization is provided by Thin Client technology, whereby users access a “remote” desktop on a multi-user operating system. Today such an OS (Operating System) would be Microsoft Windows in Terminal Server mode. I will be writing about this subject in a future article.

  • Network Virtualization
    First available through Cisco Systems’ VLAN technology, allowing logical segmentation of network infrastructure on the MAC (Media Access Control) layer.

  • Security Layer Virtualization
    One form being the ability to provide “virtual” Firewalls, thereby cost-effectively creating a network architecture which not only increases the level of security within a corporate or service provider infrastructure but also allows a business to establish a TIERED, Service Oriented Architecture (SOA) in network infrastructure terms. Another subject I have been heavily involved in.

  • Network Name Virtualization
    Commonly known as Clustering and Network Load Balancing.

  • Last but not least, Application Virtualization
    Existing in various forms such as multiple-instance application or database services, application partitioning, and hosted partitioning, often known as “Virtual Private Server” ISP offerings supported by Virtuozzo operating system virtualization technology. Even Thin Client Application Publishing can fall under this category.

Back to the original subject of Hardware Virtualization.
The technology has been on the market for quite some time – actually since the dawn of the mainframe, where it was originally known as “Hardware Partitioning”. Emulation of hardware components (Hardware Virtualization) on the Windows platform was first developed by VMware – now an EMC company. Microsoft followed suit with the acquisition of Connectix. Connectix started with Virtual PC for MacOS and then ported the product to Windows. Microsoft moved into a competing position by pushing an evolution from Virtual PC to Virtual Server (the desktop heritage still persists in the product).
A system comprising emulated hardware is known as a Virtual Machine, or VM. This is not to be confused with a Java Virtual Machine or similar technologies, although there are similarities in concept, namely the provisioning of an isolated environment for the purpose of running software.

Bringing the focus back to why I attended this forum, what messages I heard and – of course – what my thoughts were.

Various messages conveyed…


Why attend this forum?
Simple answer: I know about, and have played with, virtualization for quite a few years. I have attended events with similar subjects in the past. More importantly, I have driven Virtualization as a strategy since joining PwC. What this entails I will explain in detail below.
I was also asked to share my experiences, the successes and some advice with IT organisations.

I already mentioned the key messages I took away from the overall event. What other messages came across?
  • VMware still has a strong market share; ESX is predominantly being implemented.

  • No particularly significant implementation of Microsoft Virtual Server

  • Spend your money wisely. Many might agree that $17k for a single underpowered server (3 GB RAM), which is also a single point of failure, migrating from multi-purpose physical to multi-purpose virtual, still has to prove its ROI (a rough back-of-envelope sketch follows this list).
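To illustrate that last point, here is a rough back-of-envelope sketch in Python. Apart from the $17k host price quoted above, every figure in it (per-server lease cost, maintenance, time horizon) is an invented assumption, so read it as a way of framing the question rather than as real numbers.

    # Back-of-envelope comparison: N physical servers vs. one virtualization host.
    # All figures except the 17k host are illustrative assumptions, not real data.

    def consolidation_savings(guests: int,
                              physical_cost: float = 4_000.0,   # assumed lease/purchase per physical server
                              physical_maint: float = 800.0,    # assumed yearly maintenance per physical server
                              host_cost: float = 17_000.0,      # the "17k" host from the bullet above
                              host_maint: float = 2_500.0,      # assumed yearly maintenance for the host
                              years: int = 3) -> float:
        """Total savings over `years` from consolidating `guests` servers onto one host."""
        physical_total = guests * (physical_cost + physical_maint * years)
        virtual_total = host_cost + host_maint * years
        return physical_total - virtual_total

    if __name__ == "__main__":
        for n in (2, 4, 8, 12):
            print(f"{n:2d} guests on one host: savings over 3 years = ${consolidation_savings(n):,.0f}")

With only two or three guests the host is a net loss over three years; the savings only appear once it carries a healthy number of guests, which is exactly the point of the bullet above.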
All the cases presented had one common denominator: CENTRALIZATION of server infrastructure.
  • Operations and standardization clearly benefited from virtualization
    Server imaging became more efficient

  • The processes around Disaster Recovery improved due to the simple fact that Virtual Infrastructure was easier and faster to restore

  • Hardware cost decreased. Licensing cost – of course – did not.

All in all it was a very interesting and useful event. Not much of it was a revelation for me, of course, but it was still valuable that IT people from all levels of management, administration and support could share their views and experiences during breaks as well as the lunch sessions (one of which I had the opportunity to chair, the subject being: How to manage Virtual Infrastructure).

So what knowledge did I pass on? Before I come to that let me give you some background.

The Strategy


I had just joined PwC and noticed that Virtualization had been on the table for quite some time… on a “Whiteboard”, in an inconspicuous place. Furthermore, it was not really identified as “Virtualization” but, the reader might guess… yes, correct, “Server Consolidation”.
When I took on this quite abandoned subject matter, in October 2004, I began by finding out why it was there in the first place.

A number of facts:
  • We leased servers. Very tax efficient; like leasing cars, this is a full write-off. Furthermore, it is widely known as a cost-effective path towards technology refresh. So where lies the problem?
    Answer: the number of server leases to be managed, over 100 servers a year. The process of managing a lease replacement is quite tedious as it requires:
    • Planning of decommissioning, commissioning and the transition

    • Rebuilding the original server configuration

    • The hassles associated with compatibility issues on newer platforms, especially for legacy applications

    • Complex migrations where original distribution files are missing

    • Time and effort

    • Lost support knowledge due to staff turnover

    • Paperwork pileup and resulting overtime associated with Asset Management
  • Server maintenance. There were some servers that we owned. All of those were outdated and expensive to maintain. Very expensive, unlike a well-working car (like the old Volvos and VWs).

  • A large part of our server infrastructure was distributed throughout our offices in the entire country (Canada).

  • For historic reasons server builds differed from office to office

  • A large number of servers were underutilized

Here is where Virtualization came into play. My strategy was based on these and further elements, essentially resulting in two streams (a simple planning sketch follows this list):

  • Consolidate underutilized servers, mostly infrastructure utilities, while maintaining high-availability in the Data Center

  • Consolidate and standardize server roles in all offices, creating our “Virtual Office Infrastructure”
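To give an idea of how the first stream was planned in practice, here is a small sketch in Python. This is not our actual toolkit, just an illustration of packing guest memory footprints onto virtualization hosts with a first-fit-decreasing heuristic; the 4 GB host size echoes the specification mentioned further below, while the guest names and sizes are made up.

    # First-fit-decreasing packing of guest memory footprints onto virtualization hosts.
    # Host capacity and guest sizes are illustrative assumptions only.

    HOST_RAM_MB = 4096 - 512          # assumed 4 GB host, reserving 512 MB for the host OS

    def plan_consolidation(guests: dict[str, int]) -> list[list[str]]:
        """Assign guests (name -> RAM in MB) to hosts, largest guest first."""
        hosts: list[tuple[int, list[str]]] = []   # (free RAM, guest names) per host
        for name, ram in sorted(guests.items(), key=lambda kv: kv[1], reverse=True):
            for i, (free, names) in enumerate(hosts):
                if ram <= free:
                    hosts[i] = (free - ram, names + [name])
                    break
            else:
                hosts.append((HOST_RAM_MB - ram, [name]))
        return [names for _, names in hosts]

    if __name__ == "__main__":
        office_servers = {          # hypothetical single-purpose guests for one office
            "DC01": 512, "PRINT01": 384, "FILE01": 1024,
            "WSUS01": 768, "AV01": 512, "APP01": 896,
        }
        for i, placement in enumerate(plan_consolidation(office_servers), start=1):
            print(f"Host {i}: {', '.join(placement)}")

Memory is of course only one dimension; CPU, disk I/O and the high-availability requirements of the Data Center stream weigh in just as much, but the planning exercise is essentially the same.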

For the Data Center I envisioned VMware ESX and VMotion; for the offices, Microsoft Virtual Server 2005 Release 1 (the current release at the time).
While working on the strategy I gained in-depth experience with a production version of Virtual Server (fresh from Microsoft) and got to know all its strengths and weaknesses. In fact, during that time I had the opportunity to pass my mindshare on to Microsoft’s Kurt Schmucker, Virtual Server Program Manager and former Connectix executive.

Moving forward…


The Strategy was followed by a POC (Proof of Concept), followed by a Pilot, followed by deployment. Before going to deployment I spent intense hours developing the overall deployment plan, preparing and hosting pre-deployment workshops, and finalizing a production release of a Rapid-Deployment & Support Toolkit (I had started developing this toolkit during the Strategy/Lab phase).
It started with one individual (the reader may guess who); the POC phase already involved two: me plus the poor chap who had to do all the migration work.
He was in a particularly bad situation, as he had to balance his project efforts with his regular operational workload (thanks, Oleg).
The national deployment eventually grew to a 60-person effort.
From Strategist to Technical Lead and Architect to PM to Trainer to Deployment Manager, I had to balance all the different activities in different geographical locations and keep tabs on various technical issues including networking, migration, rollout and server swaps. Yes, server swaps. One of the challenges, and possibly a PM masterpiece, was to coordinate the rollout in a way that let us leverage brand new server hardware already deployed in offices and designated as Virtualization Hosts. Due to some history we had this hardware deployed as domain controller replacements (see lease-ends above).
On this note I should add that the strategy included various specifications for servers deemed to run Virtualization software (published as Best Practice to PwC global forums). The machines in question were 1U Dell servers with 4 GB of RAM.
The rollout plan was quite similar to a complex strategy game (for the geeks among us) in which you have to find the one solution that lets you ship and return hardware while wasting and losing nothing. In the end we achieved the rollout of 65 servers to 21 offices, plus 21 benchmarking stations, plus 21 hosting servers. In total we deployed 107 systems, all within 2½ months. In fact, the ambitious goal was to deploy within 1½ months; still comparably impressive.
The benchmarking idea was developed by our client support team: we shipped out Windows XP based Virtual Machines which conducted objective benchmark tasks.
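For the curious, the kind of objective benchmark such a shipped VM might run can be sketched in a few lines of Python. Our actual toolkit and workload mix were different (and Windows XP based), so treat this only as an illustration of the idea: time a fixed CPU and disk workload so that results are directly comparable between sites and between physical and virtual machines.

    # Sketch of a simple, repeatable benchmark a shipped test VM might run.
    # The workload sizes are assumptions, not the actual benchmarking toolkit.

    import os
    import tempfile
    import time

    def cpu_benchmark(iterations: int = 2_000_000) -> float:
        """Time a fixed amount of integer arithmetic, in seconds."""
        start = time.perf_counter()
        total = 0
        for i in range(iterations):
            total += i * i % 97
        return time.perf_counter() - start

    def disk_benchmark(size_mb: int = 64, block_kb: int = 64) -> float:
        """Time writing and reading back a scratch file, in seconds."""
        block = os.urandom(block_kb * 1024)
        start = time.perf_counter()
        with tempfile.NamedTemporaryFile() as scratch:
            for _ in range(size_mb * 1024 // block_kb):
                scratch.write(block)
            scratch.flush()
            scratch.seek(0)
            while scratch.read(block_kb * 1024):
                pass
        return time.perf_counter() - start

    if __name__ == "__main__":
        print(f"CPU:  {cpu_benchmark():6.2f} s")
        print(f"Disk: {disk_benchmark():6.2f} s")

Because every station runs the identical workload, the timings can be compared directly between offices and against the physical baseline.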
Not only did we migrate server roles manually, we also restructured from multi-purpose to single-purpose server environments. Naturally this involved manual steps, which in fact was a faster and better-established approach than one involving commercial migration tools. Furthermore, wherever we had Windows 2000 we now had Windows 2003; so, Migration and Upgrade at the same time.
The big win: dedicated servers make lockdown and delegation more efficient. Naturally we operate an Active Directory infrastructure, and GPOs (Group Policy Objects) based on server role, as well as management delegation, were no longer an issue.
We also adapted our backup approach, which in turn supported the envisioned disaster recovery methodology.

The future of IT…Utility…


What’s next? Well, while working on all this I realized early on, during the strategy development, that there is more to Virtualization than a mere operational approach. Initially considered suitable only for development and lab systems, Virtualization proved to be an excellent vehicle for provisioning and delivery. In addition, there was a growing need among development teams to jump onto the Virtualization bandwagon.
So, in the summer of 2005, I went about developing a Business Model which describes Virtualization as the foundation for a new service to provide systems “On-Demand” or, in other words, to introduce “IT Utility”. In order to get this done I leveraged my past experience from service provider and systems integrator environments and mapped it to our situation.
Based on this business model we started to build a hosting farm which, within the first 2-3 months of launch, delivered a value of around $200,000. Did I say value? Did I say delivery? Yes indeed, a new aspect to IT Operations and IS departments.
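How does one arrive at such a value figure? Essentially by pricing the delivered capacity the way a service provider would. The sketch below is purely hypothetical (the VM classes, monthly rates and counts are all invented); it only shows the simple chargeback arithmetic behind an “IT Utility” number.

    # Sketch of a simple chargeback tally for an on-demand hosting farm.
    # VM classes, monthly rates and counts are hypothetical assumptions.

    MONTHLY_RATE = {        # assumed price per VM class, in dollars per month
        "dev_sandbox": 150,
        "test_server": 250,
        "project_host": 400,
    }

    def delivered_value(vm_months: dict[str, int]) -> int:
        """Total value of the VM-months delivered, priced by VM class."""
        return sum(MONTHLY_RATE[vm_class] * months for vm_class, months in vm_months.items())

    if __name__ == "__main__":
        first_months = {"dev_sandbox": 300, "test_server": 250, "project_host": 120}
        print(f"Delivered value: ${delivered_value(first_months):,}")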
In fact, it is not quite so new. The term “IT as a Business” has been around long enough, in fact since the 1980s, when ITIL (the Information Technology Infrastructure Library) was first developed. The shift from an operational, cost-focussed IT operation to a Service Oriented paradigm has been a subject of industry discussion for a few years now, especially with the dawn of outsourcing and offshore IT.
We did indeed manage to successfully use a platform which did not seem to be designed for this purpose at all: Microsoft Virtual Server. I leveraged the power of 64-bit technology in order to maximise the performance and capacity of our platforms (Release 2 was finally available and supported 64-bit, hurray).
Our farm is growing, and what was designed for Rapid Deployment of the “Virtual Office” is now the core delivery system for our IT Utility services. Our infrastructure now supports more than 200 virtualized systems. Business is good.

Some wisdom


Now, this little piece of history should have given the reader some of the messages.
Going back to IDC’s forum, some of the wisdom I passed on was:

  • Think strategy first
    Evaluate why you are doing it (know your goals), where it helps your organisation and the business as a whole, find the opportunities, and think
    long-term. Look at achieving cost reduction; server maintenance is one of many “opportunities”.

  • Structure
    Virtualization allows you to create an infrastructure by the book, efficiently maintained, managed and secured

  • Think Business
    In order to get buy-in from management and the business, walk backwards. Do not try to find justifications which “might” convince the business; instead, directly target the business goals. Understand where the business is going and what it needs to achieve its goals. Then identify where Virtualization, as a technology and an approach, fits into those requirements.

To make this "body of knowledge" complete and as closure, I advise to also consider the following for any courageos move into the "Virtual" World

  • Phased Approach
    When developing your goals think about working in phases. Some organisations have limited manpower, so prioritising achievements is of the essence.

  • Dedication & Motivation
    Have a dedicated team and make sure that tasks and responsibilities are clearly defined. Educate and motivate your staff as this will greatly work towards your goals.

  • Know-How
    Know the technology (Virtualization) and how it fits into your operational environment. Adapt the technology and make it work.
    Ingenuity and Innovation are the key.

  • Business as Usual
    Change your perspective but not your methods. Management and technical staff often do not realize that in fact very little changes in technical terms. Essentially you only replace REAL hardware with the VIRTUAL equivalent. Your normal operational processes and management techniques should not change.

  • Educate, Educate, Educate

  • And finally… don’t rush into it. An unplanned and unprepared move into a “Virtual World” will, with certainty, generate more headaches to catch up on in the aftermath. But do not be complacent either; the old “wisdom” of “don’t fix it if it isn’t broken” has never been of benefit. If Virtualization can improve your environment, pursue it consistently. Often enough, simply migrating old systems also means dragging along old mistakes with new methods; in that case, consider restructuring the technology as well.

About the Author
Frank Boesche is a Manager with PricewaterhouseCoopers Canada, responsible for IT strategy and enterprise end-to-end solutions focussing on business/technology alignment. He has been driving Virtualization since 2004 and pioneered Utility Computing and 64-bit technology on this platform. Mr. Boesche currently leads a national Thin Client/Utility initiative and is an active advisor to that industry.
You can contact him for further questions and feedback at frank.boesche@ca.pwc.com.
F. Boesche
