History of Virtualization

        Everyone is doing it, but when exactly did it start?

        As a technology, virtualization is not so “new”. It’s about 36 years old and was first developed in the 1960s to partition large mainframe hardware for better hardware utilization. IBM found a way to logically partition mainframe computers into separate virtual machines, which allowed a mainframe to run multiple applications and processes at the same time.

        In the 1980s and 1990s, virtualization was largely abandoned in favor of inexpensive x86 servers. The broad adoption of Windows and the emergence of Linux as server operating systems in the 1990s established x86 servers as the industry standard. The growth in x86 server deployments led to new IT infrastructure and operational challenges, such as:

        Low Utilization.
        A typical server used only about 10% of its total capacity, and at times as little as 3%. Organizations typically ran one application per server to avoid the risk of one application affecting the performance of another on the same machine.

        Increased Physical Infrastructure Costs.
        The operational costs of supporting a growing physical infrastructure skyrocketed over the years. Most computing infrastructure must remain active constantly, which results in high power consumption and costs regardless of utilization levels.

        Increased IT Management Costs.
        More complex IT environments demanded more IT administrators. Organizations spent a great deal of time and resources managing servers, driving the need for still more IT professionals to handle routine maintenance tasks.

        Insufficient Disaster Protection.
        Downtime was not easily preventable in physical IT infrastructures, and organizations were increasingly affected by outages of critical server applications and the inaccessibility of critical end-user desktops. More starkly, the rising threat of security attacks, natural disasters, and terrorism left business owners wondering just how secure their data really was.

        The recent rise of virtualization has improved the efficiency, utilization, and scalability of IT resources. By eliminating the one-server-per-application model, businesses were able to cut operating costs and reduce energy use. IT administrators could finally spend less time on maintenance and more time innovating.

        This is just a short background on this “new” IT industry trend… the rest is history ;)

        Source: VMware
