Virtualization In Manufacturing
Manufacturing Business Technology took time to talk to Frank Hill, director of manufacturing business development, Stratus Technologies, about virtualization. Hill explains what it is, how virtualization can help manufacturers take advantage of the processing power they may already have, and how to mitigate some of the risk facing companies every day.
Manufacturing Business Technology: There’s been a lot of talk about virtualization in the manufacturing world, especially as more companies support virtualization with their software applications. Can you talk about where virtualization is at and how it might apply to manufacturing?
Frank Hill: Virtualization is not a new technology. It’s been around since mainframe computing. It allows a company to take a single, large computer and divide up its resources in a very safe, isolated way, so that multiple applications and operating systems can all run on a single piece of equipment.
Computer processors have gotten a lot more powerful, and the biggest recent driver is multi-core designs. Just a couple of years ago computers ran on a single-core processor; then computers began using dual-core processors (essentially two processors in one socket). Now processors come with 10 cores, meaning you have 10 processors in a single server.
Most automation applications offered by industrial software companies don’t need much processing power. So what companies are doing, if they’re not using virtualization, is deploying 10 separate boxes, one computer or server per application. Each of those boxes is only using 10 to 20 percent of the memory capacity and processing capability of that machine. Obviously, this is a big waste because companies are buying a lot more equipment than they need. Plus, they’re paying for the power, maintenance and administration of all those boxes.
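The consolidation math Hill describes can be sketched with back-of-the-envelope arithmetic. The figures below are the illustrative ones from this discussion (10 boxes, 10 to 20 percent utilization each), not measurements:

```python
import math

# Illustrative consolidation arithmetic using the figures from the
# discussion above: 10 applications, one per physical box, each box
# only 10-20 percent utilized.
num_apps = 10
utilization_per_box = 0.15          # midpoint of the 10-20 percent range

# Work actually being done, measured in "whole servers" of capacity.
work_consumed = num_apps * utilization_per_box   # 1.5 servers' worth

# Number of fully utilized servers that could, in principle, carry the
# same load once the applications are consolidated as VMs.
servers_needed = math.ceil(work_consumed)        # 2

print(f"{num_apps} boxes are doing {work_consumed} servers' worth of work,")
print(f"which could consolidate onto {servers_needed} well-utilized servers.")
```

Even at the conservative midpoint, ten lightly loaded boxes are doing less than two servers' worth of real work, which is the waste virtualization eliminates.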
So what virtualization allows companies to do is leverage the processing horsepower of today’s modern servers to run multiple applications without any conflict. When I say applications, I’m referring to virtual machines (VMs). A VM is basically an operating system and an application (or a couple of applications), packaged and isolated so that a company can run them, under a hypervisor such as VMware, on a single server. Each server can run multiple VMs. One VM might be running Windows 2008, another could be using Linux, and yet another could be running Windows 2000, all without any conflict, on the latest, fastest chipset. This lets companies take advantage of modern, reliable, inexpensive server technology without having to buy a separate box for each application.
Virtualization is a well-known and well-utilized technology in IT. This year, it’s expected that 65 percent of new corporate applications will be deployed in a virtual environment. That means applications are no longer tied to the physical layer of a server; it won’t be one application and one database per machine. Companies can create a virtual environment in which VMs sit on top of individual pieces of hardware and leverage the full processing power of those servers. The applications still work, and virtualization offers companies tremendous benefits beyond server consolidation.
MBT: What are some of the other major benefits of virtualization?
Hill: Server consolidation is just the opening benefit to companies. There are also benefits around ease of deployment. Imagine a company has a server and adds a virtual layer on it. The company might already have three applications, three VMs, running on it, replacing three physical boxes. Say the company now has a new application it wants to deploy. If there’s enough processing power left over on the physical machine, within minutes the company can bring up a fourth VM and deploy whatever application it wants in that virtual environment, without purchasing hardware and without implementing new disaster-recovery or other protocols. So virtualization makes the deployment of new applications much simpler. It also makes managing that environment much simpler, because a company will have one physical box instead of 10.
It also makes upgrades a lot faster and easier. Say a company has a traditional, single server with an application running on it. If the company wants to upgrade the operating system or the application, it has to take that system down and plan a long period of downtime to install and test the new software before bringing that server back up. In a virtual environment, the company can simply create a new VM, load the updated software onto it, test it, bring down the running VM with the old software and bring up the new one. This dramatically shortens the planned downtime for system updates.
There are nice benefits around disaster recovery, too. Companies can offload or back up VMs easily because a VM is basically just a set of files, and those files are abstracted from the underlying hardware. Traditionally, if someone put an operating system and application on a server and wanted to move it to a new server, it would have to be the exact same kind of server, because that installation was tied to the hardware. In a virtual environment there’s an abstraction layer, so a company can plug in a thumb drive and copy the entire VM, including the operating system, the application and all the configurations. Someone can then walk that thumb drive over to another server running virtualization, plug it in and move the entire application onto a different piece of hardware.
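Hill's point that a VM is "just a set of files" can be shown with a toy sketch. Here a hypothetical VM is modeled as a directory holding a disk image and a configuration file, copied wholesale to another location, the way a thumb drive would carry it between hosts. The file names and contents are invented for illustration; real hypervisors each have their own on-disk formats (for example, VMware's .vmx/.vmdk or VirtualBox's .vbox/.vdi):

```python
import shutil
import tempfile
from pathlib import Path

# Toy illustration: a VM on disk is just files, so "moving" it is a copy.
# The file names below are invented stand-ins, not a real hypervisor format.
src_host = Path(tempfile.mkdtemp()) / "vms" / "line3-historian"
src_host.mkdir(parents=True)
(src_host / "disk.img").write_bytes(b"\x00" * 1024)            # stand-in disk image
(src_host / "vm.cfg").write_text("os=Windows 2000\ncpus=1\n")  # stand-in config

# Copy the entire VM directory to a "new server" (here, another directory).
dst_host = Path(tempfile.mkdtemp()) / "vms" / "line3-historian"
shutil.copytree(src_host, dst_host)

# The destination now has an identical VM: OS, application, configuration.
print(sorted(p.name for p in dst_host.iterdir()))  # ['disk.img', 'vm.cfg']
```

Because nothing in those files is bound to the original machine's hardware, the copy is a complete, bootable replica on whatever host the hypervisor runs on.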
MBT: What misconceptions about virtualization have you come across?
Hill: Many people feel that “anything new is scary.” Plants are mainly supported by engineers or a plant manager; they are not run by IT professionals. Because of this, there tends to be a lag between the time IT departments embrace a new technology and its adoption on the plant floor. The misconception on the plant floor is that adopting new technologies is going to be difficult.
There’s also concern about consolidating all applications onto one physical server, because companies worry about what happens if that server stops working; obviously, they would lose connectivity to every application at once. So in the minds of plant managers, there’s some safety in keeping those applications on separate physical boxes. It gives them some redundancy and peace of mind that everything isn’t going to fail at once, and they see consolidation as a risk factor. That’s a concern that companies like Stratus address with fault-tolerant computing. I guess the misconception is that virtualization is going to add risk, but that risk can be greatly mitigated or eliminated.
MBT: Where is virtualization heading?
Hill: In the world of corporate IT, 65 percent of new applications this year will be deployed in a virtual environment. I would say that 10 to 15 percent of automation applications are now being deployed in a virtual environment. That’s up from almost zero a year ago. Virtualization is gaining traction and I expect that half of all automation applications will leverage virtualization in the next 5 years.
Companies will eventually get over their fear of virtualization because there are so many proven benefits — clear in terms of ROI — that virtualization is going to become the standard. One place you can already see this is among the application software providers.
Only two years ago, Rockwell Automation didn’t support virtualization in its production automation applications. Now it has come out with support and is leveraging it as a competitive advantage. Other software providers, like Invensys and GE Intelligent Platforms, have all come out with support for virtualization in their production environments. So the doors are open.
For more information on what Stratus Technologies has to offer, please visit www.stratus.com.