Government looks beyond servers - Virtualization 6

Storage: the last frontier?

from GCN.com - Planned maintenance is the primary cause of downtime on a computer system. Storage virtualization is one way to mitigate or eliminate those planned downtimes, some experts say.

“We’ve been doing virtualization of storage through IBM’s SAN Volume Controller for at least three years,” said Tony Encinias, chief technology officer of Pennsylvania’s Office of Information Technology.

“Virtualized storage reduces the downtime when we have to restore or migrate data offsite,” Encinias said.

Virtualization of storage will be a huge future requirement for the Arizona Department of Environmental Quality, Harkin said.

For instance, much of the department’s documentation work — retention schedules and permits — is done on paper. Now the department must digitally store all of that information, he said.

There are many different types of technology that can be used for storage virtualization. The right fit depends on the environment, ATS’ Smid said.

For an agency with a small IT shop that struggles with utilization issues — for example, an IT department that does not use all its storage capacity or whose storage area is tapped out — there are relatively inexpensive technologies.

For instance, storage arrays from NetApp and Hitachi can boost the utilization rate and give those agencies the flexibility to move applications or virtual machines from one storage array to another.

With those solutions, the storage array is used as a front end and everything behind it is virtualized, Smid said. That technique can be effective in a small environment that does not have a lot of transactions that could act as a bottleneck, he said.

For large environments, companies such as EMC work with a storage-area network infrastructure. The company’s approach is to push the virtualization layer to the network, so the burden doesn't weigh on any single storage device. The SAN handles the virtualization, Smid said.

The solution also is more scalable, although it can be more expensive.

“But if you [have] utilization in the 20 to 40 percent range on your storage arrays and you can deploy something where you can double that, the cost of the virtualization infrastructure very quickly pays for itself,” Smid said.
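Smid's payback argument can be sketched with back-of-the-envelope arithmetic: at a fixed amount of usable data, doubling the utilization rate halves the raw capacity an agency must buy. The figures below (100 TB of data, a notional cost per raw terabyte) are illustrative assumptions, not vendor pricing.

```python
# Illustrative sketch of the utilization-doubling payback argument.
# All numbers are hypothetical assumptions for the sake of the math.

def raw_capacity_needed(usable_tb, utilization):
    """Raw storage you must purchase to hold usable_tb at a given utilization rate."""
    return usable_tb / utilization

usable_tb = 100            # data the agency actually needs to store (assumed)
cost_per_raw_tb = 1_000    # assumed cost of one raw terabyte, in dollars

before = raw_capacity_needed(usable_tb, 0.30)   # raw TB at 30% utilization
after = raw_capacity_needed(usable_tb, 0.60)    # raw TB at 60% utilization

savings = (before - after) * cost_per_raw_tb
print(f"Raw TB before: {before:.0f}, after: {after:.0f}, savings: ${savings:,.0f}")
```

With these assumed figures, going from 30 to 60 percent utilization cuts the raw capacity requirement from roughly 333 TB to 167 TB; the avoided purchase is the budget that offsets the virtualization infrastructure.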