How not to get stung by storage virtualization: Part 2

In Part 1 of this post, I blogged about how more and more companies are catching on to the value and importance of virtually storing their data. Storage virtualization offers the flexibility to simplify and modernize IT systems and control costs with as little disruption to a business’s data availability as possible. But while it can be a boon to a company’s ability to use available resources efficiently and cost-effectively, there are some risks.

I’ve already touched on two risks – failed implementation and challenges with interoperability. Here are three more:

Risk 3: The challenge of complexity

An important objective of virtualization is reducing and hiding the complexity associated with managing discrete devices. But while a virtual storage infrastructure benefits from a single point of logical disk and replication service management, there can still be some complications. According to informIT.com:

“…although shared storage represents a major technological advance over direct-attached storage, it has introduced its own complexity in terms of implementation and support.”

Plus:

“Storage networking is still an esoteric technology and requires expertise to design, implement and support.”

And it’s largely in design that complexity issues arise: virtualization demands new design approaches, and network-based and in-band (symmetric) implementations are often the most complex to architect.

Risk 4: Managing meta-headaches

With any implementation – whether storage virtualization or not – there need to be appropriate levels of backups and replicas to reconstruct meta-data in the event of a catastrophic failure. But meta-data management can affect performance. That’s because some implementations restrict the ability to provide certain fast update functions, like point-in-time copies and caching.

Risk 5: Performance and scalability issues

The type of virtualized storage implementation chosen directly influences performance and scalability.

Virtualization’s mapping of logical to physical addresses requires some processing power and lookup tables, and therefore adds a small increase in response time.
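To make that lookup cost concrete, here is a minimal sketch of how such a mapping might work. This is purely illustrative – the extent size, device names, and table layout are all assumptions, not any vendor’s actual implementation – but it shows the extra translation step every I/O must pay for:

```python
# Hypothetical sketch: a virtualization layer maps logical extents to
# physical (device, offset) locations through a lookup table.

EXTENT_SIZE = 1024 * 1024  # assumed 1 MiB extent granularity

# Lookup table: logical extent number -> (physical device, physical extent)
mapping = {
    0: ("array-A", 17),
    1: ("array-B", 4),
    2: ("array-A", 18),
}

def resolve(logical_offset):
    """Translate a logical byte offset into a physical device and offset."""
    extent = logical_offset // EXTENT_SIZE
    device, phys_extent = mapping[extent]  # the added lookup step
    phys_offset = phys_extent * EXTENT_SIZE + logical_offset % EXTENT_SIZE
    return device, phys_offset

# A read at logical offset 1 MiB + 100 lands on a different physical array
# than the neighbouring extent:
print(resolve(EXTENT_SIZE + 100))
```

Real controllers keep these tables in memory and cache hot entries, which is why the response-time penalty is small – but it is never zero.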

The bandwidth into and out of the meta-data lookup software also directly impacts the available system bandwidth. In asymmetric implementations, where the meta-data lookup occurs before the information is read or written, bandwidth is less of a concern. In-band, symmetric flow-through designs are directly limited by their processing power and connectivity bandwidths.

Most implementations provide some form of scale-out model, where the inclusion of additional software or device instances provides increased scalability and potentially increased bandwidth.
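One common way a scale-out model spreads the meta-data workload is to assign each volume to an instance, for example by hashing its name, so that adding an instance adds aggregate lookup capacity. The sketch below is a simplified illustration of that idea – the node names and hashing scheme are hypothetical, not how any particular product does it:

```python
# Hypothetical sketch: distribute volume meta-data ownership across
# service instances by hashing the volume name.
import hashlib

def owner(volume_name, nodes):
    """Pick the instance responsible for a volume's meta-data."""
    h = int(hashlib.sha256(volume_name.encode()).hexdigest(), 16)
    return nodes[h % len(nodes)]

instances = ["node-1", "node-2"]

# Every lookup for a given volume goes to the same instance...
print(owner("vol-42", instances))

# ...and adding a third instance redistributes some volumes,
# increasing total lookup capacity and potentially bandwidth.
print(owner("vol-42", instances + ["node-3"]))
```

Production systems use more sophisticated schemes (such as consistent hashing) to limit how much data moves when instances are added, but the scaling principle is the same.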

Beyond the risks, more rewards
The bottom line may be that, while there are some risks associated with implementing storage virtualization, the benefits of non-disruptive data migration, improved utilization and fewer points of management make it an improvement worth serious consideration for many organizations.

And besides, many of the perceived risks outlined above are increasingly being mitigated by innovative new cost-effective storage systems that offer exceptional ease of use, speed, efficiency and performance. For instance, to ensure a more secure and simple implementation, new SAN volume controller software has been designed with user interfaces that make storage provisioning easier and help users learn more quickly. And on the scalability front, individual, modular components can provide the flexibility to add capacity and functionality over time with less disruption than in the past.

For more helpful resources on storage virtualization, you can start here and here.


About Andy Thomas

With over 16 years of experience as an IBM Business Partner, Andy has worked in the open systems, server virtualization, and storage consolidation areas.