Juggling storage challenges with unified management: How to avoid dropping the ball

I don’t know about you, but I find juggling one ball hard, let alone three or 43. But keeping all those balls from crashing down around you is a bit like the challenge organizations face as they try to store and manage ever-increasing volumes of data.

And I do mean ever-increasing. Because Great Recession or not, data growth has continued unabated – thanks to the digitization of infrastructures worldwide, the need to keep more copies of data for longer periods and the rapid increase in distributed data sources.

When it comes to managing this tidal wave of data, there is no shortage of products and approaches to choose from. But most of these more traditional offerings have unfortunately not kept pace with the many new and complex requirements of storage, nor do they address the need for a single management perspective. [Read more…]

What is it that you want to protect from data loss the most?

This can be a very difficult question, and one that has created many products and solutions, inside and outside of IT (think insurance). Other than people, the most critical asset most organizations have is their information. If it were not for that information, we wouldn’t need all of the switches, routers, servers and storage. If we aren’t protecting the data we value most as a starting point, then what are we doing?

DLP (Data Loss Prevention) has been one of the dirtiest words of the last ten years. It may even be considered worse than cloud, at least in security circles. What made the term so unpalatable is that it somewhat implies that, without these solutions branded Data Loss/Leakage Prevention, we are losing and leaking data. It implies that these solutions are the silver bullet, the end-all, be-all. Obviously marketing gone wild. There is definitely some merit in it, though. Applied correctly alongside other technologies, DLP can actually provide a fantastic last line of defense, a role that used to belong to endpoint anti-virus.

If attack traffic got through the firewall, then through the Network Intrusion Prevention System, then hopefully the endpoint anti-virus software would pick it up. But what happens when it doesn’t? What happens when the system has been compromised and goes undetected for a while? This is where data starts to get pulled out of the network, or exfiltrated.
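
As a rough illustration of that last line of defense, here is a minimal sketch of the kind of content inspection a network DLP sensor might apply to outbound traffic. This is hypothetical Python, not any particular vendor’s product, and the patterns are simplistic examples; real DLP suites layer fingerprinting, exact data matching and policy engines on top of this idea.

```python
import re

# Hypothetical patterns a DLP egress filter might watch for.
# Real products use far richer detection than simple regexes.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def inspect_outbound(payload: str) -> list[str]:
    """Return the names of any sensitive patterns found in an outbound payload."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(payload)]

def allow_transfer(payload: str) -> bool:
    """Block (or at least flag) transfers that appear to contain sensitive data."""
    hits = inspect_outbound(payload)
    if hits:
        print(f"DLP alert: outbound data matched {hits}; blocking transfer")
        return False
    return True

if __name__ == "__main__":
    allow_transfer("Quarterly report attached, nothing unusual.")    # allowed
    allow_transfer("Customer SSN 123-45-6789 included for billing")  # blocked
```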

Assuming there is a compromise, let’s delve into the solutions that make up a DLP strategy and provide some examples of when each of them is used. Ideally you’ll find which of the following solutions fits best in your environment today. [Read more…]

Bigger, Faster and “More Efficient” Doesn’t Always Mean Better

In today’s dynamic and ever-changing IT landscape there is a lot of emphasis on purchasing technologies that do more with less, increase performance and make existing approaches more efficient. Clients are turning to their trusted advisors and asking them to sift through all the stories, FUD and hype, in the hope that their solution providers will help them architect a strategy that uses the newest technologies to increase competitiveness, all while reducing total cost of ownership.

The single greatest advance in this area, at least in my opinion, is the virtualization of servers, which has helped clients consolidate siloed resources and management structures while increasing performance and availability and reducing TCO in massive ways.

Another area in which massive savings have been found is the de-duplication of data within an IT environment. This is a tactic employed to reduce the amount of data that resides in an environment, both on primary storage systems and in the backup stack, in an effort to reduce the strain on networks as well as the time and money spent on expensive disk technologies.
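
To make the idea concrete, here is a minimal sketch of how block-level de-duplication typically works, in illustrative Python rather than any specific array’s implementation: data is split into fixed-size chunks, each chunk is hashed, and only chunks with previously unseen hashes are physically stored. Production systems add variable-size chunking, reference counting, compression and robust metadata on top of this.

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks; real systems often use variable-size chunking

class DedupeStore:
    """Toy chunk store that keeps only one copy of each unique chunk."""

    def __init__(self):
        self.chunks = {}          # hash -> chunk bytes (the single stored copy)
        self.logical_bytes = 0    # what clients think they wrote
        self.physical_bytes = 0   # what actually landed on disk

    def write(self, data: bytes):
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            digest = hashlib.sha256(chunk).hexdigest()
            self.logical_bytes += len(chunk)
            if digest not in self.chunks:   # new chunk: store it
                self.chunks[digest] = chunk
                self.physical_bytes += len(chunk)
            # duplicate chunk: only a reference would be recorded

    def ratio(self) -> float:
        return self.logical_bytes / max(self.physical_bytes, 1)

store = DedupeStore()
store.write(b"A" * 8192)   # two identical chunks -> stored once
store.write(b"A" * 4096)   # duplicate of an existing chunk
print(f"dedupe ratio: {store.ratio():.1f}:1")
```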

While both of these tools can provide massive savings in capex/opex to clients when implemented in the right way, they can also cause as many issues as they solve if not properly thought out and managed through their life cycle.

Was That VM Ever Really Needed??

When working with clients who have been virtualized for a few years and have moved on to standardizing the virtualization of every application that is supported in a virtualized state, the very ease of creating new services (VMs) so quickly can be an issue in itself. [Read more…]

Virtual storage capacity management: an admin’s worst nightmare?

Most hear “server virtualization” and think: efficiency, ease of management, high availability and flexibility. But these benefits – the aim of sound IT planning – really only extend to the server (and in some cases application) layer. Administration, it turns out, is a whole other kettle of fish.

That’s because the complexities of introducing server virtualization into an environment force administrators to spend far more time than in the past planning the overall capacity requirements of the environment and how to lay down data, to ensure that the benefits virtualization brings to servers aren’t offset by problems in the storage environment.

Here are the three most common technology features created to help alleviate this pain point – as well as some of their pitfalls:

Thin Provisioning: Thin provisioning allows administrators to show an OS/app/hypervisor an amount of storage it can grow into, without actually allocating the physical space on the SAN/NAS. The SAN/NAS allocates storage only as data is written to it, so administrators can spend far less time planning and only need to purchase and allocate what’s needed now, rather than what might be needed in six or 12 months.
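
The allocation-on-write behaviour is easier to see in a toy model. The sketch below is illustrative Python, not any vendor’s API: a large logical volume is presented to the host while physical blocks are drawn from a shared pool only as data is written, which is also why over-subscription needs watching.

```python
class ThinPool:
    """Toy model of a thin-provisioned storage pool."""

    def __init__(self, physical_gb: int):
        self.physical_gb = physical_gb   # what actually exists on the SAN/NAS
        self.allocated_gb = 0            # physical space consumed by writes
        self.provisioned_gb = 0          # capacity promised to hosts

    def create_volume(self, size_gb: int) -> "ThinVolume":
        # The host immediately sees size_gb, but no physical space is used yet.
        self.provisioned_gb += size_gb
        return ThinVolume(self, size_gb)

    def oversubscription(self) -> float:
        return self.provisioned_gb / self.physical_gb

class ThinVolume:
    def __init__(self, pool: ThinPool, size_gb: int):
        self.pool, self.size_gb, self.used_gb = pool, size_gb, 0

    def write(self, gb: int):
        # Physical blocks are allocated only when data is actually written.
        if self.pool.allocated_gb + gb > self.pool.physical_gb:
            raise RuntimeError("Pool exhausted: over-subscription caught up with us")
        self.used_gb += gb
        self.pool.allocated_gb += gb

pool = ThinPool(physical_gb=10_000)
vm_datastore = pool.create_volume(8_000)   # hosts see 8 TB immediately
vm_datastore.write(1_500)                  # only 1.5 TB physically allocated
print(f"oversubscribed {pool.oversubscription():.1f}x, "
      f"{pool.allocated_gb} GB of {pool.physical_gb} GB in use")
```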

While thin provisioning provides a lot of value – extending the lifespan of existing capacity and reducing the number of tasks needed to manage virtual machines and data – it also causes issues. [Read more…]

Ding, dong, the tape is dead: new storage systems handle both backup and archive

It was clear when DVDs hit the market back in the 1990s that they offered a richer movie viewing experience than VHS tapes. But for a time, tapes and VCRs were cheaper so we had to wait for the cost of DVD players to come down – and for the word to spread – before the new technology overtook the old.

It’s been a bit more complicated for IT departments to decide whether or when to switch from tape to disk for their data backup and archiving strategy, partly because, historically, tape was higher density and cheaper than disk, and because tape was considered safer against loss, corruption or disaster. But using tape often also meant dealing with poor recovery times and keeping dozens of copies of the same document. Disks, for their part, got denser, faster and less expensive. Not to mention that new disks could usually be added through cheap storage arrays or servers, while more tape usually meant more tape infrastructure – more towers, more robots and more tape drives. Still, in the mid-2000s, with all the advances working in disks’ favor, some believed that tape was undergoing a renaissance, pulling ahead again thanks to disk’s capacity limitations. The result? A cold-war truce of sorts between the two technologies – long-term archiving to tape, shorter-term backups to disk. [Read more…]

Mixed storage too much to handle?

For some time now, growing organizations have responded to exponential data growth and the need for more storage capacity for their different applications by simply buying new disk storage systems. An additional server here, a storage area network there. It worked well enough in the short term, but because organizations – particularly those in the healthcare, retail and government sectors – were essentially building separate and dedicated storage islands, data management complexity increased and so did costs. Not to mention the resource constraints – both physical storage resources and human resources – IT organizations face if this complexity and growth remain unchecked.

Among the many pitfalls and challenges related to these heterogeneous storage environments:

1. Poor storage utilization rates across separate SANs (see the sketch after this list)
2. Increased time to administer the storage
3. Expensive snapshot/replication licenses that must be purchased for multiple SANs
4. Difficulty migrating data between storage systems
5. Increased energy and floor space costs
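
A rough back-of-the-envelope look at the first pitfall, sketched in Python with made-up numbers: each island may have plenty of free space, but because the silos can’t share it, no single array can absorb a new workload that the combined free capacity could easily hold.

```python
# Hypothetical utilization figures for three siloed arrays (TB).
arrays = {
    "healthcare_san": {"capacity": 100, "used": 62},
    "retail_san":     {"capacity": 80,  "used": 55},
    "gov_nas":        {"capacity": 60,  "used": 31},
}

new_workload_tb = 40
total_free = sum(a["capacity"] - a["used"] for a in arrays.values())

for name, a in arrays.items():
    free = a["capacity"] - a["used"]
    print(f"{name}: {a['used']}/{a['capacity']} TB used, {free} TB free")

print(f"Combined free capacity: {total_free} TB")
fits_somewhere = any(a["capacity"] - a["used"] >= new_workload_tb
                     for a in arrays.values())
print(f"Can any single island hold a {new_workload_tb} TB workload? {fits_somewhere}")
# Pooled (virtualized) storage could place the workload; the separate silos cannot.
```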

That’s where storage virtualization comes in – [Read more…]