Hooked On Storage Part 2: In-house solutions, heterogeneity and deep dives

In Part 1, we took a look at some of the big picture challenges facing IT pros when it comes to the exponential growth in storage needs. But what about the solutions? What factors should you be looking out for as you move to a more proactive – rather than reactive – storage strategy?

Gartner, in its latest Magic Quadrant for Storage Resource Management and SAN Management Software, describes an all too common approach to dealing with the internal storage crisis:

“A common method by which many organizations manage storage and attempt to obtain SRM data has been by writing in-house software, then manually transferring and combining data among sources using spreadsheets.”

Other organizations rely on storage-related tools included with the actual hardware they buy. But, as Gartner explains, that’s where one starts running into trouble:

“…this ‘build your own’ model is not sustainable … and is far more expensive and inaccurate than most organizations realize. This is because the costs of in-house software development and support are hidden burdens that need to be accounted for. Additionally, internal SRM tool development poses risks to organizations when, for example, employees change jobs, an organization may have to redesign and rewrite undocumented and unsupported legacy systems.”

In other words, a big headache. And it’s just the first of many. [Read more…]

Hooked On Storage Part 1: A Sustainable Approach To Storage Sprawl

Urban planners have long since realized that building wider and wider highways doesn’t actually solve traffic congestion problems in the long run. Why? Because while they increase capacity in the short term, they also encourage sprawl, which in turn generates more traffic and still wider highways – and the vicious cycle continues without ever solving the underlying challenges.

What does urban sprawl have to do with storage resource management (SRM)? A lot, it turns out. Enterprises are experiencing their own sprawl as data generation jumps from half a terabyte per person per year to multiple terabytes today. Put another way: not too long ago, storage growth ran 5 to 10 percent a year; now it's 200 to 300 percent and rising. Those are a lot of bits. [Read more…]
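To put those growth rates in perspective, here is a quick back-of-the-envelope projection. The rates are the ones quoted above; the starting capacity is an illustrative assumption, not a real data point:

```python
# Back-of-the-envelope projection of storage capacity under two growth rates.
# The 10% and 200% figures come from the post above; the 100 TB starting
# point is an assumption chosen only to make the compounding visible.

def project(capacity_tb, annual_growth, years):
    """Compound a starting capacity (in TB) at a fixed annual growth rate."""
    for _ in range(years):
        capacity_tb *= 1 + annual_growth
    return capacity_tb

start = 100  # assumed starting capacity, in terabytes

old_rate = project(start, 0.10, 5)  # 10% a year, the historical rate
new_rate = project(start, 2.00, 5)  # 200% a year, today's rate

print(f"After 5 years at 10%/yr:  {old_rate:,.0f} TB")   # ~161 TB
print(f"After 5 years at 200%/yr: {new_rate:,.0f} TB")   # 24,300 TB
```

Five years of "historical" growth adds about 60 percent; five years at today's rates multiplies capacity more than two hundred times over. That gap is why reactive, buy-more-disk strategies stop working.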

Why your data center needs to be more like a condo

In a condo, tenants each have control over their own units (access, decor, furniture) but at the same time they are sharing resources such as the pool, security, gym and landscaping.

They get far more, at a fraction of the cost, than if they had to invest in those amenities on their own.

The same thing is now happening at the server, networking and storage level of the data center. Sharing hardware resources means you can do more with less, but you still maintain the security and control required at the application level.

Today’s traditional IT model suffers from resources located in different, unrelated silos—leading to low utilization, gross inefficiency, and an inability to respond quickly to changing business needs. Enterprise servers reside in one area of the data center, with network switches and storage arrays in another. Traditionally, guaranteeing application isolation has required dedicated, isolated hardware. Secure multi-tenancy enables you to partition a virtualized, shared infrastructure: data is securely isolated, and workload performance is maintained.

Today, servers, networks, and storage are commonly separated by business unit, each with its own physical racks and its own network. By deploying an Enhanced Secure Multi-Tenancy virtual IT-as-a-service, each business unit benefits from the transparency of the virtual environment, because it still “looks and feels” the same as a traditional, all-physical topology.
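As an illustration of the idea only — the class and method names below are invented for this sketch, not part of any vendor’s product or API — a secure multi-tenant pool can be pictured as one shared capacity pool with strictly partitioned per-tenant namespaces:

```python
# Toy model of secure multi-tenancy: one shared storage pool, with each
# tenant's volumes isolated in its own namespace. All names here are
# hypothetical; this illustrates the concept, not a real product API.

class SharedStoragePool:
    def __init__(self, capacity_tb):
        self.capacity_tb = capacity_tb
        self.used_tb = 0
        self._volumes = {}  # tenant -> {volume_name: size_tb}

    def provision(self, tenant, volume, size_tb):
        """Carve a volume for a tenant out of the shared pool."""
        if self.used_tb + size_tb > self.capacity_tb:
            raise RuntimeError("pool exhausted")
        self._volumes.setdefault(tenant, {})[volume] = size_tb
        self.used_tb += size_tb

    def volumes_for(self, tenant):
        """Each tenant can see only its own volumes, never a neighbor's."""
        return dict(self._volumes.get(tenant, {}))

pool = SharedStoragePool(capacity_tb=100)
pool.provision("finance", "gl-db", 20)
pool.provision("marketing", "assets", 30)

print(pool.volumes_for("finance"))  # finance sees only its own volume
print(pool.used_tb)                 # but capacity is shared: 50 TB used
```

The point of the sketch is the condo trade-off from above: capacity (the pool) is pooled and cheap, while visibility and control (the namespace) remain private to each tenant.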

If you would like to learn more about this topic, please watch my 2-minute augmented reality video. I’ll give you an overview of this new technology, from the palm of your hand. You’ll need to download the PDF symbol and ensure that your webcam is set up.


How not to get stung by storage virtualization: Part 2

In Part 1 of this post, I blogged about how more and more companies are catching on to the value of storage virtualization. It offers the flexibility to simplify and modernize IT systems and control costs with as little disruption to a business’s data availability as possible. But while it can be a boon to a company’s ability to use available resources efficiently and cost-effectively, there are some risks.

I’ve already touched on two risks – failed implementation and challenges with interoperability. Here are three more:

Risk 3: The challenge of complexity

An important objective of virtualization is reducing and hiding the complexity associated with managing discrete devices. But while a virtual storage infrastructure benefits from a single point of logical disk and replication service management, there can still be some complications. According to InformIT.com:

“…although shared storage represents a major technological advance over direct-attached storage, it has introduced its own complexity in terms of implementation and support.”

Plus: [Read more…]

Reading the radar: As we get back to business, where’s IT taking us?

We’ve learned a lot in the last 18 months. As cautious optimism surrounds economic recovery, what has changed in how organizations are harnessing technology? We assembled a group of Softchoice’s top technology thought leaders and asked them all one question:

“What’s on your radar as organizations get back to business?”

“In the storage arena it’s still all about efficiencies. Even as we pull out of the recession, storage requirements continue to grow between 20% and 50% annually, while head count stays static. The focus is on de-duplication, thin provisioning and archiving to reduce physical footprint and storage technologies that integrate management and provisioning features to help reduce the management burden.”
– Keith Baskin, Enterprise Storage Architect

“Now more than ever, to achieve competitive advantage you must understand your business and your customers. Technology gives companies an opportunity to know what’s going on in their business at all times in every aspect. It also allows you to plan and model future outcomes before you decide on them, resulting in better decisions and increased profitability. Business Intelligence and performance management software provides these abilities. Diving deep into your business makes you a more competitive organization as we get back to growth.”
– Jody Girard, Enterprise Software Architect

“It looks as though businesses are now taking the IT consolidation strategy into other areas of their IT infrastructure rather than just on the servers. So, IT groups are now being asked to take all of the multiple solutions they have for managing their environments and come up with more consolidated management platforms. Microsoft has really provided their customers with a strong play in this effort with renewed investments under the System Center umbrella. Businesses are able to take advantage of this single pane of glass to deploy, monitor, patch/update, backup/recover, and provide service/incident management with the System Center suite.”
– Mark Wall, Enterprise Architect-Microsoft Solutions

“Now is one of the best times to get your IT house in order. Many of our customers are pondering the financial advantages of cloud computing while balancing that against the challenges it creates, namely security. Internal Cloud has to provide for ubiquitous access to a multitude of devices which are probably not inside the corporate cloud. Hybrid and public clouds carry the same challenges but “data in motion” and “data ownership” must be top of mind when working with third parties. Bottom line: the Cloud can make a lot of sense. It frees up capital from IT budgets to help with a customer’s core business, but if done wrong it could cost more than it is worth.”
– Frank Ball, Director – Communications Infrastructure Technologies