Feeling The Pressure Of Big Data?


Over the years, data centers have become fragmented: numerous types of proprietary software live in silos inside specialized hardware components, making them complex and frustrating to manage. Today, virtualization helps absorb and minimize this challenge, but it creates another: server and application sprawl driven by explosive data growth.

Server and application sprawl will cost you

A sprawl of uncontrolled and poorly managed application deployments leads to application unavailability that endangers an organization’s profitability. Examples continue to sprout up all around us – remember the Amazon data center failure? I always come back to an article I once read that states:

“As business becomes increasingly dependent on technology and information, availability is a universal concern for every business, in every industry…And globalization means there are no more periods of ‘acceptable’ downtime. At any time of the day or night, somewhere in the world, customers and vendors need access to your corporate information. If they can’t get it, they’ll go elsewhere – creating an opportunity for your competition.”

David M. Fishman, Sun Microsystems, Application Availability: An Approach to Measurement

I was young, and I have to admit that it touched me and created a sense of urgency. Maybe this is why I am so passionate about what I do today. Considering the problem of server and application sprawl, automation and ease of management are no longer a ‘nice to have’ – they are a MUST. With this in mind, where do you start?

Today’s Data Center: a collection of bad IT decisions

All too often, when I walk into a customer’s data center, it reflects a collection of bad IT decisions. Too many static apps, too many servers and multiple data silos endanger the profitability of an organization that is asking for lean, streamlined management capabilities. This is certainly NOT the conversation we wish to have with the C-level – at least I don’t!

Do you dream that your data, applications, Business Continuity and Disaster Recovery (BCDR) plan and data management are all part of a single-minded strategy? So much so that the strategy makes up your ‘normal’ day-to-day data center operations – allowing you to focus on the core applications you need to ensure the growth of your organization?

For some, it seems too good to be true.

Sustaining growth in a siloed environment

Even with the issues mentioned above, we are all magicians of a sort: somehow, IT folks sustain organizational growth while controlling costs year over year. I believe all of the pieces are available out there, and it is up to us to define where the information lives! There are many different approaches to solving this, but we must agree on one thing: with or without a single-minded data management strategy, organizations must build or buy mission-critical applications that streamline business processes.

The pressure of big data

Big data, also known as the ‘information explosion,’ is the effect of a rapid increase in data that hits medium and large organizations hard. As the amount of available data grows, it presses on data silos, and the problem of sorting, storing and handling all of this information becomes increasingly difficult. Bottom line: this is why data management solutions should be a core component of your BCDR strategy.

Do we hope this pressure will stop? Yes. But will it? Get ready, friends: the world is expecting a data explosion of 130 exabytes per year. According to the Cisco Visual Networking Index (VNI) Global Mobile Data Traffic Forecast for 2011 to 2016, worldwide mobile data traffic will increase 18-fold over five years, reaching 10.8 exabytes per month – an annual run rate of 130 exabytes – by 2016.
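A quick back-of-the-envelope check ties these figures together (my own sketch, not part of Cisco’s report): 10.8 exabytes a month works out to roughly 130 exabytes a year, and an 18-fold jump over five years implies compound growth of nearly 80% per year.

```python
# Sanity-checking the Cisco VNI figures quoted above (illustrative only).

monthly_eb_2016 = 10.8                 # exabytes per month by 2016
annual_run_rate = monthly_eb_2016 * 12
print(round(annual_run_rate, 1))       # -> 129.6, i.e. the ~130 EB quoted

# An 18-fold increase over five years implies a compound annual
# growth rate of 18**(1/5) - 1.
cagr = 18 ** (1 / 5) - 1
print(round(cagr * 100, 1))            # -> 78.3 (percent per year)
```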

Hold on a minute…2016? That’s in 3 years.

For example, Softchoice recently completed a data management engagement for a large Canadian customer that experienced rapid data growth and felt unable to maintain control of how its IT environment was handling the data. Why did this happen? The consumerization of IT creates unstructured data, and unstructured data represents on average 80% of all data in the data center, leaving 20% for structured data.

In this case, since the available storage capacity per disk had increased, IT simply added more disks to absorb data growth and save money. However, the added capacity was a reactive response to the growth problem. IT must maintain control of what, where, when and how incoming data is stored in the data center – at all times. With this in mind, IT must proactively search for holistic data center solutions to control and manage the growth of unstructured data.

How fast should we expect data to grow within a single organization? Gartner Director April Adams reported in 2012 that data capacity in enterprises grows, on average, at 40 to 60 percent year over year. That is much more than the 30% year over year we were used to, don’t you think?
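To put that difference in perspective, here is a short sketch (my own illustration, not Gartner’s math) of how quickly stored data doubles at each growth rate, using the standard doubling-time formula ln(2) / ln(1 + r):

```python
import math

def doubling_time(rate: float) -> float:
    """Years for a quantity to double at a given year-over-year growth rate."""
    return math.log(2) / math.log(1 + rate)

for rate in (0.30, 0.40, 0.60):
    print(f"{rate:.0%} YoY -> doubles in {doubling_time(rate):.1f} years")
# At 30% data doubles in ~2.6 years; at 40% in ~2.1; at 60% in ~1.5.
```

In other words, moving from 30% to 60% annual growth nearly halves the time before your storage footprint doubles – which is exactly why a reactive “just add disks” approach stops scaling.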

Unstructured and structured data are the lifeblood of every organization. I have seen more and more customers turn to Softchoice, struggling with the increasing amount of data stored in their data centers and asking us how to manage the exponential growth of data and the consumerization of IT.

The bottom line: data management is critical

While the ability to store increasing amounts of data empowers organizations, it also presents them with the challenge of managing all of that information.

I like to call this activity “balancing the data center,” and I believe achieving it successfully requires a precise combination of top-notch technologies and a strong roadmap that integrates well with your unique business processes. Bottom line: we want a solution that is capable of automatically adapting to a changing environment, is easy to manage through policies, and relies on equipment from top manufacturers in the industry.

Want to know more? Are you interested in understanding what Softchoice does differently and how we do it? What issues are you facing? Reach out, ask a question, leave a comment! Get prepared with the people who see this every day and are continuously watching it. We surely don’t want to get caught in the wave of data.


About Florent Tastet

Florent Tastet is a Softchoice Solution Architect. He is a subject matter expert on emerging technologies in server and EUC virtualization, data center operating systems, storage data management and application virtualization. As an IT professional and leader, his objective is to help organizations grow their IT departments with new and innovative technologies, keeping production at its most efficient level and ensuring the right alignment in the deployment of those technologies through precise professional services – resulting in an extraordinary customer experience.