In today’s dynamic, ever-changing IT landscape there is a lot of emphasis on purchasing technologies that do more with less, increase performance, and make existing approaches more efficient. Clients are turning to their trusted advisors and asking them to sift through all the stories, FUD, and hype, in the hope that their solution providers will help them architect a strategy that uses the newest technologies to increase competitiveness while reducing total cost of ownership.
The single greatest advance in this area, at least in my opinion, is server virtualization, which has helped clients consolidate siloed resources and management structures while increasing performance and availability and dramatically reducing TCO.
Another area in which massive savings have been found is the deduplication of data within an IT environment. This is a tactic employed to reduce the amount of data that resides in an environment, both on primary storage systems and in the backup stack, in an effort to reduce the strain on networks as well as the time and money spent on expensive disk technologies.
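To make the idea concrete, here is a minimal sketch of the fixed-block style of deduplication many storage systems use: data is carved into blocks, each block is fingerprinted with a cryptographic hash, and only one copy of each unique block is stored. The block size, hash choice, and function names here are illustrative assumptions, not any particular vendor's implementation.

```python
import hashlib

def dedupe_blocks(data: bytes, block_size: int = 4096):
    """Split data into fixed-size blocks, keeping one copy of each unique block.

    Returns the block store (hash -> block) and a "recipe" of hashes
    that records how to reassemble the original data.
    """
    store = {}   # hash -> the single stored copy of that block
    recipe = []  # ordered hashes needed to rebuild the original stream
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # store the block only once
        recipe.append(digest)
    return store, recipe

def reconstruct(store, recipe):
    """Rebuild the original data from the block store and recipe."""
    return b"".join(store[h] for h in recipe)
```

Highly repetitive data (think nightly full backups that barely change) collapses to a small block store plus a list of references, which is where the network and disk savings come from.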
While both of these tools can deliver massive CapEx/OpEx savings when implemented correctly, they can cause as many issues as they solve if they are not properly planned and managed throughout their life cycle.
Was That VM Ever Really Needed?
When working with clients who have been virtualized for a few years and have moved on to standardizing on virtualization for every application that supports it, the very ease of creating new services (VMs) so quickly can become a problem in itself.