Manage servers in your sleep.

Further reduce routine maintenance of your environment with dynamic provisioning and automation.

In the last post, I likened the virtualized application model to an organism of sorts, in which the individual components work together virtually as a dynamic pool of resources. But what happens when something isn’t working as it should?

Well, that’s where the next logical step in your organization’s virtualization evolution comes in: one central management console that understands all the pieces plugged in. In essence, it’s a single point from which you can automate issue resolution and reduce routine maintenance of your environment, what we call dynamic provisioning and automation.

For instance, say a server fails in the middle of the night. At this point in your virtualization evolution, you’ve written the rules required to ensure that your applications migrate to, or are taken over by, other functioning parts of your organism, other servers, and continue to run as before. This sort of automation ensures that you won’t have to do anything immediately. You may get notified so that first thing in the morning you can take a look at the physical environment and do some troubleshooting via your central console, but you essentially bypass the usual human remediation process and let the software handle the issue for you.
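To make that concrete, here is a minimal, self-contained sketch of the kind of failover rule a management console evaluates automatically. It is illustrative only: the classes, hosts and function names below are hypothetical stand-ins for whatever your actual platform exposes, not any real product’s API.

```python
from dataclasses import dataclass, field

@dataclass
class VM:
    name: str
    cpu: int   # vCPUs required
    mem: int   # GB required

@dataclass
class Host:
    name: str
    free_cpu: int
    free_mem: int
    healthy: bool = True
    vms: list = field(default_factory=list)

    def start_vm(self, vm: VM) -> None:
        self.vms.append(vm)
        self.free_cpu -= vm.cpu
        self.free_mem -= vm.mem

def handle_host_failure(failed: Host, cluster: list) -> list:
    """Restart a failed host's VMs on surviving hosts; report what happened."""
    report = []
    for vm in failed.vms:
        # Pick any surviving host with enough spare CPU and memory.
        target = next((h for h in cluster
                       if h.healthy and h is not failed
                       and h.free_cpu >= vm.cpu and h.free_mem >= vm.mem),
                      None)
        if target:
            target.start_vm(vm)
            report.append(f"{vm.name} restarted on {target.name}")
        else:
            report.append(f"{vm.name} still down: no spare capacity")
    return report  # in real life, this becomes the morning notification

# The rule in action: esx1 dies overnight, esx2 absorbs its workload.
cluster = [Host("esx1", 0, 0, healthy=False, vms=[VM("erp", 4, 16)]),
           Host("esx2", free_cpu=8, free_mem=32)]
print(handle_host_failure(cluster[0], cluster))
# ['erp restarted on esx2']
```

The point isn’t the code itself but the shape of it: the decision logic runs unattended, and the human only sees the report.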

The same goes for provisioning the amount of resources any given application needs to run on a server. [Read more…]
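As a rough illustration of that idea, the sketch below applies a simple threshold rule to a VM’s memory allocation. The thresholds and the 25% adjustment step are arbitrary assumptions chosen for the example, not vendor recommendations.

```python
def recommend_allocation(allocated_gb: float, avg_utilization: float) -> float:
    """Return a new memory allocation given average utilization (0.0 to 1.0)."""
    if avg_utilization > 0.85:          # running hot: add headroom
        return allocated_gb * 1.25
    if avg_utilization < 0.30:          # mostly idle: reclaim for the pool
        return allocated_gb * 0.75
    return allocated_gb                 # inside the comfort band: no change

print(recommend_allocation(16, 0.92))   # 20.0 -> grow before users notice
print(recommend_allocation(16, 0.12))   # 12.0 -> return waste to the pool
```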

Rolling out a ‘virtualize first’ rule for new IT applications.

Virtualize first: The #1 rule for new IT applications.

There are a ton of things you can’t completely predict when it comes to the growth, responsiveness and success of your IT infrastructure. But one thing you can bet on, if you’re going to keep pace with change and become or remain a highly efficient and competitive organization, is the need for new applications. Whether it’s an enterprise resource planning tool, a purchasing tool, an HR tool or anything in between, the number of applications in your data center will continue to grow.

And no matter what tool, service or application you integrate into your IT infrastructure, one rule should trump nearly everything else: it must be able to be virtualized. Because, as I’ve mentioned, if you’ve adopted a one-server-per-application model, you’re wasting a lot of resources that most organizations these days really can’t afford to waste. You’re stepping backwards in terms of your opportunity to be efficient and to maximize the return on your capital expenses over the long term.

So how do you do it? How do you make a ‘virtualize first’ rule work when you’re looking at introducing a new application or service to your IT environment? Realize that with a one-server-per-application model, you’re probably actually going to have to [Read more…]

Why buying servers for each application won’t cut it.

Still buying servers for each application? It’s time to get on board with virtualization.  

Inertia can be a powerful force. Whether it’s the inertia that comes from a lack of understanding and fear of the unknown or the kind that comes from believing something ought to be done a certain way because, heck, it’s always been done that way, so why change? Either way, inertia on the IT front is a recipe for watching your business fall behind.

The reality is, at least when it comes to virtualization, that if your organization is not on board in a very real way, and you or your organization still believe it makes sense to buy a server for every application you’re using, someone’s missing some of the most basic efficiencies that virtualization offers. For instance, you’re needlessly increasing electrical consumption out of proportion to the value of each application, and needlessly increasing the administrative burden on IT staff. That means you could be seriously overspending and under-delivering for your organization. That’s the bottom line, literally and figuratively.
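Some back-of-the-envelope math shows how quickly that waste adds up. Every number below is an assumption chosen for illustration, not a measurement, but the shape of the result holds across realistic values.

```python
import math

apps = 20                 # applications, each on its own physical server today
util_per_app = 0.10       # typical utilization of a standalone server (assumed)
watts_per_server = 400    # assumed average power draw per physical server
target_util = 0.70        # comfortable utilization ceiling per virtualization host

physical_watts = apps * watts_per_server                      # 8000 W, mostly idle

# Pack the same aggregate load onto as few hosts as the ceiling allows.
hosts_needed = math.ceil(apps * util_per_app / target_util)   # 3 hosts
virtual_watts = hosts_needed * watts_per_server               # 1200 W

print(f"{apps} dedicated servers -> {hosts_needed} virtualization hosts")
print(f"Power: {physical_watts} W -> {virtual_watts} W "
      f"({physical_watts - virtual_watts} W saved)")
```

Under these assumptions, twenty mostly idle boxes collapse into three busy ones, and the electrical bill, along with the administrative surface area, shrinks with them.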

But you might be thinking it yourself, or dealing with people in your organization who resist, telling you: “I heard that not all applications can be virtualized, so why bother? It’s just too risky.” That may be one of the most common misconceptions about virtualization, and it happens to be mostly wrong. The truth is, if due diligence is done to properly design an environment for an application, there are very few applications that cannot be virtualized. And even if there is some risk [Read more…]

Expanding virtualization into the Cloud: Is it time to pop the question?

Let’s face it. You’ve been spending a lot more time with virtualization lately. You weren’t sure how into it you were when you were first introduced. But be honest, now you’re head over heels. You like the way virtualization helps you reduce waste, outages and labor, how scalable it is, how it makes you feel when you maximize operating efficiency and performance – and, of course, the way the cost savings from virtualization make you smile.

The two of you have done a lot of exploring and now virtualization is looking for something more. There’s talk of expanding the relationship, of making it official. You’re dreaming of building your organization with virtualization and living happily ever after in your own private Cloud. But something is holding you back. Is it a fear of commitment?

You’re not alone. Some 65% of organizations are in the exploring stage, but only 30% have made the move to expand to the Cloud. Like you, many IT professionals worry about the technical challenges of taking that next step. Will you have a free hand when it comes to moving business applications into the virtual realm? How will upping your organization’s virtualization affect performance? Will you lose control as competing departments flex their muscle and demand dedicated servers? What about organizational issues, procedures, red tape?

In short, what if it’s just one big nightmare and you can’t turn back? [Read more…]

Don’t put all your eggs in the Desktop Virtualization basket

It is extremely important to avoid the pitfall of an “all-or-nothing” approach when creating a desktop delivery strategy. You want to take the time to develop an in-depth view of diverse use-cases and chart out the best course of action for each one. This is the only way to ensure delivery of an optimal desktop experience to users across the entire organization.

A survey released by Enterprise Management Associates in September found that companies with desktop virtualization projects in place or underway were almost all using more than one method of delivery, ranging from traditional terminal services to server-based applications accessed through a Web browser, according to Andi Mann, VP of research for the Boulder, Colo. consultancy. This CIO Magazine article explains the five most popular flavors of Desktop Virtualization and the advantages and disadvantages of each.

But how do you really know which method is right for which user? Or which users Desktop Virtualization isn’t a fit for at all?

Specific user groups within your organization have unique requirements, each demanding its own approach and treatment. So it’s not a one-size-fits-all solution. Softchoice has created a step-by-step guide to help you understand the stages you need to go through when determining whether Desktop Virtualization is truly the right fit for your organization. Each stage contains questions you need to ask yourself WAY before moving toward the Proof of Concept stage that so many manufacturers want you to start with. You can find this guide at isDVrightforme.com, along with a forum full of desktop virtualization advice from others who have already gone down the DV road. We figure when it comes to big technology decisions, there’s always room for a second opinion. And a third, and a fourth… so we’re hoping the value of collective wisdom will be greater than that of any one expert.

The benefits of Desktop Virtualization are many. With most of the computing happening in the data center, you get greater performance, easier manageability, tighter security and lower operating costs. But it is not a cure-all. You need to take a methodical approach when assessing how it will work in your unique environment. For us, desktop virtualization provides an opportunity to help clients effectively navigate a ‘high-risk, high-reward’ proposition – and we take that responsibility very seriously. OK, not that seriously… check out the fun video below that we put together on the topic.

https://www.youtube.com/watch?v=g3tC6nl32dw

The Advantages of Virtualizing Mission-Critical Applications

“Mission-critical” means that reliability can’t be compromised. The applications on which business users rely every day to generate revenue, service clients and keep the company operating need to be as close to bulletproof as possible. In pursuing this, many IT departments are forced to over-invest in infrastructure, constraining budgets and making other investments impossible (or at least delaying them). Application virtualization can alleviate this problem, ensuring that Tier 1 applications remain sufficiently protected without creating a disproportionate burden on your IT budget.

Quite simply, application virtualization makes sure your applications come first – not your budget.

The issue of protecting the availability and performance of mission-critical applications comes down to resources – like just about everything else in your IT infrastructure. When utilization spikes, especially unexpectedly, what is the result? [Read more…]
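As a hint at the answer, the sketch below shows the proportional-share idea most hypervisors use to protect high-priority workloads when demand exceeds capacity. The weights and numbers are generic illustrations of the principle, not any vendor’s actual scheduler, and the simplification omits redistributing capacity a capped VM leaves unclaimed.

```python
def allocate(capacity_ghz, demands, shares):
    """Split scarce CPU among VMs in proportion to their share weights."""
    total = sum(shares[vm] for vm in demands)
    grant = {vm: capacity_ghz * shares[vm] / total for vm in demands}
    # No VM is granted more than it actually demanded.
    return {vm: round(min(grant[vm], demands[vm]), 2) for vm in demands}

# A spike: aggregate demand (9 GHz) exceeds host capacity (6 GHz).
demands = {"erp": 4.0, "crm": 3.0, "test": 2.0}        # GHz requested
shares  = {"erp": 4000, "crm": 2000, "test": 1000}     # Tier 1 weighted highest
print(allocate(6.0, demands, shares))
# {'erp': 3.43, 'crm': 1.71, 'test': 0.86}: the mission-critical app wins
```

Because priority is enforced by policy rather than by dedicated hardware, the Tier 1 application stays protected without the over-investment described above.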