Enabling BYOD Through Mobile Apps


You’ve decided to open up your organization to BYOD — so what’s next? Along with mobile devices comes the need for mobile apps.

One question constantly arises: which of our business applications should be made available on a mobile platform? While the apps employees use every day (like CRM) are solid contenders, not every application will – or should – make the jump to mobile. Think about the practicality of filling out a timesheet spreadsheet on a smartphone. [Read more…]

Solving the Itanium saga

In what’s shaped up as something of a cliffhanger episode of The Housewives of Silicon Valley, Oracle’s decision in March to end software development for Intel’s Itanium platform still has everyone in the industry hot and bothered – and not in a good way.

No one – except maybe Oracle – is exactly sure why it dropped support for Itanium. Some believe it’s simply an economic decision – the number of Itanium-based servers sold isn’t high enough for Oracle to justify spending resources on supporting versions of its software on the processor. Others with a more conspiratorial bent see the move by Oracle as an attack on HP’s high-end server business in favor of its own SPARC/Solaris servers, which it acquired last year from Sun Microsystems. As for Oracle itself, it simply says it’s focusing on Intel’s x86 processor line and that Itanium was nearing the end of its life.

To be continued….

Whatever the reason, Oracle promises to continue providing customers with support for existing software running on Itanium. But support isn’t development, and customers will likely grow increasingly irritated as Oracle rolls out new applications and database features that aren’t available on Itanium – not to mention concerned about security alerts, data fixes and critical patch updates. In other words, Oracle’s suspension of future development raises investment risk, and HP-UX users will need to re-evaluate their platform and life-cycle support options.

Options, options, options.

What to do? First off, organizations running Oracle on Itanium that plan to stay there for several years should upgrade to Oracle Database 11g Release 2 as soon as possible to maximize the time available under Premium Support – officially until 2015. Customers running Oracle on Itanium that plan to move shortly to a new Itanium platform need to [Read more…]

Virtualization in Linux environments: The Penguin wants to play.

While it’s true almost everyone is gung-ho about the benefits of virtualization these days, many organizations still miss out on its greatest benefits – increased server utilization and consolidated workloads, lower energy costs, greater flexibility and easier system management.

Why? Maybe because they view virtualization as a magic bullet for solving all their hardware and computing problems. It’s not. Whether you’re virtualizing a Linux environment or any other, there are steps to take, best practices to follow and pitfalls to be wary of to ensure implementation and ongoing management run smoothly.

Linux.com’s Joe ‘Zonker’ Brockmeier offers a wealth of common-sense suggestions – ones that are often overlooked or taken for granted – here and here to help you virtualize successfully in Linux environments:

Define your goal, perform an inventory and set up a roadmap: Understand what you want to accomplish, have a well-defined set of goals, identify hardware that will be freed up or phased out, and come up with a detailed requirements document that outlines the hardware you’ll need, as well as storage, management and possible solutions. A lot of this may elicit a ‘duh,’ but issues later can often be traced back to insufficient planning and goal-setting early on.

Beware of “virtual sprawl”: Just because it seems easy to deploy a virtual machine doesn’t mean you should. “It’s important to manage virtual machines as if they were [Read more…]
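The inventory step above can be as simple as a structured list of hosts and their utilization, from which consolidation candidates fall out automatically. Here is a minimal sketch of that idea; the hostnames, core counts, utilization figures and the 15 percent threshold are all hypothetical, stand-ins for whatever your monitoring tools actually report.

```python
# A minimal inventory sketch: record each physical host's capacity and
# average utilization, then flag underused hosts as candidates to
# virtualize and consolidate. All values here are illustrative.
servers = [
    {"host": "db01",   "cores": 8, "avg_cpu_pct": 72},
    {"host": "web03",  "cores": 4, "avg_cpu_pct": 9},
    {"host": "file02", "cores": 4, "avg_cpu_pct": 6},
]

def consolidation_candidates(inventory, cpu_threshold_pct=15):
    """Hosts running well below capacity are candidates for consolidation."""
    return [s["host"] for s in inventory if s["avg_cpu_pct"] < cpu_threshold_pct]

print(consolidation_candidates(servers))  # web03 and file02 fall below 15%
```

Feeding a list like this into your requirements document makes the “hardware that will be freed up or phased out” item concrete rather than aspirational.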

Get in Front of Growing Data

You’re storing more data than you need to. Cut the excess in your infrastructure, and you can benefit from efficiency gains, cost savings and budget that can be redeployed to high impact IT initiatives. All it takes is a little data deduplication. Lower the amount of data you store without sacrificing what you need, and maintaining the status quo becomes less expensive and more efficient.

Your organization generates a lot of data, yet not all of it needs to be stored. There are redundancies in systems throughout your data center, and by identifying them and getting rid of the duplicates, you can cut your storage footprint, invest less in equipment and streamline your datacenter operations. Ultimately, the cost savings can be redeployed to other IT initiatives, particularly if they come with a compelling ROI proposition.

Using a data deduplication solution, you can reduce the amount of data you store by up to 60X. That translates to backup times up to 90 percent faster and bandwidth consumption cut by up to 98 percent. Quite simply, deduplication delivers a clearly defined storage management advantage – results you can see and measure.
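To make the ratio concrete, here is the back-of-envelope math behind a deduplication claim like 60X. The 120 TB raw figure is an illustrative assumption, not a measured value; only the formula is the point.

```python
# Back-of-envelope deduplication math. The raw size below is a
# hypothetical example; the 60X ratio comes from the vendor claim.
def dedup_savings(raw_tb, dedup_ratio):
    """Return (stored_tb, percent_reduction) for a given dedup ratio."""
    stored = raw_tb / dedup_ratio
    reduction = (1 - stored / raw_tb) * 100
    return stored, reduction

stored, reduction = dedup_savings(raw_tb=120.0, dedup_ratio=60.0)
print(f"{stored:.1f} TB stored ({reduction:.1f}% reduction)")
```

Note that even far more modest real-world ratios change the purchasing picture: a 10X ratio still means storing a tenth of the raw data.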

As your company continues to generate data that needs to be stored, deduplication lets you make room in your existing storage infrastructure rather than purchase new equipment. Imminent storage purchases, consequently, can be deferred.

Few IT investments deliver the sort of ROI that you can realize with data deduplication as part of a virtualized storage infrastructure. The benefits are tangible and quickly realized. Get rid of the extra data that you’re paying to store, and the cost to operate your IT environment – and power the business – drops substantially. At the same time, you’re extending the value of your existing storage environment well into the future.

Data deduplication isn’t just a technology decision – it’s a financial one. Implement a deduplication solution, and you’ll succeed from both perspectives.

Prioritize Your Data for Compliance

Not all tiers of storage are equal, especially when it comes to compliance. Optimize your storage architecture for regulatory obligations such as Sarbanes-Oxley, HIPAA and PCI, and you can recoup your compliance spend, make audits less painful and redeploy the savings to projects that have ROI potential. As with every aspect of your compliance efforts, it pays to have a plan.

When the auditors come knocking, of course, you need to have all your data in order. But, it doesn’t all have to be immediately available. Instead of investing heavily in a storage architecture that treats seven-year-old data like that generated only a few weeks ago, you should prioritize based on frequency of need. As long as you can reach the information you need, you’ll be able to satisfy the requirements dictated by the regulations with which your company has to comply.

Newer data should be stored for easy and rapid retrieval. Beyond compliance tasks, other business needs (e.g., customer care activity, order processing) make quick access a priority. Older data, on the other hand, can be stored on less expensive equipment designed for archiving, since it is used infrequently and rarely needs to be retrieved fast.
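An age-based tiering policy like the one described can be sketched in a few lines. The cutoffs below (90 days on primary storage, two years on nearline, archive thereafter) are hypothetical; your actual thresholds come from the retention rules in the regulations you answer to.

```python
from datetime import date, timedelta

# A sketch of age-based storage tiering. Tier names and cutoffs are
# illustrative assumptions, not prescriptions from any regulation.
def storage_tier(created, today):
    age = today - created
    if age <= timedelta(days=90):
        return "primary"    # fast retrieval for day-to-day business use
    elif age <= timedelta(days=365 * 2):
        return "nearline"   # slower, cheaper disk
    else:
        return "archive"    # lowest-cost storage, retained for compliance

print(storage_tier(date(2011, 5, 1), today=date(2011, 6, 1)))  # primary
print(storage_tier(date(2004, 6, 1), today=date(2011, 6, 1)))  # archive
```

The seven-year-old record in the second call lands on archive equipment, which is exactly the point: it still satisfies the auditors, but it no longer occupies your most expensive tier.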

So, how does this turn into an ROI opportunity?

[Read more…]

How to Handle License Audits in a Virtualized Data Center

The old rules no longer apply. In the past, you knew how many processors would be committed to a particular application, and you made your software deals accordingly. Now, you face an ambiguous standard. What happens if you are undergoing a license audit during a period of peak demand?

Every IT innovation brings with it a necessary change in perspective … which tends to lag the innovation itself. In the case of server virtualization, enterprise application licensing has become the subject of disruption. Traditional licensing models don’t lend themselves to the flexibility of the dynamic data center. While there are no best practices on this in the market yet, you can be aware of the issue and work through remedies with your software providers.

In a physical infrastructure, per-processor licensing makes sense. The number is fixed, which makes the transaction relatively straightforward. This convenience is not available in a virtualized environment. Utilization may be fairly contained and predictable most of the time, but an unexpected spike in access – or even periods of anticipated peak demand – turns the model on its head. Going through a licensing audit when utilization is at its highest can result in an inaccurate view of general usage (and demands for additional licenses). [Read more…]
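The gap between typical and peak exposure is easy to quantify. The sketch below assumes a simple two-cores-per-license model and hypothetical vCPU counts; real vendor terms (core factors, sub-capacity rules) vary and should be checked against your actual agreements.

```python
import math

# Illustrative only: compares per-processor license counts if an audit
# snapshots average vs. peak vCPU allocation. The cores-per-license
# figure and vCPU counts are hypothetical, not any vendor's terms.
def licenses_needed(vcpus, cores_per_license=2):
    return math.ceil(vcpus / cores_per_license)

avg_vcpus, peak_vcpus = 24, 64
print(licenses_needed(avg_vcpus))   # 12 licenses at typical load
print(licenses_needed(peak_vcpus))  # 32 licenses if audited at peak
```

Under these assumed numbers, an audit that lands during a demand spike would suggest nearly three times the license count your steady-state usage justifies, which is the negotiating point to raise with your software providers.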