Don’t let data recovery times keep profits down

You’re an IT decision maker, and business continuity (BC) is an important component of your IT infrastructure. You understand that accidental or malicious data loss, unplanned system outages, user error, hardware theft or failure, power failure, software failure, fire, flood, earthquake, landslide, hurricane, tidal wave or tornado can blow your company’s data into oblivion.

Have you considered refreshing your backup architecture and processes with short recovery windows as the primary objective?

[Read more…]

Where Did My Data Go?

As we find newer, faster and more efficient ways to store, access and manipulate data, we can’t seem to keep up with the growth of the data itself. Worse still, we struggle to find ways to protect that data from being lost in the abyss.

Backups exist for one purpose (no, it’s not to give your storage admin a nightly headache): to restore data when it disappears. That can happen in many ways, but whether it’s accidental user deletion, data corruption, failed disks, a power outage or a natural disaster, the result is the same… users scream, “Where did my data go?!”

Many companies have complex backup schedules that use technologies such as disk staging, data de-duplication, virtual tape libraries, and physical tape libraries. But if the data itself can’t be restored, what good are the underlying technologies? Not much at all.

Many of the organizations I talk to focus all their attention on the “backup” process, but very few ever want to discuss the “restore” process. They spend thousands of dollars on nifty software that supports things like:

  • Data De-duplication – The ability to reduce data sets by storing only one copy of each block of data or file (see the first sketch after this list)
  • Object Consolidation – The ability to amalgamate data sets from different dates into one “synthetic/virtual” backup job, which allows an “incremental forever” policy (see the second sketch after this list)
  • Granular Recovery Functions – Very important in virtual environments, as this allows administrators to recover full VM hosts, VMs within a host, folders attached to a VM, or even single files within a VM folder
  • Zero Downtime Backup – The ability to integrate onsite storage arrays with the application and backup stacks to provide fully application-consistent backups through array snapshot technology
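
To make the de-duplication bullet concrete, here is a minimal, hypothetical Python sketch of block-level de-duplication: each block is identified by its hash, identical blocks are stored only once, and each “file” keeps just an ordered list of block references. Real products add variable-length chunking, compression and integrity checks; the names (DedupStore, BLOCK_SIZE) are illustrative assumptions, not any vendor’s API.

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative fixed block size; real products often chunk variably


class DedupStore:
    """Toy block-level deduplicating store: unique blocks are kept once,
    and every file is recorded as an ordered list of block hashes."""

    def __init__(self):
        self.blocks = {}    # sha256 hex digest -> block bytes
        self.recipes = {}   # file name -> ordered list of digests

    def backup(self, name, data):
        digests = []
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            self.blocks.setdefault(digest, block)  # store only unseen blocks
            digests.append(digest)
        self.recipes[name] = digests

    def restore(self, name):
        # Reassemble the file from its recipe -- the part users actually care about
        return b"".join(self.blocks[d] for d in self.recipes[name])


store = DedupStore()
store.backup("monday.db", b"A" * 8192 + b"B" * 4096)
store.backup("tuesday.db", b"A" * 8192 + b"C" * 4096)  # shares blocks with Monday
assert store.restore("monday.db") == b"A" * 8192 + b"B" * 4096
print(f"unique blocks stored: {len(store.blocks)}")     # 3 stored, not 6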
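And for the object-consolidation bullet, a similarly hedged sketch of the principle: incrementals are replayed over the last full copy to build a “synthetic full,” which is what lets a product run incrementals forever. Deletions and metadata are ignored here, and the data structures are purely illustrative.

```python
# Illustrative object consolidation: one full backup plus a chain of incrementals
# is merged into a single "synthetic full", so only incrementals need to run again.

full_backup = {"/db/a": "v1", "/db/b": "v1", "/logs/x": "v1"}   # Sunday full
incrementals = [
    {"/db/a": "v2"},                                            # Monday: a changed
    {"/db/b": "v2", "/logs/y": "v1"},                           # Tuesday: b changed, y added
]


def consolidate(full, increments):
    """Replay incrementals over the full backup to build a synthetic full."""
    synthetic = dict(full)
    for inc in increments:
        synthetic.update(inc)  # newer versions win
    return synthetic


synthetic_full = consolidate(full_backup, incrementals)
print(synthetic_full)
# {'/db/a': 'v2', '/db/b': 'v2', '/logs/x': 'v1', '/logs/y': 'v1'}
```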

All these tools help clients reduce backup windows and add flexibility, speed and even granularity to their backups. They also increase automation and reduce user intervention. So isn’t technology a wonderful thing? And haven’t backups come so far over the years? The short answer is YES. But unless you can restore that data successfully [Read more…]

Data security: How to send hackers packing

In December 2006, TJX – the company that owns retailers TJMaxx and Marshalls in the US, and Winners and HomeSense in Canada – found suspicious software on its computer systems. Three months later, TJX admitted that a computer security breach had occurred and that more than 45 million of its shoppers’ credit cards had been compromised.

The crew of hackers responsible was eventually caught. Still, an eight-month investigation by the Canadian government – TJX owns stores in Canada and so [Read more…]