2 Principles Of Data Backup That Save $12,500 Per Hour

According to a recent survey of over 2,000 SMBs, the average cost per hour of a data center outage is $12,500 for an SMB, and up to $60,000 per hour for a mid-sized enterprise. The same report indicates that less than 20% of SMBs back up all of their data, and that 88% of businesses have lost critical data within the last two years. These figures don’t even count businesses that schedule backups but have never tested a restore.

Even among organizations that do back up, many are unsure whether their backups are valid until they face an audit or need to restore data outside of their internal backup policy.

To prepare your business for data disasters, you don’t need a master’s degree in disaster recovery. You only need to follow two basic principles:

Principle #1: Back up ALL of your data – and don’t settle for anything less
I strongly believe that effective disaster recovery is based on backing up ALL of your data. By “ALL,” I mean not settling for a backup that completed at 98% or 99%. Why? Let me ask you this: Would you leave the door to your asset warehouse 98% closed? The same logic applies to your business. Don’t wait to find out that the 1% that didn’t back up included key information, like internal communications plans, client contracts and the business files you need for uninterrupted day-to-day operations.
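A backup job’s “completed” status alone isn’t proof of 100% coverage; comparing what’s on disk against what landed in the backup is. Here’s a minimal sketch in Python of that comparison (the function names and directory layout are my own illustration, not any particular backup product’s API):

```python
import hashlib
from pathlib import Path

def manifest(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    result = {}
    for path in root.rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            result[str(path.relative_to(root))] = digest
    return result

def verify_backup(source: Path, backup: Path) -> list:
    """Return files present in source but missing or stale in backup."""
    src, bak = manifest(source), manifest(backup)
    missing = [p for p, digest in src.items() if bak.get(p) != digest]
    covered = len(src) - len(missing)
    pct = 100.0 * covered / len(src) if src else 100.0
    print(f"Backup coverage: {pct:.1f}% ({len(missing)} of {len(src)} files missing or stale)")
    return missing
```

Anything this returns is exactly the 1% you don’t want to discover during a restore.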

Principle #2: Don’t waste time and space
Are you still backing up the same information more than once? I have some news for you: back it up once. Backing up the same files day after day, or from multiple locations, wastes your time and storage resources and lengthens your backup windows. With deduplication technology readily available, there is no need to retain multiple copies of the same data.
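To see why deduplication saves space, here’s a minimal sketch of the underlying idea, content-addressed storage, where each unique blob is stored once under its hash (a simplified illustration, not a production dedup engine, which would also chunk files and handle compression):

```python
import hashlib
from pathlib import Path

def dedup_store(files: list, store: dict) -> list:
    """Add each file's contents to the store, keyed by SHA-256 digest.
    Identical content is stored only once; returns one key per file."""
    keys = []
    for f in files:
        data = Path(f).read_bytes()
        digest = hashlib.sha256(data).hexdigest()
        if digest not in store:  # only previously unseen content consumes space
            store[digest] = data
        keys.append(digest)
    return keys
```

Ten copies of the same contract across ten laptops become one stored blob and ten cheap references to it.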

Here are some recommendations for getting started with backing up 100% of your data on a daily basis.

  1. Tape-based backup: Backup software and a tape drive create a tape for both onsite and offsite storage. Although well known and proven for data retention, this technology doesn’t always meet the growing demand for greater speed and efficiency and may not keep pace as your data continues to grow.
  2. Disk-to-disk backup (D2D): This approach involves deploying an additional server or installing a Storage Area Network (SAN) to store data backups. The backups complete quickly and efficiently and restores can be done from several different points in time. Disk-to-disk backup aims to optimize virtual machines, backups and recovery by processing the required backup at the guest and image level through the host.
  3. Cloud-based backup: Backup data is transferred via a server or backup device through the Cloud (Internet) to a remote backup service provider. Cloud-based backups are almost always completed using the D2D backup method that compresses the data and transmits it efficiently. That said, the need to transport the files from remote locations lengthens restore times. It’s worth noting that a cloud-based solution might not be the right approach if data privacy is a big concern. Otherwise, with a backup cost as low as $.05/TB, a cloud-based solution is very cost efficient.
  4. Remote-server replication: In this approach, files are replicated over a WAN or Internet link to a redundant server in real time. This provides quick recovery from any problems with the production server. With the proper software, the replica server can stand in for the production server without users experiencing any interruption. Often referred to as “automatic failover” or “hot swap,” this approach ensures transparent and seamless continuity of business activities.
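The “automatic failover” idea in option 4 boils down to a health check that routes traffic to the replica when the production server stops responding. A minimal sketch (the hostnames and the health-check callable are hypothetical placeholders; real failover software also handles replication lag and split-brain scenarios):

```python
def pick_server(primary_up,
                primary="prod.example.com",
                replica="replica.example.com"):
    """Route clients to the replica when the production server fails its
    health check. `primary_up` is any zero-argument callable returning a
    bool, e.g. a TCP ping or an HTTP health endpoint probe."""
    return primary if primary_up() else replica
```

As long as the replica is current, users never notice which host answered.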

Stories about the financial impact of data loss run daily in the press. The question that’s always asked, yet never answered is this: How much money does your business stand to lose for every hour your workers sit idle, unable to deliver products or services to your customers?

When you’re ready to explore a backup plan that will keep your data and applications safe and ready to restore whenever you need, please connect with me directly.


About Florent Tastet

Florent Tastet is a Softchoice Solution Architect and a subject matter expert on emerging technologies in server and end-user computing (EUC) virtualization, data center operating systems, storage data management and application virtualization. As an IT professional and leader, his objective is to help organizations grow their IT departments with new and innovative technologies, keeping production as efficient as possible and aligning deployments with precise professional services to deliver an extraordinary customer experience.