Reducing Restore Times In A Disaster

It is the moment every system administrator dreads: the whole system is down.

It may be a VMware purple screen of death on multiple hosts, a dead SAN, or any number of other critical system failures. The net result is the same: the whole system is down and you need to start restoring data to backup systems fast.

Assuming you don’t have a warm standby system ready to go (which most companies don’t), your recovery is probably going to involve restoring a lot of data: email systems, SQL databases, file systems and more. The biggest and most crucial of these is likely to be the file system.

Typically this will hold years upon years’ worth of data in a mass of files, and it all needs to be restored at the same time. Even with the quickest backup solutions, restoring 1TB of data can take hours. Often the limitation isn’t the backup system itself, but rather the system the files are being restored to. Hitting the fairly common four-hour recovery time objective (RTO) is frequently impossible given the volume of data to be restored.
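A quick back-of-the-envelope calculation shows why. The throughput figure below is an assumption for illustration; in practice the target system, not the backup server, is often the bottleneck:

```python
# Rough restore-time estimate: hours needed to push a given volume of
# data at a sustained throughput. The 100 MB/s figure is an assumed
# example, not a benchmark of any particular product.

def restore_hours(data_tb: float, throughput_mb_s: float) -> float:
    """Hours to restore `data_tb` terabytes at `throughput_mb_s` MB/s."""
    data_mb = data_tb * 1_000_000  # 1 TB = 1,000,000 MB (decimal units)
    seconds = data_mb / throughput_mb_s
    return seconds / 3600

# 1 TB at a sustained 100 MB/s is already close to three hours,
# before verification and any post-restore work.
print(f"{restore_hours(1, 100):.1f} hours")
```

At that rate a single terabyte eats most of a four-hour RTO on its own, which is why restoring everything at once rarely fits the window.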

What if you could restore only the most recently used files, skipping all the older files that your users don’t immediately care about in the face of a disaster? ‘Phil’ from Sales only wants one document at that moment in time: the PowerPoint show he finished at 2am in preparation for the client presentation that afternoon. He doesn’t care about sales figures from 5 years ago or the photos from last year’s Christmas party.

There is a much smarter way to handle this situation: implement an archiving solution for your file system. A good solution will let you archive all your old files off to second-tier storage (the old NAS box in the corner of the server room, maybe?) whilst leaving stubs behind in place of the files that have been moved.
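The mechanics of how a commercial product implements its stubs aren’t covered here; on Windows they are typically shortcuts or reparse points. As a minimal sketch of the idea, the file is moved to the archive tier and a link is left in its place, so opening the stub still reaches the content (paths and names below are made up for the example):

```python
# Sketch of "move the file to second-tier storage, leave a stub behind".
# A symlink stands in for the shortcut/reparse-point stubs a real
# archiving product would create.
import shutil
from pathlib import Path

def archive_with_stub(src: Path, archive_root: Path) -> Path:
    """Move `src` into `archive_root` and leave a symlink stub in its place."""
    archive_root.mkdir(parents=True, exist_ok=True)
    dest = archive_root / src.name
    shutil.move(str(src), str(dest))  # works across filesystems (copy + delete)
    src.symlink_to(dest)              # stub: opening it follows through to the archive
    return dest
```

Users keep seeing the file in its original location, but the bytes now live on the cheaper tier.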

That way you can separate your recently used files from all the old items that no one (especially Phil at that moment) cares about, without impacting your users’ ability to access the older files if required. Once you have that separation, you can back the two sets up separately, meaning that when disaster comes calling you can restore the recently used files first and the older items later.
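The split itself comes down to a last-modified (or last-accessed) cutoff. A minimal sketch, assuming a 90-day cutoff chosen purely for illustration:

```python
# Sketch: partition a file tree into "recent" and "old" sets by
# last-modified time, so each set can feed a separate backup job.
# The 90-day cutoff is an arbitrary example value.
import time
from pathlib import Path

def split_by_age(root: Path, max_age_days: int = 90):
    """Return (recent, old) lists of files under `root`."""
    cutoff = time.time() - max_age_days * 86400
    recent, old = [], []
    for path in root.rglob("*"):
        if path.is_file() and not path.is_symlink():  # skip stubs
            (recent if path.stat().st_mtime >= cutoff else old).append(path)
    return recent, old
```

Feed the `recent` list to the backup job you restore first, and the `old` list to a separate job that can wait until the fire is out.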

There are a few products on the market that can achieve this, but the archiving product we like best is Archiver.FS from MLtek Software. They are a small UK-based business who have been developing Archiver.FS for nearly 10 years, and they now have a client base that includes the likes of Wells Fargo and GEA. What’s more, they have just announced a completely free licensing tier for their product, so it is definitely worth a look.

There is a whole host of other benefits to deploying an archiving solution, including ensuring legal compliance with the Data Protection Act (and the upcoming GDPR) as well as extending the service life of expensive first-tier storage and reducing backup costs.

So don’t let Phil down: make sure you have his PowerPoint ready for him in time for his presentation, and who knows, he may even buy you a beer to say thanks.

About JhonArthur

Hi, I’m Fenix Raw. I have been working in SEO for many years and have experience with techniques such as submissions, guest posting, backlinking, link wheels and social bookmarking, among others. You can count on my dedication, punctuality and honesty.
