I came across an interesting story on ComputerWorld today. It discusses the recovery of lost files in one of those “emergency” situations. The author recommends defragmenting and walks through various file recovery tools in a modern-day spin on the “damsel in distress” drama. Reading the article, you really feel the anguish the author went through; it's not something I'd ever want to experience. As noted in the article, the system on which the events took place was running Windows XP.

Windows Vista (Business, Enterprise, and Ultimate editions), to its credit, does include a type of file backup solution built on a technology called Shadow Copy. Just as you would use your camera to take a snapshot of a visual image, Vista takes snapshots of your data. More specifically, it takes point-in-time copies of your files so you can roll back to an earlier version. While this can definitely help in disaster circumstances like those described in the article, it isn't as effective as Undelete, because Undelete is “event-based”: it captures EVERY change, not just changes on an occasional time basis. Snapshot-style methods expose you to data loss in the gaps between snapshots. With Undelete pre-installed, the entire ordeal of restoring the lost photos depicted in the story would almost certainly have been averted.

The real key to Undelete is that it is “data protection” more so than “file recovery”. In the disaster described in the article, a data protection technology would never have exposed the digital photo files to the possibility of being overwritten. Yes, Undelete does have some emergency file recovery features as well. One feature Undelete does not include (because we concentrate on data protection) is what the author referred to in the article as “raw” reads. There are a few fairly good tools on the market that do raw reads at fairly affordable prices (one is mentioned in the article).
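To make the snapshot-gap point concrete, here is a toy shell sketch (not Vista's actual Shadow Copy implementation; all file names are made up) showing how a file created and then deleted between two snapshots exists in neither of them, which is exactly the window an event-based approach closes:

```shell
#!/bin/sh
set -e
# Toy illustration of the gap between point-in-time snapshots.
work=$(mktemp -d)
mkdir "$work/data" "$work/snap1" "$work/snap2"

echo "holiday"  > "$work/data/photo1.jpg"
cp "$work/data/"* "$work/snap1/"            # snapshot #1

echo "birthday" > "$work/data/photo2.jpg"   # created after snapshot #1...
rm "$work/data/photo2.jpg"                  # ...deleted before snapshot #2
cp "$work/data/"* "$work/snap2/"            # snapshot #2

ls "$work/snap1" "$work/snap2"              # photo2.jpg appears in neither
```

An event-based tool hooks the delete operation itself, so photo2.jpg would have been preserved the moment it was removed, regardless of any snapshot schedule.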
One other such tool I can recommend is File Rescue Plus from SoftwareShelf (a Diskeeper Corporation reseller and close partner).

Taking a step back to address some of the other points in that story: the protection technology in Undelete also means that defrag (which typically increases the chance of file recovery) would not have the potential negative effect of overwriting the space a deleted file used to occupy, because that space is now being protected.

And when I say this (I'm sure I'm preaching to the choir), you should have an automatic backup solution in place to ensure you have a duplicate copy of important data. Use another hard drive, or DVDs/CDs, to store copies. Don't leave this to a manual, every-once-in-a-while-when-you-remember-to-do-it solution. Even a simple batch script that copies data from one drive to another is a start. While I already think a manual approach is a bad idea when it comes to defragmentation, it is a REALLY bad idea when it comes to backing up your data. One product I like (and their US office is down the street from my house) is NovaBackup from NovaStor.

Another personal recommendation is to never store data on your C: drive. I know this is kind of tough to overcome, because most PCs come from the manufacturer with one hard drive formatted into a single “volume”: the C: drive. If you aren't already familiar with it, educate yourself on partitioning. There are numerous tools on the market to help with this, even if you purchased your new PC with a single 300GB C: drive. Personally, I partition my PCs, separating the operating system from non-critical applications and data. I do keep important applications on the first partition of a physical disk for performance reasons, and I use separate physical disks or RAID with parity where possible. Taking it a step further, I also put the paging file on a separate physical disk from the OS.
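As a minimal sketch of the "simple script that copies data from one drive to another" idea: the paths below are hypothetical stand-ins (in real use, SRC would be your data folder and DEST a folder on a second physical drive), and on Windows the same idea would be a batch file using xcopy or robocopy, run automatically from Task Scheduler:

```shell
#!/bin/sh
set -e
# Hypothetical locations; substitute your real data and backup paths.
SRC=$(mktemp -d)          # stand-in for, say, D:\Data
DEST=$(mktemp -d)/backup  # stand-in for, say, E:\Backup\Data

echo "important report" > "$SRC/report.txt"

mkdir -p "$DEST"
cp -Rp "$SRC/." "$DEST/"  # naive full copy; a tool like rsync copies incrementally
echo "backed up $(ls "$DEST" | wc -l) item(s)"
```

It's crude compared to a real backup product, but scheduled to run nightly it already beats the when-you-remember-to approach.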
IT professionals who manage business servers practice this religiously. Also, in conjunction with a partitioning strategy, there are ways for system administrators to “hide” the system drive from non-administrative users on desktops and laptops. I haven't seen this implemented very often in practice, but I still recommend it in a business setting (especially for those roaming laptops that store local data).

While the author already noted that a C: drive is very active with operating system activity behind the scenes, the key reason, in my opinion, not to store data on the same volume as the operating system is more basic. If you're like me, you are constantly tweaking your computer or installing and uninstalling programs. If your operating system ever runs into a really serious issue and you need to rebuild or reinstall it, you are likely to overwrite the data you have stored on that drive. Pulling the hard drive out of one computer and daisy-chaining it to another to extract the data is a real pain.

Apart from that primary reason, there are several other good ones. Due to hard drive physics, it is better for performance to use more smaller-capacity drives than fewer larger-capacity drives: four 250GB drives are better than two 500GB drives, which are better than one 1TB drive. Multiple drives are also safer from the standpoint of RAID with parity, to account for physical disk failures.

The one possible argument against partitioning, for the reason I noted, is the prevalence of free virtualization software: you can do all the tweaking and experimenting in the VMs instead. You'll just need to license more software. But then, of course, the best practice for VMs is to place them on their own volumes or physical disk drives anyway. Another option, in lieu of separating data onto dedicated volumes, is to take regular full system “images” (e.g. Acronis, Ghost, etc.).
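On the point above about hiding the system drive from non-administrative users: one common mechanism is Explorer's NoDrives policy. A hedged sketch follows; verify the bitmask values against Microsoft's policy documentation before deploying, and note that this only hides the drive letter in Explorer (the companion NoViewOnDrive policy is what actually blocks access by path):

```
Windows Registry Editor Version 5.00

; NoDrives is a bitmask of drive letters to hide in Explorer:
; bit 0 = A:, bit 1 = B:, bit 2 = C:, and so on.
; 0x00000004 hides only the C: drive for this user.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDrives"=dword:00000004
```

In a domain, administrators would normally push the equivalent setting through Group Policy rather than editing the registry by hand.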
That works, but for overall practicality I prefer the solutions described above. Those imaging products still make good data backup solutions, though. In the end: make use of volume partitioning or multiple hard drives (as appropriate), use a good automatic backup/imaging solution, and use Diskeeper and Undelete. That combination will give you an excellent foundation for a reliable computing experience. I'll end this blog with one last reference: TweakGuides has an excellent manual for advanced and novice users alike. It's put together by Windows guru Koroush Ghazi, and I highly recommend it.