Too Many Backups! Here is how I consolidated my old archives.

What do you do when you have too many backups? That’s a problem I’ve been working on for the past two years and I thought it might be valuable to put some of this in an article for others who face similar problems. 

First of all, how many backups is too many? For me it was when I started stacking backup hard drives like cordwood. Since I started making 300MB film scans in the late 1990s, I've had a series of ever-larger drives holding my files. When I upgraded to a larger hard drive, I always kept the old drive "just in case". This has led to a lot of drives. Just this week I found four drives I didn't even remember I had, all from the 2007-2009 time frame. Last week I pulled files from some truly ancient drives that I hadn't spun up in about 15 years.

Why did I keep all these drives? I wanted to preserve the ability to go back in time in case I found that a file had been corrupted at some point. The only way to fix a corrupted file is to go back to a copy made before it was corrupted and pull that file. That is the purpose these drives served.

Consolidating ten or fifteen ancient drives onto one large drive seemed like a better option than having them all take up space, and it would help me reduce some of my ever-growing clutter.

Adding to my problem of excess backups is how Carbon Copy Cloner (my backup software of choice) backs up files. CCC works in a very simple way that is very safe, but as a result it is easy to end up with duplicate files when you move folders around. So on top of my duplicate "archive" drives, I had several terabytes of CCC files to deal with.

My goal for this project was threefold. First, I wanted to consolidate all my archives on one drive. Second, I wanted to deduplicate (dedupe) those files to remove unnecessary duplicates so they would take up the least space possible. Third, I needed to do all this in a way that ensured I precisely copied every file, bit for bit, and that any duplicates were truly duplicates with no differences at the bit level.

The tool that has made this possible is IntegrityChecker from Lloyd Chambers at https://diglloydtools.com and diglloyd.com. 

IntegrityChecker does a number of very interesting things. Foremost, it will compute a cryptographic hash for every file on a drive. This hash serves as a checksum that can show if a file has been altered in even the slightest way, down to the bit level. This is very useful when copying files to another drive to ensure they copied exactly. It also lets me compare a file to its stored hash at a later date to detect corruption. It does some other cool things too, as I'll explain in a moment.
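To make this concrete, here is a minimal sketch of the per-file hashing idea in Python. This is not IntegrityChecker's own code, and the volume path and manifest filename are hypothetical; it just shows how hashing every file, saving the results, and re-checking later works.

```python
# A minimal sketch of per-file hashing (not IntegrityChecker itself):
# hash every file under a folder, save a manifest, and re-check later.
import hashlib
import json
from pathlib import Path

def hash_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def build_manifest(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its hash."""
    return {str(p.relative_to(root)): hash_file(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def find_changed(root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the relative paths whose current hash no longer matches."""
    return [rel for rel, digest in manifest.items()
            if hash_file(root / rel) != digest]

if __name__ == "__main__":
    root = Path("/Volumes/Archive")                      # hypothetical volume
    manifest = build_manifest(root)
    Path("archive-hashes.json").write_text(json.dumps(manifest, indent=2))
    print(len(manifest), "files hashed;", len(find_changed(root, manifest)), "changed")
```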

Consolidation

So my consolidation process looked like this:

1. Use Carbon Copy Cloner to copy from my old drive to a folder on a new drive.

2. Use IntegrityChecker to compute hashes for both copies.

3. Use the "compare" function of IntegrityChecker to compare the copy to the original.

This process let me make copies of the old drives with absolute assurance that I had copied every file correctly. In over 20TB of files copied for this project, I found only one file that did not copy correctly, for whatever reason. Not bad for pulling data off vintage hard drives.
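For anyone curious what steps 2 and 3 boil down to, here is a rough Python sketch of the same idea, independent of IntegrityChecker: hash every file on the original, then confirm the copy has a matching file with a matching hash. The volume paths are hypothetical.

```python
# A rough sketch of "hash both sides, then compare": walk the original,
# hash each file, and check that the copy matches. Not IntegrityChecker's code.
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_trees(original: Path, copy: Path) -> None:
    for src in original.rglob("*"):
        if not src.is_file():
            continue
        dst = copy / src.relative_to(original)
        if not dst.is_file():
            print("MISSING ", src.relative_to(original))
        elif file_hash(dst) != file_hash(src):
            print("MISMATCH", src.relative_to(original))

# Hypothetical paths: the old drive and its folder on the consolidation drive.
compare_trees(Path("/Volumes/OldDrive"), Path("/Volumes/Consolidated/OldDrive"))
```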

Deduplication

Goal two was to dedupe the drive where I had consolidated all my archives and backups. IntegrityChecker helped with this too. IC can use the hashes it creates to look for duplicates. If a pair of hashes matches, you can be sure with an extremely high level of confidence that the two files are exactly the same. This is a much better way to identify duplicates than methods that rely on file size, name, and date, because those values will not reveal bit-level differences caused by file corruption. The hashes will, so if IC says two files are duplicates, they really are.
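Here is a small sketch of hash-based duplicate detection, my own illustration of the idea rather than IC's actual implementation: group files by content hash and report any group with more than one member. The volume name is hypothetical.

```python
# Group files by content hash; any hash with more than one path is a
# set of true, bit-identical duplicates. Illustration only.
import hashlib
from collections import defaultdict
from pathlib import Path

def file_hash(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root: Path) -> dict[str, list[Path]]:
    groups: dict[str, list[Path]] = defaultdict(list)
    for p in root.rglob("*"):
        if p.is_file():
            groups[file_hash(p)].append(p)
    return {digest: paths for digest, paths in groups.items() if len(paths) > 1}

for digest, paths in find_duplicates(Path("/Volumes/Consolidated")).items():
    print(digest[:12], *paths, sep="\n  ")
```

A real tool would group files by size first so it only hashes files that could possibly be duplicates, but hashing everything keeps the sketch simple.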

IntegrityChecker lets you deal with dupes in two ways. First, you can use a unique feature of drives formatted with APFS on Macs to create a clone. When a clone is made, the two files are reduced to one at the disk level, but you will still see two files in the Finder. If you open one of these files and modify it, it becomes a separate copy again. Cloning files lets you reclaim disk space from duplicates without messing up your directory structure. This is very safe, but it would not help me with some of my other goals, as you will see.
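To show what replacing a duplicate with an APFS clone might look like, here is a sketch using macOS's cp -c, which copies via clonefile(2). This is my own illustration, not how IntegrityChecker does it internally; the paths are hypothetical, and both files must live on the same APFS volume.

```python
# Replace a duplicate with an APFS clone of the file we keep, so the two
# paths share storage on disk. Requires macOS and an APFS volume.
import subprocess
from pathlib import Path

def replace_with_clone(keep: Path, duplicate: Path) -> None:
    """Swap `duplicate` for a space-sharing clone of `keep`."""
    tmp = duplicate.with_suffix(duplicate.suffix + ".clonetmp")
    subprocess.run(["cp", "-c", str(keep), str(tmp)], check=True)  # clone, no data copied
    tmp.replace(duplicate)  # atomically take the duplicate's place

# Hypothetical paths on the same APFS volume.
replace_with_clone(Path("/Volumes/Archive/a/scan-001.tif"),
                   Path("/Volumes/Archive/z/scan-001.tif"))
```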

I decided to go a more aggressive route. I wanted to remove every duplicate file, so I used the "--emit rm" option to create a list of duplicate files along with the command-line code to erase them. This would remove them from the hard drive permanently, leaving only one copy.

Distillation

As part of this process, I realized I could delete any of the consolidated files that were also part of my current, up-to-date working drive and backups. After all, I didn't need copies of files already in my master working archive, so why not get rid of those too?

To do that, I made a copy of the files from my current "master" drive (the drive where I access my photos when I'm working on them) and copied them to the drive I was using for consolidation. I put them in a folder labeled "a" and put the old backup copies into a folder named "z", because I learned that IntegrityChecker will use the topmost directory to decide which duplicate to keep. By doing this, I could make IntegrityChecker delete the old files that matched my current files. At the end of the process, I could delete folder "a" and be left with only the files that did not exist on my current master drive.
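To illustrate the effect I was relying on (this mimics the outcome, not IntegrityChecker's actual selection logic): if the earliest-sorting path in a duplicate group is the one kept, the copies under "a" survive and the old copies under "z" are the ones slated for removal.

```python
# Illustration of the "a"/"z" trick: sort the duplicate paths, keep the first,
# mark the rest for removal. "a/..." sorts before "z/...".
from pathlib import Path

def choose_removals(duplicate_paths: list[Path]) -> list[Path]:
    keep, *remove = sorted(duplicate_paths)
    print("keep  ", keep)
    return remove

doomed = choose_removals([Path("z/old-backup/scan-001.tif"),
                          Path("a/master/scan-001.tif")])
print("remove", doomed)
```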

This project let me distill terabytes of files down to about 300GB, which is a very manageable size to keep and maintain. I consider it a success to get a dozen or so hard drives out of my life and my space while ensuring that I have an absolutely exact copy of every one of my files.

This process has worked for me, but be forewarned: IntegrityChecker is very powerful, and it is very easy to delete files you don't intend to. You need to take the time to learn how it works and understand its behavior. I did a lot of testing to practice and understand it, and I am careful to think through the plan every time I use it, in addition to working only when I have a clear mind (always a good idea when doing big things with your data!).

If you have the same problems I do, I hope this gives you some ideas for how to solve them. Courteous questions are always welcome.

My hard drives didn’t let me down!

This is a follow up to my April project to check the integrity of my backups as I was moving my files to a larger hard drive. 

My objective was to make sure that every single file (about one million) copied exactly to the new drive, and that there were no errors that would prevent me from accessing my data. 

To do this I used a software app from Lloyd Chambers called IntegrityChecker, which is the most efficient tool I've found for this unique job. It's a command-line tool that runs in the Mac Terminal. That in itself was a learning experience, as I've previously been very afraid of how badly the wrong command in Terminal could muck things up.

Thanks to IntegrityChecker, I was able to confirm that my two main backup copies are exact duplicates of the "master" hard drive. That's a very good thing, because it means I really do have a usable backup when my main drive fails. (All drives fail; it's just a matter of when.)

My secondary objective was to verify some bare drives I had used in the past for backups. I had stopped using them because they were throwing errors in CarbonCopyCloner. I suspected that those errors were due to the drive dock I was using them in, but I had no way to be sure, so I didn't trust them. They got shoved into a drawer and just sat there as "worst case" backups, a hail Mary in case things ever got really ugly.

To try to bring these orphaned drives back into my active backup rotation, I put them into a known-good drive enclosure. Then, using IntegrityChecker, I was able to verify that every file on them matches my "master" and that the drives are trustworthy. That gives me confidence to use them again for backing up new data, and lets them be useful as part of my backup strategy.

The one thing that surprised me as I completed this project is that everything actually worked. Terabytes upon terabytes of data and multiple copies of a million files were hashed and read multiple times, and it all worked. Even digital photos from the mid 1990s were still there and readable. I think I found a dozen files that threw an error, but they were all readable, so the errors were insignificant, and they were mostly XMP files. That has made me much more trusting of the process I use to back up my data. A sigh of relief, but I'll still remain vigilant.

Another surprise was how many files I had duplicated on the drives. For a myriad of reasons, I had multiple folders with the same files that built up over the last twenty-ish years of managing my archive. One terabyte of duplicates, to be precise. It would be a nightmare to reconcile all those files manually, but IntegrityChecker came to the rescue again. One of its functions lets you identify duplicate files...that's how I discovered the 1TB of duplicates in the first place.

But just as valuable was IntegrityChecker's ability to "clone" the duplicates and regain that wasted space if you are using an APFS-formatted drive.

APFS is a format for storage drives used with a Mac. It's designed for solid state drives, not spinning disks. It will work with a spinning hard drive, but it can cause a slowdown in transfer speed. That's something I could tolerate for backups if it let me get back a terabyte of space, so one by one I converted my backups to APFS, re-verified that all the files would read back correctly, then used IntegrityChecker to "de-dupe" the drives and reclaim that 1TB of space.

The unexpected benefit of this de-duping is that I now have a whole new set of tricks up my sleeve to manage my storage more efficiently.

The end result is that I now know that every copy of my data is good, and I know how to check it as I go forward to ensure it stays good. This gives me more  confidence that my files will be there when I need them, which was the whole point of this adventure…and something I wish I had done a lot sooner. 

My next adventure is to take one of my offsite backups into the cloud using a Synology DiskStation and Backblaze cloud…more on that in a future post. 

Until then, keep backing up those bits!

File Backups – Checking for Copy Errors

One of my "safer at home" projects is updating my file storage system. The drive I put new camera captures on was getting close to filling up, so I needed to expand my system by purchasing an 8TB Seagate external drive that could hold the contents of a partially full 6TB drive along with the contents of a partially full 2TB drive.

It took me many hours of copying with CarbonCopyCloner to transfer the files from both drives to the new 8TB drive, which always tests my patience and my tendency to watch the pot to see if it has boiled.

With the copy complete, the next task is a major reorganization of my folder structure to better fit my current needs and work with my backup scheme. As part of this, I'm going to erase and reuse some older backup drives, but before I erase those drives, I found myself with a nagging question: did my computer actually copy all my files correctly to the new drive?

Most file copy operations, including what I did with CarbonCopyCloner, are optimized for speed. They read the file from one location and write it to another without verifying that the file was written correctly. Verifying a copy would require re-reading each file and comparing the two, which takes a lot more time. In the case of my 5.5TB of data, that means reading 11TB in total: 5.5TB on the original drives and 5.5TB on the copies.
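To show what that re-reading involves, here is a brute-force sketch that compares every file byte for byte using Python's standard filecmp module; the mount points are hypothetical. It has to read every byte on both sides, which is exactly why it is slow on multi-terabyte archives. With stored hashes (the approach I describe below), later checks of each backup only need to read that backup, not the original again.

```python
# Brute-force copy verification: re-read every file on both drives and
# compare byte for byte. Correct, but it reads the full data set twice.
import filecmp
from pathlib import Path

def verify_copy(original_root: Path, copy_root: Path) -> int:
    """Compare every file byte for byte; return the number of problems found."""
    problems = 0
    for src in original_root.rglob("*"):
        if not src.is_file():
            continue
        dst = copy_root / src.relative_to(original_root)
        if not dst.is_file() or not filecmp.cmp(src, dst, shallow=False):
            print("problem:", src)
            problems += 1
    return problems

# Hypothetical mount points for the old 6TB drive and the new 8TB drive.
print(verify_copy(Path("/Volumes/Old6TB"), Path("/Volumes/New8TB/Old6TB")))
```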

Since my copies weren't verified, it's entirely possible that when I copied my files to this new drive, files that hold decades of work, valuable drum scans, irreplaceable originals and memories, some did not copy correctly, and I could be losing data. I used to accept that risk, but experience has made me less willing to accept it going forward. So what to do?

CarbonCopyCloner has an option to compare your backup with your original, but for the size of my archive it was going to be a very time-consuming project, and difficult to organize. Fortunately I remembered Lloyd Chambers' IntegrityChecker software, which was designed to do just what I wanted.

First, a couple of lines about Lloyd and why I'm trusting his software to check my files. Many years ago I met Lloyd when he attended one of my workshops. He was using 8×10 film at the time and trying to push the bounds of what it could achieve...no minor feat. His film was of fantastic quality, but he was still not satisfied. He's the type of person who obsesses over details in a way I greatly appreciate. But he's not just a photographer. He has a couple of patents to his name for the compression technology used in his very popular DiskDoubler and RAMDoubler software. He has the knowledge and experience to get very deep in the weeds of some interesting computer and digital imaging problems, and he blogs about lens and camera testing at diglloyd.com.

IntegrityChecker validates files in a unique way. It creates a cryptographic hash for every file on a storage volume that can be used to check whether the file has been changed in any way. This lets you check the integrity of files and backups in the most efficient way I know of.

So now I'm in the process of creating hash files for my "original" disks. Once all the hash files are created, I'll use them to validate that my multiple backups are faithful copies of the "original." That will give me peace of mind that I have good copies of all my files, and let me decide which copies are redundant so I can reuse those drives.

This kind of integrity checking is something we should all do, but since it's not built into the operating systems we use, it doesn't happen unless we seek it out. If this is something you're interested in, check out IntegrityChecker on Lloyd's website.

I think it's important to note that this is more of an "expert"-level tool. It's offered in both GUI and command-line versions, and it's going to take some understanding of the underlying principles of what it's doing to apply it correctly. Because of that, it's not a tool for everyone, but it's one I wish I had started using a lot sooner. For now it's the easiest way I know to ensure my files copy correctly and don't change once they are copied. Check it out and see if it belongs in your toolbox.

Hard Drive Costs Late January 2020

Current hard drive costs at a glance with links to purchase from Amazon. I recommend Seagate hard drives because they continue to test as some of the longest lasting drives at backblaze.com.

Highlights for January include a minor price increase on 6TB and 10TB external drives, as well as slight changes to internal drives as noted. The days of storage prices dropping quickly seem to be over as drive capacities become so large. Also of note is that 2TB external drives are now all "portable," meaning they are 2.5″ laptop drives that are bus powered. For my main storage I prefer external 3.5″ drives that are plugged into an external power source, so that means buying a 4TB drive or larger.

10TB external drives are still a big savings over 10TB internal drives. Also, on a cost-per-TB basis, 10TB drives are getting close enough to the sweet spot of pricing to make them attractive if you need that kind of storage. But I generally don't recommend buying more than a year's capacity at a time, to protect yourself from price changes. Also remember that a properly backed up "storage set" requires three drives, so buying more than you reasonably need (overprovisioning) can suck up a lot of money.

Sometimes external drives are less expensive than internal drives. Advanced users may want to explore “shucking” external drives to save money as the external drives are often, but not always, SATA drives that can be used as an internal drive.

EXTERNAL

2TB $59.99 ($30 per TB) 2.5″ USB-powered portable drive
4TB $89.99 ($22.50 per TB)
6TB $109.99 ($18.33 per TB) (+$10 change)
8TB $139.99 ($17.50 per TB)
10TB $199.99 ($20 per TB) (+$20 change)

INTERNAL

2TB $49.99 ($25 per TB)
4TB $79.99 ($19.99 per TB) (-$10 change)
6TB $131.99 ($22 per TB)
8TB $149.99 ($18.75 per TB)
10TB $252.98 ($25.29 per TB) (+$12 change)
12TB $327 ($27.25 per TB) (+$15 change)
14TB $439.99 ($31.43 per TB)
16TB $484.99 ($30.31 per TB) (+$6 change)

I’m an Amazon affiliate so I receive a small commission from each sale.

A Cheaper Storage Upgrade

Seagate 2TB External Drive

If you are sick of my articles on Drobo/NAS/DAS/RAID storage solutions because they are just overkill for your needs, you are in luck. I'm laid up with the flu, which is a perfect time to write up some different storage solutions, because it doesn't require the same part of my brain the creative photography content does.

I was talking with a friend yesterday about some upgrades for his Mac that was running slow, and we got around to his current storage shortage. (Yes, I have a lot of photographer friends, a side effect of this incurable disease I have called photography. 😉)

After helping him spend about $300 on RAM and an SSD boot drive upgrade for his 2015 iMac, the budget was tight for storage. He wanted to set up a new Storage Set that would be dedicated to RAW files and include his existing archive of 700GB of RAWs. (See my Freemium Backup and Storage Plan article for an explanation of what a Storage Set is.)

He settled on buying three 2TB external drives for a total cost of about $179. One would be the master, and two would be exact clones made with CarbonCopyCloner. This would let him transfer his existing 700GB of RAWs to the new storage set and leave maybe a year's worth of space for new RAWs from his 45MP camera. The $179 price is an easy bill to afford, and way less than film and processing used to cost, so even if it ends up being a little undersized, it gets him through to his high season for photo sales.

Putting all your RAW files on a separate drive is a great way to segment your data. Since these files will never be modified directly, the backup needs for that master volume are greatly minimized. Your modified RAWs can live on a volume set aside for more active files in the case of Photoshop, or as edits in your catalog for DAM (digital asset management) programs like Lightroom.

So why not a RAID in this case? While RAID is very nice to have, it's not always a need as long as you are very diligent about doing regular backups. This solution keeps the data safe and accessible for very little money.

My storage articles over the last few weeks weren't meant to say you need RAID, but rather to explore what these systems do and how to manage them, based on my experience managing a lot of spinning disks in mirrored RAIDs and Synology NAS systems. I used to be able to heat my office with the three Mac servers and forty-odd hard drives West Coast Imaging required, so to say I'm very close to this subject is an understatement...lol.

Sometimes inexpensive solutions are the best solutions, and as I shared with my friend, there are always more things to spend money on in photography. Saving money for him means more days on the road having more adventures and making more photographs. So "just enough" is always the right size. Owning spinning disks is not our goal in life.

Bad Reviews for Drobo on Amazon

I've been doing a lot of articles lately on the Drobo. These came out of my experience working with a friend to upgrade his storage system and replace a five-year-old Drobo that failed. While I don't own a Drobo, I understand the underlying technology and how to manage its RAID-like storage from owning a Synology and previously managing mirrored RAID servers for a long time.

My recent PetaPixel article has several comments from Drobo users who had bad experiences, which piqued my curiosity. I went into this thinking the Drobo just worked well, based on the positive things I've heard about it and people's acceptance of it. And on paper it looks like a good DAS option that should be easy to use.

So I dove into the Amazon reviews (and B&H) to get a bigger sample of users, and I'm not too excited by what I see. The percentage of 1- and 2-star reviews is pretty high for the rock-solid reliability I want in a storage device.

I didn't read every negative comment, and it's nearly impossible to measure the experience level of every person commenting. But more than 25% of reviews coming in at 1 or 2 stars stands out to me. Based on this new knowledge, I don't feel comfortable recommending the Drobo. It might be a good device, it might not. But I don't want to take the risk that those reviews are correct.

Even with a solid backup system, dealing with storage failures is a nightmare. I've been there enough to know I want to eliminate as much risk as possible. The time and stress required to fix faulty storage is just too high a price for me to pay, let alone the experience you need to troubleshoot it. I had thought the Drobo would be a perfect solution for non-IT-savvy photographers, but I guess I was wrong.

I still have a couple more Drobo articles I'm going to post, with links to this article. And then I'm going to work on some articles about SoftRAID from Other World Computing, which I have considerable first-hand experience with after nearly 20 years of using it for mirrored RAIDs on hundreds of drives. OWC also sells some excellent drive cases, some with built-in RAID. They take a little more experience than the Drobo to use, but my experience with OWC is that they produce excellent kit. I've also used them as my RAM supplier for my businesses (at least 35 Macs upgraded), and my laptop has been running a 1TB SSD from them for the last 4-ish years.

If you are putting together a storage upgrade, I encourage you to give OWC a look. And look at my consulting services if you need some more in depth help.

Storage System Consulting

Need help ensuring your photos are properly stored and backed up? Losing the time, energy, and effort spent making your photographs, let alone the potential revenue they represent, is not an option.

 Let me help. I’ve built systems to serve the single photographer all the way to 20,000 clients and a million files. I can put together a simple but robust system that works for your individual needs. I focus on cost effective solutions because I don’t like wasting money on things you don’t need. 

Email me and let’s start securing your archive today.