My hard drives didn’t let me down!

This is a follow-up to my April project to check the integrity of my backups as I was moving my files to a larger hard drive.

My objective was to make sure that every single file (about one million) copied exactly to the new drive, and that there were no errors that would prevent me from accessing my data. 

To do this I used an app from Lloyd Chambers called IntegrityChecker, which is the most efficient tool I’ve found for this particular job. It’s a command-line tool that runs in the Mac Terminal. That in itself was a learning experience, as I’d previously been very afraid of how badly the wrong command in Terminal could muck things up.

Thanks to IntegrityChecker, I was able to confirm that my two main backup copies are exact duplicates of the “master” hard drive. That’s a very good thing because it means I really do have a usable backup for when my main drive fails. (All drives fail; it’s just a matter of when.)
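
For the curious, here’s what a verification like this boils down to. IntegrityChecker is far more capable and efficient, but as a minimal Python sketch, assuming placeholder volume names, the idea is to hash every file on the master and the backup and compare the results:

```python
import hashlib
from pathlib import Path

def hash_file(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def hash_tree(root):
    """Map each file's path (relative to root) to its content hash."""
    root = Path(root)
    return {str(p.relative_to(root)): hash_file(p)
            for p in root.rglob("*") if p.is_file()}

# Placeholder volume names -- substitute your own drives.
master = hash_tree("/Volumes/Master")
backup = hash_tree("/Volumes/Backup1")

missing = master.keys() - backup.keys()
changed = {p for p in master.keys() & backup.keys() if master[p] != backup[p]}
print(f"missing from backup: {len(missing)}, mismatched: {len(changed)}")
```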

My secondary objective was to verify some bare drives I had used in the past for backups. I had stopped using them because they were throwing errors in CarbonCopyCloner. I suspected these errors were due to the drive dock I was using them in, but I had no way to be sure, so I didn’t trust them. They got shoved into a drawer and sat there as “worst case” backups, a Hail Mary in case things ever got really ugly.

To bring these orphaned drives back into my active backup rotation, I put them into a known-good drive enclosure. Then, using IntegrityChecker, I was able to verify that every file on them matches my “master” and that the drives are trustworthy. That gives me the confidence to use them again for backing up new data as part of my backup strategy.

The thing that surprised me most as I completed this project is that everything actually worked. Terabytes upon terabytes of data and multiple copies of a million files were hashed and read multiple times, and it all worked. Even digital photos from the mid-1990s were still there and readable. I found about a dozen files that threw an error, but they were mostly XMP files and all still readable, so the errors were insignificant. That has made me much more trusting of the process I use to back up my data. A sigh of relief, but I’ll still remain vigilant.

Another surprise was how many files I had duplicated on the drives. For a myriad of reasons, I had multiple folders with the same files that built up over the last twenty-ish years of managing my archive. One terabyte of duplicates, to be precise. It would be a nightmare to reconcile all those files manually, but IntegrityChecker came to the rescue again. One of its functions identifies duplicate files…that’s how I discovered the 1 TB of duplicates in the first place.
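
The trick behind finding duplicates is that two files with the same cryptographic hash are, for all practical purposes, byte-for-byte identical. I can’t speak to IntegrityChecker’s internals, but a rough Python sketch of the idea (with a placeholder volume name) would be:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def hash_file(path):
    """SHA-256 of a file's contents, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(1 << 20):
            h.update(chunk)
    return h.hexdigest()

# Group every file on the volume by its content hash; any group with
# two or more entries is a set of byte-identical duplicates.
groups = defaultdict(list)
for p in Path("/Volumes/Master").rglob("*"):  # placeholder volume name
    if p.is_file():
        groups[hash_file(p)].append(p)

dupes = {h: paths for h, paths in groups.items() if len(paths) > 1}
wasted = sum(ps[0].stat().st_size * (len(ps) - 1) for ps in dupes.values())
print(f"{len(dupes)} duplicate groups, {wasted / 1e12:.2f} TB reclaimable")
```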

But just as valuable was IntegrityChecker’s ability to “clone” the duplicates and regain that wasted space if you are using an APFS-formatted drive.

APFS is a format for storage drives used with a Mac. It’s designed for solid-state drives, not spinning disks. It will work with a spinning hard drive, but it can slow down transfer speeds. That’s something I could tolerate for backups if it let me get back a terabyte of space, so one by one I converted my backups to APFS, re-verified that all the files would read back correctly, then used IntegrityChecker to “de-dupe” the drives and reclaim that 1 TB of space.
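
The space savings come from APFS copy-on-write clones: a cloned file shares the original’s data blocks on disk until one of the files is modified, so duplicates cost almost nothing to keep. macOS exposes cloning through cp -c, which uses the clonefile system call on APFS. I don’t know how IntegrityChecker does this internally, but the effect is roughly this sketch, with hypothetical paths:

```python
import subprocess
from pathlib import Path

def replace_with_clone(original: Path, duplicate: Path) -> None:
    """Replace `duplicate` with an APFS copy-on-write clone of `original`.

    On an APFS volume, `cp -c` asks macOS for a clone: the new file
    shares the original's data blocks until either file is modified,
    so the duplicate's storage is effectively freed.
    """
    tmp = duplicate.with_name(duplicate.name + ".clonetmp")
    subprocess.run(["cp", "-c", str(original), str(tmp)], check=True)
    tmp.replace(duplicate)  # swap the clone into place

# Hypothetical example paths on the same APFS volume:
# replace_with_clone(Path("/Volumes/Backup/a/scan.tif"),
#                    Path("/Volumes/Backup/b/scan.tif"))
```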

The unexpected benefit of this de-duping is that I now have a whole new set of tricks up my sleeve to manage my storage more efficiently.

The end result is that I now know that every copy of my data is good, and I know how to check it going forward to ensure it stays good. This gives me more confidence that my files will be there when I need them, which was the whole point of this adventure…and something I wish I had done a lot sooner.

My next adventure is to move one of my offsite backups into the cloud using a Synology DiskStation and Backblaze…more on that in a future post.

Until then, keep backing up those bits!

Simple Photo Processing – Less Is More

Don’t you wish processing could be easier? With all those sliders, it’s so easy to overdo things, but that’s not always the best choice. Let me give you a peek into how simple it can be if you get everything working together.

I process the photo in this video with just one global curve adjustment. Not all my photos process this easily, but when you get just the right light and subject, it’s possible. Watch the video and let me know what you think!

Grizzly Bear Processing Video

Capturing the feel of a large, wet, and hungry grizzly bear just a few dozen yards away can be challenging. In this video, I’ll show you some of my processing techniques that reveal the characteristics of the animal while holding the viewer’s attention.

I’ve processed many photos of grizzlies over the years, and every time I’m amazed at these huge creatures and the power they have.

My goal with wildlife photos is to help people experience what the photographer saw, and to translate the many qualities of the animal into the 2D medium of photography.

Thanks to Dan Brown for letting me show you how I processed his photo.

3D Walkthrough Of Robert Glenn Ketchum Exhibit

The Booth Museum has posted an amazing 3D walkthrough of Robert Glenn Ketchum’s latest exhibit.

The picture posted above shows some of the work my team at West Coast Imaging and I helped produce for Ketchum over the years. The three pieces on the back wall are 48×66-inch prints mounted to Dibond, which really have to be seen in person to appreciate the effect of their scale. Big prints like these are time-consuming to produce well and technically challenging, but immensely rewarding when finished.

What this walkthrough doesn’t show is the many phone calls, the back-and-forth mailing of proofs, and the sweating of details to get them just right. Hours and hours often go into these larger prints, inspecting every square inch of the file for defects and working to bring out the artist’s vision.

The walkthrough works chronologically through Ketchum’s many projects, starting with the work of Eliot Porter, which influenced Ketchum and his take on color.

You can find a complete list of the photographs in the display here. The prints marked “Fuji Crystal Archive” were made by WCI.

I want to be sure to acknowledge the contributions of all the West Coast Imaging team members who worked to produce these prints over the years. Master Printmakers Michael Jones, Terrance Reimer, and I all had a hand in the Photoshop processing at various times. Jeff Grandy did his magic on the Tango drum scanner to turn Ketchum’s original film into high-resolution digital data. And of course, many other talented individuals helped output, inspect, and ship the prints so they could become this exquisite museum show.

Accurate Lighting For Film Scanning

Using DSLRs as scanners to digitize film has become popular lately, and it’s something I’ve been actively researching, as I still have a sizable archive of film and plan to keep shooting B&W. One of the things you need to make good scans is a good light source. Most light sources do not emit across the full visible spectrum, which means that colors in your film may not reproduce accurately in your scans. So I was really excited to discover that Negative Supply offers a 99 CRI light source for scanning, and it’s made my wish list for a new scanning setup. I’m not expecting DSLR scanning to match a Tango drum scan, but I’d like to get the most out of the process, and this will be a key component for color scanning.

RCS Thrusters – Space Shuttle Atlantis

Earlier this month, I had the chance to photograph Space Shuttle Atlantis at Kennedy Space Center in Florida. The space center had just reopened, and because of the ongoing coronavirus pandemic, there were very few visitors that day. At one point while photographing, I looked up, and I was the only person there. Just me and this magnificent machine in a space that is normally flooded with people. It was magical.

Having it virtually to myself made this one of the most enjoyable days of photography I’ve experienced. It was an experience I won’t soon forget.

I was able to work slowly and deliberately, using my Sony A7RII like a miniature view camera to capture the intricate detail and work with the incredibly challenging dynamic range of white tiles in spot light and black tiles in shadows.

This photograph shows the forward reaction control thrusters with streaks from the intense heat of re-entering the Earth’s atmosphere at ~17,500 mph.

I envisioned these photographs in black and white from the beginning, and the photos I made that day have already inspired more work. I can’t wait to return.

detail crop from original

Improved Expose To The Right

My latest online workshop takes a deep dive into how to make better exposures. Making a proper exposure is something you need to consider every time you make a photograph. It’s something you want to be certain about, because if you are wrong, you’ll miss the photo.

If you are ready to solve this problem, and be more confident every time you click the shutter, then this workshop is for you. The first three sections are free, so check it out at the link below.

https://rich-seiling-photo-workshops.teachable.com/p/improved-ettr

File Backups – Checking for Copy Errors

One of my “safer at home” projects is updating my file storage system. The drive I put new camera captures on was getting close to full, so I needed to expand my system by purchasing an 8TB Seagate external drive that could hold the contents of a partially full 6TB drive along with the contents of a partially full 2TB drive.

It took many hours of copying with CarbonCopyCloner to transfer the files from both drives to the new 8TB drive, which always tests my patience and my tendency to watch the pot to see if it’s boiling.

With the copy complete, the next task is a major reorganization of my folder structure to better fit my current needs and work with my backup scheme. As part of this, I’m going to erase and reuse some older backup drives, but before I erase those drives, I found myself with a nagging question: did my computer actually copy all my files correctly to the new drive?

Most file copy operations, including what I did with CarbonCopyCloner, are optimized for speed. They read the file from one location and write it to another without verifying that the file was written correctly. Verifying a copy would mean re-reading each file and comparing the two, which takes a lot more time. In the case of my 5.5TB of data, it would have to read 11TB in total: 5.5TB on the original drives and 5.5TB on the copies.

Since my copies weren’t verified, it’s entirely possible that when I copied my files to this new drive, files that hold decades of work, valuable drum scans, irreplaceable originals, and memories, some did not copy correctly and I could be losing data. I used to accept that risk, but experience has made me less willing to accept it going forward. So what to do?

CarbonCopyCloner has an option to compare your backup with your original, but for the size of my archive it was going to be a very time-consuming project, and difficult to organize. Fortunately, I remembered Lloyd Chambers’ IntegrityChecker software, which was designed to do just what I wanted.

First, a few lines about Lloyd and why I’m trusting his software to check my files. Many years ago I met Lloyd when he attended one of my workshops. He was using 8×10 film at the time and trying to push the bounds of what it could achieve…no minor feat. His film was of fantastic quality, but he was still not satisfied. He’s the type of person who obsesses over details in a way I greatly appreciate. But he’s not just a photographer. He has a couple of patents to his name for compression technology he used in his very popular DiskDoubler and RAMDoubler software. He has the knowledge and experience to get very deep in the weeds of some interesting computer and digital imaging problems, and he blogs about lens and camera testing at diglloyd.com.

IntegrityChecker validates files in a unique way. It creates a cryptographic hash for every file on a storage volume, which can then be used to check whether the file has changed in any way. This lets you check the integrity of files and backups in the most efficient way I know of.

So now I’m in the process of creating hash files for my “original” disks. Once all the hash files are created, I’ll use those to validate that my multiple backups are faithful copies of the “original.” That will give me peace of mind that I have good copies of all my files, and let me decide which copies are redundant so I can reuse those drives.
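
As a rough illustration of the idea, not IntegrityChecker’s actual format or algorithm, a manifest-based check might look like this in Python (the volume paths and the hashes.json file name are placeholders): record a hash for every file once, then re-hash later and compare.

```python
import hashlib
import json
from pathlib import Path

def hash_file(path):
    """SHA-256 of a file's contents, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(1 << 20):
            h.update(chunk)
    return h.hexdigest()

def write_manifest(root, manifest="hashes.json"):
    """Record a hash for every file so the volume can be re-verified later."""
    root = Path(root)
    hashes = {str(p.relative_to(root)): hash_file(p)
              for p in root.rglob("*") if p.is_file()}
    (root / manifest).write_text(json.dumps(hashes, indent=1))

def verify(root, manifest="hashes.json"):
    """Re-hash every recorded file; return the paths that no longer match."""
    root = Path(root)
    recorded = json.loads((root / manifest).read_text())
    return [p for p, digest in recorded.items()
            if hash_file(root / p) != digest]

write_manifest("/Volumes/Original")   # run once on the "original" disk
# The manifest travels with the files when they're copied, so the same
# check can be run against a backup of the volume:
print(verify("/Volumes/Backup1"))     # [] means every file matches
```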

This kind of integrity checking is something we should all do, but since it’s not built into the operating systems we use, it doesn’t happen unless we seek it out. If this is something you’re interested in, check out IntegrityChecker on Lloyd’s website.

I think it’s important to note that this is more of an “expert”-level tool. It’s offered in both GUI and command-line versions, and it’s going to take some understanding of the underlying principles of what it’s doing to apply it correctly. Because of that, it’s not a tool for everyone, but it’s one I wish I had started using a lot sooner. For now, it’s the easiest way I know to ensure my files copy correctly and don’t change once they are copied. Check it out and see if it belongs in your toolbox.

Counting In Full Stops

Do you know how to count in full shutter speed stops?

Even with all the auto settings available on our cameras, this basic photo knowledge can still help us solve many exposure problems.

If you don’t have this chart memorized, take some time to learn it, and understand how cutting the exposure time in half cuts the amount of light in half, and how doubling the time doubles the amount of light.
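
If you’d like to see the pattern spelled out, here’s a tiny Python snippet that prints the conventional full-stop shutter speeds. (Marked speeds like 1/15 and 1/125 are rounded labels for the exact halvings of 1/16 and 1/128.)

```python
# Conventional full-stop shutter speeds, from 1 second to 1/8000.
# Each step down the list halves the exposure time, and so halves the light.
FULL_STOPS = [1, 2, 4, 8, 15, 30, 60, 125, 250, 500, 1000, 2000, 4000, 8000]

for i, denom in enumerate(FULL_STOPS):
    label = "1s" if denom == 1 else f"1/{denom}"
    note = "" if i == 0 else "  (half the light of the speed above)"
    print(f"{label:>7}{note}")
```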

It seems so basic, but understanding it gives you so many more ways to apply it to your photographs.