When I sold my DSLR camera and lenses in 2016, I made the decision to be an iPhoneographer. I’ve been shooting with an iPhone ever since and haven’t looked back. That was seven years ago! Crazy, isn’t it? Especially since I’m what I’d consider a “technical” photographer. Sure, I think about composition, aesthetics, and sometimes even what story the photo could be telling, but if I’m being honest, once I have all those things in place, I’m thinking about how the histogram reads, as well as the shutter and ISO values and how they could affect the image.
RAW capture came to the iPhone while I was using the iPhone 6s, and I started shooting with it as soon as it was available. I may be an iPhoneographer, or mobile-only photographer, but I take a photographer’s approach to my work. There are millions upon millions of people out there posting to Instagram who just point and shoot in JPG and post their photos without doing any edits, and that’s more than okay, but the guy who sold his camera all those years ago is still a photographer at heart.
I’m not going to get into the technical mumbo jumbo about what ProRAW is, or even what RAW is, because that’s been covered many times before by others and you can easily find it on the internet. I will, however, be referencing the technology a little bit just to help explain my reasoning. You’ll notice that the subtitle on this article reads, “And This Won’t Be For Everyone”, and there are a couple of reasons for that. One: the masses who shoot with an iPhone generally don’t need to shoot ProRAW, and two: you have to deal with large file sizes, which means file management is necessary. For me, my file management is simple. I use iCloud Photo Library, and I firmly believe I would be a poster boy for Apple’s purpose behind iCloud Photos. Don’t get me wrong, I don’t rely solely on iCloud for my backups. I have a firm system in place so I don’t lose any images.
I’m not putting myself on a pedestal here, but I just think I use that stuff the way Apple intended it to be used, and there are some cool benefits of doing things this way, but that’ll be another article on another day. ProRAW files are very large compared to their JPG or HEIF counterparts, and that’s because they are RAW files with Apple’s image processing included in the file. This means you can capture a ProRAW image with Deep Fusion, Smart HDR, or Night Mode activated in the Camera app. That’s potentially a lot of information in a photo’s file, which is why I have an iPhone with at least 256GB of storage, plus I pay for the 2TB iCloud Storage plan. I should also mention here that in my Photos settings on both my iPhone and iMac, I have them set to optimize storage, which automatically manages my device’s storage capacity by keeping the full-resolution files in iCloud and smaller, space-saving versions on the device. As I write this, I have 21,167 photos and 1,021 videos on my iPhone 14 Pro Max. And this isn’t my first iCloud Photo Library. Paying for iCloud Storage is not something everyone wants to do, but it’s worth every cent if you take a lot of photos and are in the Apple Ecosystem.
We’ve determined that ProRAW files are large and why they take up so much space. Now let’s look at why I shoot this way. As I mentioned earlier, I take a photographer’s approach to my iPhoneography. This means I have the same mindset now as I did when I shot with my Canon, and it’s not just the shooting; it’s how I deal with my images after the shot. Admittedly, post-processing on an iPhone today can be, and usually is, much faster and easier than it was with my Canon back in the day when I did it all on a computer, but the old school ways are still popular amongst some photographers. I’ll even do the odd edit on my iMac with some “big boy” software.
When you shoot ProRAW on an iPhone, the file has what’s called Local Tone Mapping (LTM) applied to it. This is the image processing, or image pipeline, that Apple adds to the file. A ProRAW image will look the same on your iPhone as a JPG or HEIF. The only difference is you will see the word “RAW” in the top left corner of the screen when previewing the photo in the Photos app. In Apple’s native Camera app, if ProRAW is enabled in the Camera settings, you will see a “RAW” button in the Camera’s interface, and if you take a photo with that turned on, it will be a ProRAW image. Apple’s Camera app can’t take a regular RAW photo. Many third-party camera apps, like my favourite, Reeflex, can shoot either ProRAW or a regular RAW photo without the image pipeline added, though without the ability to shoot in Night Mode. This is what I would call “old school” RAW.
Shooting RAW the old school way has its benefits, no doubt, but as you know, RAW photography needs to be post-processed before you can do anything with the image. If you capture a photo of something where the lighting is at two extremes, like in the image above, a regular RAW image will most likely need to be processed twice, once for the sky and once for the foreground. You then need to stack the two processed images on top of each other and erase part of one to let the other show through. That’s just a lot more work than most people are prepared to do when it comes to mobile photography. Sure, I might be able to process this photo as a single file, but stretching out the data in those shadows will end up showing a lot of noise. The only way around a situation like this, where you want to shoot RAW with such harsh contrast in a scene, is to use a graduated neutral density filter, just like the ones used on traditional cameras.
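For the technically curious, the stack-and-erase workflow above boils down to blending two renders of the same frame with a gradient mask. Here’s a toy sketch in Python with NumPy; the function name and the simple vertical gradient are my own illustration, not how any particular editing app implements its masking:

```python
import numpy as np

def gradient_blend(sky_render: np.ndarray, ground_render: np.ndarray,
                   horizon: float = 0.5) -> np.ndarray:
    """Blend two renders of the same RAW frame (one processed for the sky,
    one for the foreground) using a vertical gradient mask, like a
    graduated ND filter applied in post."""
    h = sky_render.shape[0]
    # Alpha is 1.0 at the top (sky render shows), fading to 0.0 below
    # the horizon (foreground render shows), over a 20%-of-height band.
    rows = np.arange(h)
    alpha = np.clip((horizon * h - rows) / (0.2 * h) + 0.5, 0.0, 1.0)
    alpha = alpha[:, None, None]  # broadcast over width and channels
    return alpha * sky_render + (1.0 - alpha) * ground_render
```

In an editing app this is what a layer mask with a gradient fill does; the erasing the author describes is just painting that alpha by hand.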
ProRAW produces photos with a vast amount of dynamic range thanks to the imaging pipeline, specifically Smart HDR. The way the iPhone does this is really quite incredible, but that’s computational photography for you. To put it briefly, the iPhone analyzes the scene and, in this case, realizes there is some sky in the frame, so it segments the sky and analyzes it differently than the foreground, essentially processing the image for the highlights and shadows separately, then putting it all together as a single photo. It takes multiple frames almost instantaneously and does billions of computations to come up with the final result.
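The multi-frame merging can be illustrated with a toy version of exposure fusion, where each pixel of the result is a weighted average of several bracketed frames, and the weights favour well-exposed values near mid-grey. To be clear, this is only a sketch of the general technique; Apple’s Smart HDR pipeline is proprietary and far more sophisticated (alignment, segmentation, noise modelling):

```python
import numpy as np

def fuse_exposures(frames: list, sigma: float = 0.2) -> np.ndarray:
    """Merge bracketed exposures of the same scene. Each pixel in the
    output is a per-pixel weighted average of the frames, where a frame
    gets more weight where its value is close to mid-grey (0.5), i.e.
    where it is well exposed."""
    stack = np.stack(frames)  # shape: (num_frames, height, width)
    # Gaussian "well-exposedness" weight, peaked at mid-grey.
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)
```

The effect is that the sky comes from the frame exposed for highlights and the shadows come from the frame exposed for them, with no hand-made mask required.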
So now, here’s why I shoot in ProRAW. Here’s the image from above again, as a RAW file and as a ProRAW file.
The ProRAW looks like a finished edit, but it’s literally untouched. The RAW file is a hot mess. They were both taken using Reeflex, and I just pointed and took the shot. For the RAW shot, I could have adjusted the exposure for the sky, but the trees in the foreground would be solid black with literally no detail, and had I adjusted the exposure for the trees, the sky would have been completely blown out. In both cases, I doubt I’d be able to retrieve the data that would’ve been lost. The ProRAW file is the RAW file, but with the image pipeline added by the iPhone’s camera API, which Reeflex has access to… or is it the same RAW file?
RAW Power, one of the apps I use to edit ProRAW files, has the ability to adjust the strength of the Local Tone Mapping, so one would think if you moved the LTM slider all the way down to the point where it should be turned off completely, the resulting RAW file would look the same as the photo taken in straight RAW. This doesn’t seem to be the case. I ran this issue by a friend who has a pretty good understanding of how camera sensors work and he says there is still some amount of processing in the ProRAW file even with the LTM turned off.
This is why some of the sky detail is still present, but I found it odd that the image was darker overall. Again, I’m not going to get into the technology behind all of this, but it’s interesting to see this happen.
Since a ProRAW file is in a DNG container, you have the processing flexibility that comes with such a file, whereby you can adjust the exposure properly, not just by increasing or decreasing the brightness, but literally setting the exposure like you would if you adjusted the EV before the shot. You can also adjust the white balance, which is setting the colour temperature of the raw data on a pixel-by-pixel basis, not just changing the hue of the processed image. Doing this sort of processing with the LTM intact, I think, is a much faster way to get an edit done, and, since it’s a DNG, editing is non-destructive: when using a third-party editing app, you can export the edited file as a JPG, HEIF, or TIFF, depending on your preference and what’s available as an export format. When you edit a ProRAW file in Apple’s Photos app, it just saves it as the original file, not a new and separate image, but as you probably know, you can always “revert” a photo to its original state in Photos as long as it’s been edited there.
When Apple worked with Adobe to come up with the ProRAW format, their whole intention was to give photographers a head start with the editing process, and they had “more advanced” users in mind, because ProRAW is only available on the iPhone 12, 13, and 14 Pro models as of this writing. The proof is in the pudding with my sample images here. The ProRAW file really doesn’t need to be touched, although I could tweak it a little if I wanted to warm it up to make it look like it was taken during Golden Hour, or I could add a bit of sharpening, but I think it’s fine the way it is.
This example is somewhat extreme with regard to the range of lighting conditions. I’ve turned off the LTM in other photos where the exposure latitude is much better handled by the sensor and quite frankly, you really can’t see any difference. In a case like that, the plus side is that I have a RAW file to work with. For me, since I’ve been shooting RAW, even with my DSLR prior to selling it, I’ve just become accustomed to shooting in that format. I’m almost afraid not to in case I make a bad call on the camera settings before taking the photo.
This whole concept, or process, won’t be for everyone, that’s for sure, but it’s a way of life for me, photographically.