In his latest video, MKBHD eloquently summarized and explained something I’ve personally felt for the past few years: pictures taken on modern iPhones often look sort of washed out and samey, as if much of the contrast and many of the highlights from real life were lost somewhere along the way during HDR processing, Deep Fusion, or whatever Apple is calling its photography engine these days. From the video (which I’m embedding below), here’s the part where Marques notes how the iPhone completely ignored a light source pointing at one side of his face:

Look at how they completely removed the shadow from half of my face. I am clearly being lit from a source that’s to the side of me, and that’s part of reality. But in the iPhone’s reality you cannot tell, at least from my face, where the light is coming from. Every once in a while you get weird stuff like this, and it all comes back to the fact that it’s software making choices.

That’s precisely the issue here. The iPhone’s camera hardware is outstanding, but the way iOS interprets and remixes the data coming off the camera sensor often leads to results that I find…boring and uninspired unless I manually touch them up with edits and effects. I like how Brendon Bigley put it:

Over time though, it’s become more and more evident that the software side of iOS has been mangling what should be great images taken with a great sensor and superbly crafted lenses. To be clear: The RAW files produced by this system in apps like Halide are stunning. But there’s something lost in translation when it comes to the stock Camera app and the ways in which it handles images from everyday use.

Don’t miss the comparison shots between the Pixel 7 Pro and iPhone 14 Pro in MKBHD’s video. As an experiment for the next few weeks, I’m going to try what Brendon suggested and use the Rich Contrast Photographic Style on my iPhone 14 Pro Max.