I wondered to what extent camera apps create different images of the exact same composition when using my iPhone X camera hardware. Do they all call the same hardware functions and therefore produce the exact same image? Or do app-specific incantations voodoo the image into existence, creating variance in the rendered images of the same composition? Unsure, I decided to conduct an experiment and write this post: iOS Camera Apps and Lens Comparisons.
The Camera Apps and Location Used
The four camera apps I compared were: the Apple iPhone Camera app itself, the overhauled and therefore new Moment camera app v3, Adobe’s camera app that lives inside iOS Lightroom CC, and the Halide camera app. For a location, I chose our indoor den area which is well-lit with large windows bringing in vast amounts of indirect outside light. I stepped back into the stairway to add both depth and a lot of indirect light from above (down the stairs). This setting also afforded an area behind and under the sofa (just left of center) that was heavily shadowed.
I took great care not to move the tripod or the phone in its mount[1] on the tripod. I wanted each photo framed as identically as possible. The only variance I wanted to see would be the result of the lens, the app, and a tiny amount of outdoor light fluctuation[2].
The Lens Combinations Used
And just to spice things up, I used the following lens combinations with each app:
- iPhone X’s wide lens only
- iPhone X’s tele lens only
- iPhone X’s wide lens with the Moment Wide (18mm) Lens
- iPhone X’s tele lens with the Moment Wide (18mm) Lens
- iPhone X’s wide lens with the Moment Tele (60mm) Lens
- iPhone X’s tele lens with the Moment Tele (60mm) Lens
- iPhone X’s wide lens with the Moment SuperFish (170°) Lens
- iPhone X’s tele lens with the Moment SuperFish (170°) Lens
This means that I took eight photos with each app—a total of 32 photos.
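The shot matrix above is easy to sanity-check in code. Here is a minimal Python sketch (the app and lens labels are just illustrative strings, not anything from the apps themselves):

```python
from itertools import product

apps = ["Apple Camera", "Moment v3", "Lightroom CC", "Halide"]
phone_lenses = ["wide", "tele"]
# None means the bare iPhone lens; the strings are the attached Moment glass
moment_lenses = [None, "Moment Wide", "Moment Tele", "Moment SuperFish"]

# Each iPhone lens, bare and under each of the three Moment lenses
combos = [(p, m) for m in moment_lenses for p in phone_lenses]
shots = list(product(apps, combos))

print(len(combos))  # 8 lens combinations per app
print(len(shots))   # 32 photos in total
```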
I shot all images in the RAW camera format in each app; the one exception was Apple’s Camera app, which saved HEIC and JPEG files instead. All RAW (DNG) images were converted into full-size, full-quality JPEGs for on-screen comparison purposes. No post-processing was applied to any image. All of the images presented in this post are reduced in size with medium JPEG compression (a 45% JPEG quality setting).
What does that mean? Well, you’re not seeing exactly what came out of the camera. So, I guess you have to sort of trust my untrained eye as I describe what I saw in the original files.
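For the curious, the reduction step amounts to something like the following Python sketch using Pillow. The noise image is a stand-in for a frame already decoded from a DNG (in practice a library such as rawpy, or Lightroom’s own export, would supply the real pixels); the 45% quality figure is the one used for this post.

```python
import io
from PIL import Image

# Stand-in for a full-size frame decoded from a RAW (DNG) file
full = Image.effect_noise((1024, 768), 64).convert("RGB")

def to_jpeg(img, quality):
    """Encode an image as JPEG in memory and return the bytes."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    return buf.getvalue()

full_quality = to_jpeg(full, 95)        # full-size, full-quality export
reduced = to_jpeg(full.reduce(2), 45)   # half dimensions, 45% quality

print(len(full_quality), len(reduced))  # the reduced file is far smaller
```

The trade-off is exactly the one described above: the posted images are much lighter to serve, but they are not what came out of the camera.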
I let each app auto focus, auto adjust brightness, and auto adjust white balance. I also touched the same place (the black portion of the fireplace in the center of the images) for the focus point in every photo.
The biggest takeaway is that each image is definitely different. Each app apparently creates its own image from the raw camera sensor data, which makes sense. All of the apps produced satisfactory images.
I was careful not to move the tripod between pictures[3]; however, when I placed the images side by side in an image-editing app, they were not in exactly the same position. Interestingly, some images shot with different apps but the exact same lens combination showed slightly different visual distortion, as if the software applied a subtle lens-profile correction, even though the software knew nothing about the lens combination beyond which iPhone X lens (wide or tele) was selected for the shot.
Apple’s Camera App
I could not get Apple’s Camera app to use the telephoto lens on the phone, even though it zoomed as if it did. I say this because when I placed a Moment lens over the telephoto lens, the screen would go almost entirely black: despite my tapping the 2x indicator, the app was still using the wide lens, which was mostly blocked by the Moment lens mounted over the tele lens.
Apple’s camera app was the only app that had this issue. Maybe I should have rebooted the phone or killed Apple’s Camera app and restarted it? I don’t think this was caused by any settings for Apple’s Camera app.
The Moment Wide 18 and the Moment SuperFish 170 lenses did an impressive job. Clean, rather crisp, clear. I love them!
However, when using the Moment Tele 60 lens, my shots lacked focus precision in the peripheral areas of the images (top/bottom left and top/bottom right sides), almost a tubular-type blur away from the image’s center. This was more noticeable when the Moment lens was mounted over the camera’s wide lens but can also be seen when mounted over the camera’s tele lens. The issue persisted across all of the camera apps I tested. (You can distinctly see this in the four sample images below: the two Moment Tele 60 on Camera Tele Lens shots, and especially the two Moment Tele 60 on Camera Wide Lens shots.)
I got out the lens blower, used my little lens brush followed by the trusted microfiber cleaning cloth and cleaned both sides of the lens very thoroughly. Sadly, the issue persists. Since the lens looks perfectly clean on visual inspection, I am concerned the lens has some unacceptable level of distortion in the glass. I’ll reach out to Moment to see if this is how the lens is expected to perform or if I just have a dud.
Update: From the Moment Website…
“Why is my Tele lens so soft around the edges?
“The Tele is a great portrait lens, but we are starting to hit its limits on some of the new phones with larger sensor sizes. Because it is a portrait lens, you do see some roll off on the edges and corners, along with some slight vignetting depending on the device. However, the image should stay nice and sharp in the middle for awesome portraits. Check out this article that we made on the Momentist for Tele Lens Tips for Shooting Portraits, or if you are unhappy with the quality, you can return the lens using the instructions on our Shipping and Returns page.”
Adobe camera app in Lightroom CC
The Adobe camera app inside Lightroom CC produced noticeably more visual grain than any other camera app, in both the dark and light areas of the image. It’s very pronounced; the other apps did not generate that level of grain. I was surprised by this, because the Adobe camera app has been my go-to iOS camera app. I frequently feel compelled to use the de-noise tool in the Lightroom develop module to mitigate the grain.
I personally can’t see the grain in the small thumbnails below. To see it, click on each image; the enlarged version is only half the original image’s size, yet the grain is very noticeable. Compare the sofa (shadows) and the glass bannister (highlights).
Interestingly, when you enlarge the images to 400%, the Lightroom image has far less visual noise and fewer artifacts than the images from the other apps. Compare these images of the back of the sofa; the noise was uniform across each individual photo. (The dark line across the photos is the bottom of the back of the sofa.)
So it appears the Lightroom camera app is doing a better job here: instead of gross artifacts, it renders the noise as a much more palatable image grain.
The Halide Camera App
While all of the apps showed subtle tonal variations in the original images, the Halide app had the most notable differences. Its image also seemed, to my eye, ever so slightly softer. The fine texture of the sofa fabric was less defined, though nothing a slight clarity adjustment wouldn’t change if you so desired.
I’m not saying this was good or bad. It was just a noticeable difference.
Some of The Images
This first image is the exact same shot with each app: the iPhone wide angle lens without any Moment lens attached. The image is split into four parts to show the general results produced by each camera. (It’s the best idea I could come up with.) Click the image to see the jpeg image at half the original size.
I’m personally battling over which app I like the most from a general usability and image quality standpoint. This is purely a subjective choice. As I mentioned, I have gravitated to the Adobe camera app in Lightroom CC. I love the clean, functional UI and the haptic feedback when aligning an image in the viewfinder. (Those tiny guidelines are great until the sunlight is so bright I can’t see them at all while wearing polarized sunnies. Then, the haptic feedback is essential.)
Adobe, where’s a live histogram?
In the image below, you can see the additional tools available when you touch the ellipsis in the bottom right of the image. (I don’t like icons over the viewfinder image.) Also, in the screenshot below, notice the image is perfectly centered horizontally (the yellow line across the image) and vertically (the yellow tack in the center of the rounded box in the very center).
But I also really like the recently released Moment v3 camera app. Its UI layout is also clean, though a bit more limited, and I don’t like the wide/tele lens selector overlaid on the viewfinder image at the inside bottom right. I wish they would put that indicator outside the image I’m capturing.
What I love most about the Moment app is the extensive use of haptic feedback on the sliders (ISO, Shutter, etc.) making them feel very natural and precise. I wish Adobe used this as well! And I also love their “3D shutter” which mimics the behavior of the shutter button on my DSLR: “half touch” to focus, full press to take the shot.
If Moment would add lens profile corrections to their lens selector (that circle under the file format selector and by the histogram) I would be even more thrilled with their amazing app.
Compare The Moment App and the Adobe App Lens Combinations Shots
So below I’m including the eight images shot by the Adobe camera app inside Lightroom CC (left column) and the Moment v3 camera app (right column). Seeing each image at half its original size makes a huge difference in visual quality, so I encourage you to click/touch each image.
Lens Combination Field of View Comparisons
Remember, the only thing that changed between these shots was the combination of lenses used. The camera was mounted to a tripod that never moved.
The images on the left were shot with the camera’s wide lens alone or with a Moment lens over the camera’s wide lens. The images on the right were shot with the camera’s tele lens alone or with a Moment lens over the camera’s tele lens.
I love how far the iOS photo ecosystem has evolved. I can’t wait for full-sized sensors, faster processors, and the ability to shoot 8K video at 240fps. But that’s years away.
In the meantime, I love the Lightroom camera app, the Moment camera app, and the Moment lenses. I can’t wait for the Moment anamorphic lens to hit the streets. I also have to give a shoutout to the FreeFly Systems people for the Mōvi Personal Cinema Robot—awesome image stabilization and some really cool shoot scenarios programmed into the robot.