I have been enjoying an experiment that has me using the iPhone 13 Pro as my “main” camera for all of 2022. For the most part, I have been super impressed with the results. But there are times, when using the native iPhone Camera app that ships with the phone, that the results have been mixed.
What I have learned already is that there are many third-party camera apps that fill in the gaps, whether you are a stills shooter or a video shooter. But then again, there are times when, believe it or not, you will be just fine sticking with the native iPhone Camera app.
The third-party apps I rely on
Speaking of filling in the gaps: I use many third-party camera apps on my iPhone. Some of my favorites are:
- ProCamera: $12.99 (+$3.99 for HDR)
- FiLMiC Pro: $14.99
- Firstlight: $7.99
- DoubleTake: Free
- Spectre Camera: $3.99
- Beastcam: $4.99
- Pro Camera by Moment: $6.99
- Hyperlapse: Free
I rely on third-party apps for 90% of my video, but probably only 65% of my still photos. There are others I use, but I find myself reaching for these most often.
Each of these apps has specific value in specific situations. It would be much easier if I could just use one app and know it handles everything well, but I cannot find that app and doubt it will ever exist because everyone is different. What I think looks good in HDR may be different from what you think looks good in HDR. And that’s part of the problem all app developers must face, including Apple. Only it’s MUCH harder for Apple because so many people will just rely on their app alone — since it comes free with the phone.
Apple has to try to make that app work for the 89-year-old great-grandpa who just wants pictures of the great-grandkids, and for photo nerds like me who will look for detail in both the shadows and highlights, critical focus, control over depth of field, etc. Finding a balance between the two is no easy task.
The main thing to remember about the iPhone and the native iPhone Camera app is this. The vast majority of iPhone users who take pictures with their phones just want the pictures to look good. That’s it. That’s all they care about. They aren’t interested in the how. They just want a great result that they can share with family, friends, etc.
With THAT goal in mind, the native iPhone Camera app delivers in spades. It is reliable. In almost ALL cases, it just works. And it works well. So there’s no shame at all in using JUST the native iPhone Camera app for your video and your photography.
But for some of us …
The problem with Apple’s native Camera app
The native iPhone Camera app is both brilliant and maddening all at the same time. And the reason is simple. Most of the processing in Apple’s app happens in the background. It is automatic. You never see the process — just the results.
Take HDR, for example. When Apple introduced HDR as a feature in their native app, you could turn it on and off. Now, you cannot. Apple just decides when you need HDR and takes care of all the backend processing for you. And many times, it does a simply brilliant job of it. However, there are times when I would prefer to do it the old-fashioned way, but that requires a third-party app.
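If you're curious what “the old-fashioned way” can look like under the hood, here's a minimal Swift sketch using AVFoundation's bracketed-capture API, which is the kind of control third-party apps build on: it requests a -2/0/+2 EV bracket and leaves the merging to you. The class name and the EV values are just my illustration, and to be clear, this is not how Apple's own Smart HDR works internally; nobody outside Apple knows that.

```swift
import AVFoundation

// A minimal sketch of "old-fashioned" HDR: ask AVFoundation for an exposure
// bracket (-2, 0, +2 EV) and merge the frames yourself later. This is the kind
// of control third-party apps expose; it is NOT Apple's own Smart HDR pipeline,
// which remains undocumented.
final class BracketCapture: NSObject, AVCapturePhotoCaptureDelegate {
    // Assumed to be attached to an already-configured AVCaptureSession.
    let photoOutput = AVCapturePhotoOutput()

    func captureExposureBracket() {
        // One bracketed still per exposure bias, delivered in three callbacks.
        let biases: [Float] = [-2.0, 0.0, 2.0]
        let bracketedSettings = biases.map {
            AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: $0)
        }
        let settings = AVCapturePhotoBracketSettings(
            rawPixelFormatType: 0,   // 0 = no RAW, processed output only
            processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
            bracketedSettings: bracketedSettings
        )
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Called once per frame in the bracket; collect the exposures here and
    // feed them to your own merge/tone-mapping step.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        // ... append `data` to an array and merge after the final callback ...
        _ = data
    }
}
```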
Back to another HDR example. Every time you take a picture using the latest version of the built-in iPhone Camera app, Apple is making hundreds of thousands of decisions per second (on your behalf) about how best to approach your image. In situations where Apple thinks you need HDR, it merges many shots (even though you pressed the “shutter” button only one time) and uses some sort of computational magic to produce the final result.
And I have to admit, it often produces results that would have taken me much longer to do myself and I am often super happy with those results.
This applies to almost anything that I do on the iPhone — not just HDR. Today, the iPhone camera software essentially always “edits” your photos for you. The problem is, none of us know exactly HOW it does the edits. Apple isn’t the kind of company that likes to open the kimono!
The maddening automation with Apple
Apple keeps the processing methodology a secret, but the savvy user can deduce how SOME of the features work. When it comes to portraits, we know it brightens faces while retaining natural skin texture. When it comes to landscape shots, the iPhone knows to smooth the sky and punch up the color and clarity in the scenery. In low light, it applies automatic noise reduction while keeping the details of the main subject intact. Apple’s Night Mode just magically appears when the light gets low, and it does an amazing job of getting things just right. It’s all top-notch engineering, pulled off behind the scenes and all thanks to in-house chips optimized for these processes.
I will note that the new macro feature that comes with the latest phones and iOS originally worked in a fully automatic fashion. In this one case, Apple listened to customer feedback and, in the most recent update to iOS, provided a way to turn automatic macro switching off manually.
There are other automatic features in the iPhone’s native app that can drive me crazy, such as auto sharpening and dynamic tone mapping. Sometimes those features work well, but there are enough times when the native app gets things wrong, and when it does, it’s pretty easy to notice.
What defines a “good” photo?
Perhaps if enough people complain about wanting more control over a certain automatic feature, Apple will listen. But it’s not likely, because the whole ecosystem is built around their AI making everything look good, no matter the situation. The problem is (as I stated earlier in this article) that it’s hard to define what everyone thinks “good” looks like.
We know the vast, vast majority of iPhone camera users just want great results when they press the shutter button. Because of this undeniable fact, Apple rightly designed an application that would benefit MOST users.
Professionals, aspiring professionals, and even advanced amateurs may want more say in this process, and that’s where the third-party apps come in. Although I will admit to sometimes being lazy and just using the standard camera app to make an image because, well, it’s easy. I could accomplish anything that the automatic processing accomplishes in post. But that takes experience, access to professional-grade software, and time.
And one thing that always surprises amateurs is this: True professionals often just want quick and easy. So what we want isn’t that far from what newbies want. It’s just for different reasons.
But there are some features that a professional might need (like false color, zebras or waveforms) when shooting video that are simply not available on the native iPhone Camera app. This goes beyond not being able to make simple adjustments to existing features. This involves new features that Apple didn’t see fit to add to their app, but that third-party developers think we might find useful. In those cases, obtaining a third-party iPhone camera app makes absolute sense.
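Tools like zebras aren't magic, which is exactly why third-party developers can offer them: they're built by analyzing the video frames the capture API already hands over. Here's a rough, hypothetical sketch (not any particular app's code) that estimates how much of a frame's luma is near clipping; the class name, the 235 threshold, and the sampling stride are my own assumptions.

```swift
import AVFoundation

// Rough illustration of how a zebra/clipping indicator can be built from the
// frames AVFoundation delivers. Assumes the video data output is configured
// for kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, so plane 0 is 8-bit
// luma. The 235 threshold and the sampling stride are arbitrary choices.
final class ZebraAnalyzer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0) else { return }
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
        let luma = base.assumingMemoryBound(to: UInt8.self)

        // Sample every 4th pixel and count luma values near clipping.
        var hot = 0
        var sampled = 0
        for y in stride(from: 0, to: height, by: 4) {
            for x in stride(from: 0, to: width, by: 4) {
                if luma[y * bytesPerRow + x] >= 235 { hot += 1 }
                sampled += 1
            }
        }
        let percentNearClipping = 100.0 * Double(hot) / Double(max(sampled, 1))
        // A real app would draw stripes over those regions; here we just report.
        print(String(format: "~%.1f%% of the frame is near clipping", percentNearClipping))
    }
}
```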
Apple helps to define the new era of digital photography with their app
Even the third-party apps take advantage of at least SOME of the iPhone’s computational software for photos and videos. Nobody knows how much of the API has been opened up to the third-party app makers, but I assume most of the important stuff is available to them. They can put their own spin on it and develop tools that serious amateur and professional photographers will want to use because they want full control.
But it’s important to note that SOME of the secret sauce in the phone’s special hardware and firmware remains SECRET. For example, there are times when I use a third-party app to turn off noise reduction or try to adjust it so it isn’t so heavy handed and it just doesn’t always work. It would be nice if Apple just let all the developers have access to EVERYTHING, but that’s not Apple’s way. It’s a company that is better at keeping secrets than the CIA and I don’t expect that to change.
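To give a concrete sense of where that line sits, here's a small, hypothetical Swift sketch of two knobs Apple does expose to developers: the photo quality prioritization setting, and RAW/ProRAW capture, which is the usual way to sidestep baked-in noise reduction and handle it yourself in post. The function name is mine, it assumes an already-configured photo output, and exactly what the prioritization levels change internally isn't documented.

```swift
import AVFoundation

// Two of the knobs Apple does expose to third-party apps, sketched under the
// assumption of an already-configured session and photo output. Dialing the
// quality prioritization down reduces (but does not document or fully disable)
// the computational processing; capturing RAW/ProRAW is the usual way to
// sidestep baked-in noise reduction and handle it yourself in post.
func makeMinimallyProcessedSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    // Prefer Apple ProRAW when the device/output supports it.
    if photoOutput.isAppleProRAWSupported {
        photoOutput.isAppleProRAWEnabled = true
    }

    // Use a RAW pixel format if one is available, else fall back to a
    // processed capture.
    let settings: AVCapturePhotoSettings
    if let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first {
        settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    } else {
        settings = AVCapturePhotoSettings()
    }

    // .speed asks for the least aggressive processing the pipeline allows;
    // what that actually changes internally is Apple's secret.
    settings.photoQualityPrioritization = .speed
    return settings
}
```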
Just know that there are plenty of times when the native camera app will do a spectacular job. Time-lapse (if you don’t need to adjust the exposure duration) and Night Mode come to mind. Also, the new Cinematic mode is a real mind-blower. Portrait mode can be fun and can be edited post-capture (on the phone) if you want to play around with it to get the results you think look best.
And with video features like Cinematic mode, if you work in a fully Apple-based ecosystem, shooting on the iPhone and editing on a Mac with Final Cut Pro, you can even adjust this stuff in post. I still cannot believe how well it works.
For me, I primarily use third-party apps but certainly not exclusively. If you’re interested in experimenting with the deeper capabilities of your iPhone camera, take a look at the apps I list above. There are many more available and maybe even some that you will like better than the ones I like. That’s the beauty of this new era of digital photography. There are nearly infinite choices and there’s no doubt something there for everyone.
Have fun — no matter what app you use. And don’t let anyone make fun of you if your choice is to just stick with the iPhone’s native Camera app. It’s pretty amazing, warts and all. The coolest thing I can say about the native app is this: If you know what you’re doing, you can get really amazing results out of the device. Even if you DO NOT know what you’re doing, you can often still get amazing results.
I think that means Apple has done a pretty darn good job on their app.