As smartphones continue to evolve, a particular area always comes into focus — and that’s the camera. Apple’s new iPhone 13 Mini, 13, 13 Pro and 13 Pro Max didn’t change much with the design or features, but they did push forward in the imaging space.
It wasn’t a radical switch in the camera hardware or the addition of a new lens or set of lenses. So where do the improvements come from?
The big change comes from all-new sensors that capture more light, bringing out details and improving overall image quality. That’s coupled with some updated hardware, sticking with what has been working — and a number of improvements on the software side. It sets the iPhone 13 apart from competitors in that it can capture a scene more accurately than other devices on the market. Apple’s hardware lets more light in and captures more detail, and the software intelligently interprets the scene.
You can read our full reviews of the phones — including the iPhone 13, which we’ve named the best smartphone overall — but now it’s time to take a closer look at the camera.
Jacob Krol/CNN

When it came to testing the iPhone 13 and 13 Pro, we took photos of ourselves — selfies with good or bad hair — along with our families, friends, pets, landscapes, plants and random things throughout our day. The goal was to test the phones the same way we use them daily.
Both the iPhone 13 and 13 Pro now feature the largest sensors ever in an iPhone. Apple promised improved low-light performance with this camera, and in our testing we saw just that on both phones. Compared to last year’s iPhone 12, the 13 took clearer images with less noise. Noise typically creeps in when the camera hardware can’t capture enough information; the larger sensor here (paired with a wider aperture on the 13 Pro) lets more light be captured.
For a nighttime shot at a pumpkin patch with minimal lighting, the iPhone opted to shoot in Night mode. With this, the device takes a series of shots at varying exposure lengths and intelligently fuses them. That’s combined with the standard Smart HDR processing, which picks apart a scene to deliver accurate colors and proper lighting for its different parts. It’s Apple’s take on computational photography, which essentially means using a whole lot of information to craft an image, and it gets better at identifying elements in a shot year over year. If you look back to an iPhone 11 or older, you’ll see some large-scale improvements. The result is a really nice image, with accurate colors in the ground and plants.
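For readers curious how multi-exposure fusion works in general, here’s a toy Python sketch of the idea — not Apple’s actual pipeline, which is proprietary and far more sophisticated. The function names and the mid-gray weighting scheme are our own illustrative assumptions: each frame’s pixel is weighted by how well exposed it is, then the frames are blended.

```python
import math

def well_exposedness(value, sigma=0.2):
    # Weight peaks at mid-gray (0.5) and falls toward zero near pure black or white.
    return math.exp(-((value - 0.5) ** 2) / (2 * sigma ** 2))

def fuse_exposures(pixel_values):
    # Blend one pixel's values from several exposures, favoring well-exposed frames.
    weights = [well_exposedness(v) for v in pixel_values]
    return sum(w * v for w, v in zip(weights, pixel_values)) / sum(weights)

# The same pixel captured at three exposure lengths (brightness in [0, 1]):
fused = fuse_exposures([0.1, 0.5, 0.95])  # under-, well- and overexposed
```

The well-exposed middle frame dominates the blend, so the fused value lands near mid-gray while the dark and blown-out frames contribute little — which is, loosely, why stacked Night mode shots keep shadow detail without blowing out highlights.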
“Long before you even hit the shutter, you just bring the camera up, we’re looking at auto exposure, white balance, autofocus to make sure that we’re getting all of the right information, raw information captured,” Jon McCormack, VP of camera software engineering at Apple, tells us. Essentially, this is Apple’s version of an auto shooting mode — the cameras, the processor and the software work together to pick the right settings for your shot. For example, on a dark night it might slow the shutter speed to let more light into the shot, or correct focus if there’s a lot of movement in the scene.
The iPhone’s advanced software smarts also let it switch between modes automatically — like Macro mode on the iPhone 13 Pro kicking in as you get closer to an object, or Night mode engaging on its own, a behavior that dates back to the iPhone 11.
As one might expect, a lot of the focus in our testing was on the performance of the iPhone 13 and 13 Pro series. We quickly focused on testing in a real-world way — as we noted above — with selfies, photos of our pets, shots out and about at pandemic-safe events and more demanding scenes that incorporate multiple light sources. Since both phones feature a larger sensor, we wanted to dig into how that impacted images and video. Was there a noticeable difference? Did shots offer more detail and improved lighting?
In side-by-side examples, the 13 did offer crisper images than the 12 — in some cases noticeable even to an untrained eye. Ultimately, it’s a step up that improves image quality without changing how you take a shot. The 13 Pro and Pro Max showed more direct improvements over their predecessors: more detail even when zooming in, dramatically better photos in low light or outright darkness, improvements to core modes and more natural portraits.
“It’s much more than that [low-light performance] because the larger pixels allow the sensor to capture more rich detail and reduce noise,” says Graham Townsend, VP of camera hardware engineering for Apple. On the iPhone 13, the larger sensors are packed into a physically larger camera module, with a wider aperture to let more light in and a bigger sensor to capture it. That also means more photosites to collect the light and translate it into a usable image. The larger sensor was a key part of designing the device, and one that Townsend’s team worked to integrate.
The iPhone 13’s main camera gathers more than 50% more light than the 12’s and adds sensor-shift stabilization (which we’ll unpack below), while the 13 Pro sees a 2.2x improvement in light gathering year over year.
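As a rough sanity check on that 2.2x figure: light gathered per pixel scales with pixel area and inversely with the square of the f-number. Plugging Apple’s published wide-camera specs (f/1.6 with 1.4-micron pixels on the 12 Pro, f/1.5 with 1.9-micron pixels on the 13 Pro) into a back-of-envelope Python model lands close to the quoted number. This simplified model is our own, not Apple’s math:

```python
def relative_light(f_number, pixel_size_um):
    # Toy model: light per pixel ~ pixel area / f-number squared.
    return (pixel_size_um ** 2) / (f_number ** 2)

# Wide cameras: iPhone 12 Pro (f/1.6, 1.4um) vs. iPhone 13 Pro (f/1.5, 1.9um)
gain = relative_light(1.5, 1.9) / relative_light(1.6, 1.4)
# works out to roughly 2.1x, in the ballpark of Apple's quoted 2.2x
```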
This comes together to give the software more information to adjust the image properly. It’s the explanation behind the more accurate colors and lighting sequences that we encounter in our tests.
Previous iPhones weren’t slouches, specifically the 11 and the 12, but properly reading a scene or letting enough light in could be a struggle. In a shot of a bright sun pushing through clouds with several rays emerging, the light could easily skew the colors of the orchard and the objects in it (trees, blades of grass, apples and pumpkins). The iPhone 13 Pro tackled the image without overexposure or blown highlights while presenting colors accurately.
The other big bonus with the main camera across the 13 Mini, 13, 13 Pro and 13 Pro Max is image stabilization, which proved to be a big addition. Our hands shake and we tend to move when capturing a shot, and those little movements can have big impacts on the photos we take. Townsend’s team opted to bring sensor-shift optical image stabilization, which first premiered on the 12 Pro Max, to the main camera across the lineup. Essentially, the sensor moves in the opposite direction of the phone’s movement, keeping the shot steady so the exposure can last longer.
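The idea is simple to sketch. In this toy Python model (our own illustration, not Apple’s firmware, and the idealized 1:1 compensation is an assumption), the sensor counter-moves by the gyroscope-measured displacement so the scene’s position on the sensor stays put during the exposure:

```python
def sensor_shift(gyro_displacement):
    # Counter-move the sensor by the measured phone motion (idealized 1:1 model).
    return [-d for d in gyro_displacement]

def image_offset(phone_motion, shift):
    # Where the scene lands on the sensor: phone motion plus corrective shift.
    return [m + s for m, s in zip(phone_motion, shift)]

# A shaky hand nudges the phone mid-exposure...
shake = [0.3, -0.1]
# ...but the stabilized image stays centered:
offset = image_offset(shake, sensor_shift(shake))  # -> [0.0, 0.0]
```

In the real system the correction runs continuously during the exposure, which is why each frame can afford a longer, cleaner exposure without smearing.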
Townsend notes that “every exposure becomes shorter, which reduces generically subject motion blur for both stills and video.” Considering the iPhone 12 didn’t feature this, freehand side-by-side shots look much better on the 13. And those upgrading from an iPhone 8, X or even an 11 will see a large impact here. Even particularly rough shots avoid a blur effect, and paired with the larger sensor this produces a detail-filled image, which is especially evident next to older models and competing phones with smaller sensors and no stabilization.
The iPhone 13 Pro and 13 Pro Max specifically feature the best camera system we’ve tested on a phone.