Updated April 20: article originally posted April 18.

It may be an expected upgrade, but the next iPhone is set to push the smartphone camera further than Apple’s previous efforts, pairing new hardware with an ambitious software overhaul. Now we have more details on what Tim Cook and his team have planned.

Apple is expected to use Sony’s latest camera sensor technology. Thanks to its stacked design, which splits the photodiodes and pixel transistors into two distinct layers, more light can be captured per pixel; Sony claims up to twice as much light as its current sensors.

In previous years, Apple has reserved its cutting-edge camera technology for the largest model. If that pattern holds, the new sensor and its improved low-light capability may be exclusive to the iPhone 16 Pro Max, which at the very least allows Apple to make a case for the most expensive iPhone beyond “the screen is a bit bigger.”

Apple will also increase the optical zoom available on the higher-tier iPhone 16 Pro and iPhone 16 Pro Max handsets. This will be facilitated by a tetraprism lens, which, much like a periscope, uses prisms to fold the path of the light, creating a longer lens than the smartphone’s depth would otherwise allow.
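The principle behind the folded design is simple arithmetic: each prism reflection turns the light sideways, so the optical path is the sum of several short segments rather than a single straight run through the phone’s thickness. The sketch below illustrates this with made-up segment lengths and module depth; these are not Apple’s actual optical specifications.

```python
# Illustrative only: hypothetical path segments (in mm) between prism folds
# in a periscope-style camera module. The numbers are invented for the example.
segments_mm = [4.0, 9.0, 9.0, 4.0]   # light travels sideways between folds
module_depth_mm = 9.0                # hypothetical thickness of the camera module

total_path_mm = sum(segments_mm)
print(total_path_mm)  # 26.0 -> far longer than the 9 mm module is deep
```

The payoff is that a focal length requiring tens of millimetres of optical path can fit inside a module only a few millimetres deep, which is what makes 5x-class optical zoom possible in a phone body.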

The iPhone 15 Pro Max already offers 5x optical zoom using such a system, while the iPhone 15 Pro tops out at 3x with a conventional telephoto lens. Both the iPhone 16 Pro and 16 Pro Max are expected to reach 5x optical zoom, with 25x digital zoom as an option when taking photos.

Update, Saturday April 20: Apple is pushing the envelope with Sony’s new stacked sensor, but the quest for more megapixels and more accurate image sensors continues.

Adam Juniper reports on Apple’s plans for the iPhone 17’s camera, with a headline specification of 144 megapixels in total: three cameras, each sporting a 48-megapixel sensor. The iPhone 16 family should bring a 48-megapixel main camera across the range, while the iPhone 17 Pro would extend that 48-megapixel technology to the telephoto and ultrawide cameras as well.

This doesn’t necessarily mean Apple will default to 48-megapixel images. The extra information allows for advanced techniques already in use, such as pixel binning (combining four adjacent sensor pixels into one single but more accurate pixel), as well as upcoming AI-based techniques Apple will no doubt debut at its Worldwide Developers Conference in June 2024.
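Pixel binning as described above can be sketched in a few lines: average each 2x2 block of the sensor readout into one output pixel, quartering the resolution but reducing noise per pixel. This is a minimal single-channel illustration, not Apple’s actual image pipeline.

```python
import numpy as np

def bin_pixels(sensor: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average each factor x factor block of a single-channel sensor readout."""
    h, w = sensor.shape
    h -= h % factor  # trim edges that don't divide evenly
    w -= w % factor
    blocks = sensor[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Tiny 4x4 mock readout; a real 48-megapixel readout binned 2x2 this way
# would yield a 12-megapixel image.
raw = np.arange(16, dtype=float).reshape(4, 4)
print(bin_pixels(raw))
# [[ 2.5  4.5]
#  [10.5 12.5]]
```

Averaging four readings suppresses random sensor noise in each output pixel, which is why binned 12-megapixel shots can look cleaner in low light than the full-resolution capture.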

Many competing Android smartphones boast 100-megapixel cameras. Apple may not match that this year, but it will have the option for future iPhones.

In terms of software, Apple will finally join the AI revolution with the iPhone 16 family, bringing a raft of AI-infused features to improve how images are captured, processed and edited by tens of millions of consumers. While AI routines have been part of Apple’s camera suite in previous years, the recent push by Android manufacturers (particularly Samsung and Google) to brand their handsets as AI-powered phones has left the iPhone behind.

The iPhone 16 family will be Tim Cook’s first chance to sell hardware with a particular focus on AI, and you can expect the visual difference it makes to photos to be both an easy sell and a powerful on-stage demonstration when the next iPhone launches in September.

Before then, we’ll get our first look at Apple’s AI efforts and hints at what’s to come at June’s Worldwide Developer Conference.

Now read about Apple’s research paper that details how AI will be able to read your screen and help you navigate your iPhone…
