A Detailed Look at the Apple iPhone 14 and iPhone 14 Pro Cameras
At its ‘Far Out’ launch event Wednesday, Apple unveiled two of its newest iPhone models, the iPhone 14 and 14 Plus, as well as its higher-end iPhone models, the iPhone 14 Pro and iPhone 14 Pro Max.
The wide and ultrawide modules from the iPhone 13 Pro and iPhone 13 Pro Max serve as the basis for the camera systems of the iPhone 14 and 14 Plus. The primary camera uses a 12MP image-stabilized sensor with 1.9μm pixels and an f/1.5 aperture. The front-facing camera pairs a 12MP sensor with an f/1.9 lens that now supports autofocus, an improvement over the previous model. Photonic Engine enhances photo performance in mid- to low-light conditions across all cameras: up to 2x on the Ultra Wide camera, 2x on the TrueDepth camera, and 2.5x on the new Main camera.
According to Apple, Photonic Engine applies the computational benefits of Deep Fusion earlier in the imaging process to deliver greater detail, preserve subtle textures, improve color, and retain more information in a shot.
Additionally, there is Action Mode, a video stabilization setting that more effectively compensates for camera movement while shooting.
Here are some sample photos captured on the new iPhone 14 and iPhone 14 Plus:
The Pro models feature a new three-camera array headlined by a 48-megapixel Main camera. Its sensor, which resembles Samsung's GM2 in many respects and is 65 percent larger than the iPhone 13 Pro's, uses a quad-pixel array. It also gains a second-generation sensor-shift optical image stabilization system that adapts to the scene being photographed.
By combining each group of four pixels into a single 2.44μm effective pixel, the camera will primarily produce 12MP photographs with noticeably higher detail and color depth. Still, it can also produce 48MP ProRAW files from the individual 1.22μm photosites. The ability to access the sensor's full native resolution suggests that Apple is combining pixels before the demosaicing step rather than merging finished images.
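The quad-pixel binning described above is a well-known technique that can be sketched in a few lines. This is a generic illustration, not Apple's actual pipeline: each 2x2 block of small photosites is averaged into one larger effective pixel, trading resolution for light gathered per pixel.

```python
import numpy as np

def bin_quad_pixels(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of photosites into one effective pixel.

    A 48 MP mosaic of 1.22um photosites would bin down to a 12 MP image
    of 2.44um effective pixels (half the width, half the height).
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Split rows and columns into 2x2 blocks, then average each block.
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Tiny demonstration on a 4x4 "sensor":
frame = np.arange(16, dtype=np.float32).reshape(4, 4)
print(bin_quad_pixels(frame))
# [[ 2.5  4.5]
#  [10.5 12.5]]
```

In a real camera, binning happens on the mosaiced (pre-demosaic) data, which is consistent with the article's observation that the full native resolution remains accessible in ProRAW.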
With Apple’s new camera system, low-light performance improves by up to 2x on the Main camera, up to 3x on the Ultra Wide camera, 2x on the Telephoto camera, and 2x on the TrueDepth camera.
“Photonic Engine enables this dramatic increase in quality by applying Deep Fusion earlier in the imaging process to deliver extraordinary detail, and preserve subtle textures, provide better color, and maintain more information in a photo,” says Apple.
The new Ultra Wide camera produces 12-megapixel photographs with 1.4µm pixels. The new TrueDepth camera gains an f/1.9 aperture lens to better handle low-light scenarios, and an upgraded Telephoto lens with 3x optical zoom is also available.
Here are some sample images captured with the iPhone 14 Pro:
According to Apple, the flash has also been redesigned: the new Adaptive True Tone flash uses a nine-LED array whose pattern changes depending on the focal length. The A16 Bionic chip, the fastest chip yet fitted to an iPhone, packs 16 billion transistors and powers the phone’s computational photography features.
Every time a photo is shot, the chip performs up to four trillion operations, driving Apple’s Photonic Engine, which uses the Neural Engine to merge several exposures into a single photo.
Supported computational photography features include Night mode, Smart HDR 4, Portrait mode with Portrait Lighting, Night mode portraits, Photographic Styles for customizing the look of each photo, and Apple ProRAW.
More info on Apple’s website.