Year after year, the leading mobile phone brands renew their high-end models. Everyone aims to improve their cameras, but the one that does it best is always Apple. Whether the secret lies in its software or its hardware, one look at a photo shows that images taken with Motorola or Samsung devices do not even approach the quality of those from an iPhone.
This year, with the arrival of the iPhone 14, Apple entered the megapixel (MP) race, something it had always refused to do. The iPhone 14 Pro and Pro Max debuted, for the first time, a 48 MP camera sensor with pixel-binning technology, which makes better use of the light reaching each pixel. The sensor is 65% larger than last year's, with OIS built into the sensor itself. The iPhone 13 topped out at 12 MP.
This four-pixels-in-one combination (producing 12-megapixel captures) is already familiar on Android rivals, though Apple relies on its ISP, the component responsible for rendering accurate color in a photograph, to get more out of it. In addition, Apple claims roughly double the low-light performance across all sensors.
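To make the four-in-one idea concrete, here is a minimal Python sketch of 2x2 pixel binning. It is purely illustrative: real binning combines charge on the sensor itself and must respect the quad-Bayer color filter layout, details this toy grayscale example ignores, and the function name is mine.

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Combine each 2x2 block of sensor pixels into one output pixel.

    Averaging four neighboring photosites is the basic idea behind
    4-in-1 pixel binning: a 48 MP readout becomes a 12 MP image with
    a larger effective light-gathering area per output pixel.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    # Group pixels into 2x2 blocks and average each block.
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy example: an 8x8 "sensor" bins down to 4x4.
sensor = np.random.poisson(lam=20.0, size=(8, 8)).astype(float)
binned = bin_2x2(sensor)
print(sensor.shape, "->", binned.shape)  # (8, 8) -> (4, 4)
```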
As for video, the new "action mode" is intended to capture smooth-looking footage, compensating for significant shake, motion, and judder even when recording in the middle of the action.
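Apple has not documented how action mode works internally, but the general recipe for electronic stabilization is well known: estimate the camera's per-frame motion, smooth that trajectory, and shift each frame toward the smoothed path. Below is a rough Python sketch of that generic idea only, not Apple's implementation; the motion path is assumed to come from elsewhere (gyroscope data or optical flow), and both function names are hypothetical.

```python
import numpy as np

def smooth_path(path: np.ndarray, window: int = 15) -> np.ndarray:
    """Moving-average smoothing of a per-frame (x, y) camera trajectory."""
    kernel = np.ones(window) / window
    return np.stack(
        [np.convolve(path[:, i], kernel, mode="same") for i in range(2)],
        axis=1,
    )

def stabilize(frames, path):
    """Shift each frame from its jittery position toward the smoothed path.

    Real pipelines capture an oversized frame and crop it so the shift
    never exposes empty borders; this toy version just rolls the pixels.
    """
    correction = smooth_path(path) - path
    for frame, (dx, dy) in zip(frames, correction):
        yield np.roll(frame, (int(round(dy)), int(round(dx))), axis=(0, 1))
```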
Also improved is Cinematic Mode, first introduced on last year's iPhone 13 series: the feature can now record 4K at 30 fps and 4K at 24 fps.
So Apple has done it again: the iPhone 14 has the best camera. But, as it turns out, the iPhone 15 is expected to arrive with camera hardware that represents a technological revolution.
Apple is responsible for the design of the iPhone camera, but its components come from other manufacturers. Sony, Apple's long-time supplier of image sensors, is preparing to offer it its latest-generation sensor.
What will the iPhone 15 camera be like?
According to the Japanese newspaper Nikkei, it is the first stacked CMOS image sensor with a two-layer transistor pixel. In conventional CMOS sensors, the photodiodes and pixel transistors sit on the same substrate; in this new sensor they are placed on separate substrate layers.
As Sony explains, this technology expands dynamic range and reduces noise in photos. With the transistors moved to their own layer, more die area is freed for the photodiode, so each pixel can gather more light before saturating. That helps avoid underexposure and overexposure in scenes that mix bright and dim areas, so better images can be captured in backlit situations, in low-light environments and, logically, indoors.
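Why does a bigger photodiode and a cleaner readout translate into more dynamic range? A sensor's single-exposure dynamic range is essentially the ratio between the most charge a pixel can hold (full-well capacity) and its read noise floor. A back-of-the-envelope sketch of that standard relationship, using hypothetical pixel values chosen only for illustration (not Sony's actual specifications):

```python
import math

def dynamic_range_stops(full_well_e: float, read_noise_e: float) -> float:
    """Dynamic range = log2(full-well capacity / read noise), in stops (EV).

    A pixel that stores more charge before clipping (larger full well)
    or reads out more cleanly (lower noise) covers a wider range of
    scene brightness in a single exposure.
    """
    return math.log2(full_well_e / read_noise_e)

# Hypothetical electron counts, purely for illustration:
conventional = dynamic_range_stops(full_well_e=6000, read_noise_e=2.0)
two_layer = dynamic_range_stops(full_well_e=12000, read_noise_e=1.5)
print(f"conventional pixel: {conventional:.1f} stops")  # ~11.6
print(f"two-layer pixel:    {two_layer:.1f} stops")     # ~13.0
```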
Although the sensor is fundamental to the photos an iPhone takes, it is not the only part of the device that matters. The optical system, with its seven-element architecture, and computational photography are also very important. This year, with the iPhone 14, Apple introduced the Photonic Engine, a pipeline that works on the information collected by the sensor before compression intervenes, applying the Deep Fusion process earlier in the imaging chain.
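Apple does not publish the internals of Deep Fusion or the Photonic Engine, but the foundation of this kind of computational photography is multi-frame fusion: capture a burst, align the frames, and merge them so random noise averages out while scene detail is preserved. A deliberately simplified Python sketch of that general principle follows; real pipelines add alignment, ghost rejection, and per-tile weighting, all omitted here, and the function name is mine.

```python
import numpy as np

def fuse_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Merge a burst of aligned exposures by averaging.

    Shot and read noise are random from frame to frame, so averaging N
    frames cuts the noise standard deviation by a factor of sqrt(N),
    while the scene content, identical in every frame, is preserved.
    """
    return np.mean(np.stack(frames), axis=0)

# Simulate a burst: the same scene plus independent noise per frame.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 255, size=(64, 64))
burst = [scene + rng.normal(0, 10, scene.shape) for _ in range(9)]

fused = fuse_burst(burst)
print("noise, single frame:", np.std(burst[0] - scene).round(2))
print("noise, fused:       ", np.std(fused - scene).round(2))  # ~1/3
```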
The iPhone 15 is likely to bring more new features as well, such as a processor that outperforms the current A16 Bionic. The truth is that to see these novelties we will have to wait, by current estimates until September of next year.
Source: Clarin
Linda Price is a tech expert at News Rebeat. With a deep understanding of the latest developments in the world of technology and a passion for innovation, Linda provides insightful and informative coverage of the cutting-edge advancements shaping our world.