Sony describes the new sensor technology as follows:
This image sensor layers the pixel section containing formations of back-illuminated structure pixels onto chips containing the circuit section for signal processing, which is in place of supporting substrates for conventional back-illuminated CMOS image sensors. This structure achieves further enhancement in image quality, superior functionalities and a more compact size that will lead to enhanced camera evolution.
Basically, stacking the pixel layer directly on top of the signal-processing circuitry, instead of on a separate supporting substrate, shrinks the sensor package.
Camera modules have proved to be a major bottleneck in thinning down devices, so this sensor may eventually lead to thinner iPhones, provided battery technology keeps advancing.
The layered structure of the sensor, as 9to5Mac points out, allows the pixel layer to be completely independent of the underlying circuitry, meaning that Apple could pair it with its custom silicon (on which more than a thousand engineers are reportedly working) to achieve better performance and efficiency.
The sensor can record High Dynamic Range (HDR) video, capturing a wider range of tones so that color detail survives even in very bright scenes.
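Sony hasn't published how the sensor produces HDR video internally, but the general idea behind any HDR capture is merging differently exposed frames of the same scene. The sketch below is a toy illustration of that idea, not Sony's pipeline; the function name, the 4x exposure ratio, and the clipping threshold are all illustrative assumptions.

```python
import numpy as np

def merge_exposures(short_exp, long_exp, ratio=4.0, threshold=0.9):
    """Toy HDR merge of two frames of the same scene.

    short_exp, long_exp: float arrays scaled to [0, 1].
    ratio: how much longer the long exposure was (assumed 4x here).
    Where the long exposure clips near white, fall back to the
    short exposure, scaled up to match the long frame's brightness.
    """
    short_scaled = short_exp * ratio     # brightness-match the short frame
    clipped = long_exp >= threshold      # highlights blown in the long frame
    return np.where(clipped, short_scaled, long_exp)

# Example: one row of pixels; the last value is blown out in the long frame,
# so the merged result recovers it from the short exposure instead.
short = np.array([0.05, 0.10, 0.24])
long_ = np.array([0.20, 0.40, 1.00])
print(merge_exposures(short, long_))   # [0.2, 0.4, 0.96]
```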
And then there's "RGBW Coding," which keeps poorly lit surroundings from getting in the way of a good image. RGBW adds an extra white (unfiltered) subpixel to the standard RGB arrangement; because it collects more light, it reduces noise and produces better low-light shots.

[Image: Sony's comparison of RGB and RGBW output]
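Sony's actual RGBW reconstruction math is proprietary, but the core trick can be sketched: the unfiltered white subpixel sees roughly three times the light of a color-filtered one, so its reading is less noisy and can be used to correct the luminance of the color samples. Everything here, from the function name to the Rec. 601 luminance weights, is an illustrative assumption.

```python
def rgbw_to_rgb(r, g, b, w, eps=1e-6):
    """Toy RGBW reconstruction: rescale a noisy RGB triple so its
    luminance matches the cleaner white (panchromatic) measurement."""
    # Luminance estimated from the noisy color samples,
    # using Rec. 601 weights as an illustrative choice.
    y_rgb = 0.299 * r + 0.587 * g + 0.114 * b
    # Gain pulls the color luminance toward the white reading.
    gain = w / max(y_rgb, eps)
    return r * gain, g * gain, b * gain

# Example: a dim pixel where noise has depressed the color samples;
# the white channel lifts them back to a plausible brightness.
print(rgbw_to_rgb(r=10.0, g=14.0, b=8.0, w=16.0))
```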
Samples of the 8 MP sensor will start shipping in March this year, while the 13 MP sensor is scheduled to ship from June 2012.
Do you think Apple will use these sensors in the next iPhone?