This photo, taken with an iPhone, has caused quite a stir. Behind it lies an error in the phone's Computational Photography pipeline.

First of all, what is Computational Photography? Dedicated digital cameras are designed for maximum photo quality; these models are expected to take very good photos and videos. Because the camera body gives engineers plenty of room to work with, very large image sensors can be fitted, and large, high-quality optics (lenses) can be designed for them. When a large sensor and high-quality optics come together, truly incredible image quality is possible. On a smartphone, however, these two elements cannot be combined: there is no space inside the phone for a large sensor, and for the same reason large optics are out of the question. So how do smartphones keep improving their image quality?


Nowadays, companies are pushing the limits of what the hardware allows, so image quality increasingly comes down to software. This is exactly where Computational Photography comes into play. The idea is to maximize photo quality by capturing multiple frames for a single photo: you press the shutter button once, and the phone takes something like ten different shots in the background. Because this all happens within milliseconds, you never notice a wait. So what exactly is the point of taking multiple photos?

The short answer: to capture as much detail and data as possible. When you press the shutter button, the phone first takes shots at multiple exposure values, so that both the dark and the bright parts of the scene end up correctly exposed in the final image. The iPhone photo above is a result of exactly this mechanism.
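As a rough illustration of the idea (this is a toy sketch, not Apple's actual algorithm), merging exposure-bracketed frames can be as simple as weighting each pixel by how well exposed it is in each frame, so that blown-out highlights and crushed shadows are down-weighted:

```python
import numpy as np

def exposure_fusion(frames):
    """Toy exposure fusion: frames is a list of same-shaped float
    images in [0, 1]. Each pixel is weighted by its closeness to
    mid-gray (0.5), then the frames are blended per pixel."""
    stack = np.stack(frames)                       # (n, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Simulated bracket: under-, normally, and over-exposed versions
# of the same gradient scene.
scene = np.linspace(0.0, 1.0, 5).reshape(1, 5)
bracket = [np.clip(scene * gain, 0, 1) for gain in (0.5, 1.0, 2.0)]
fused = exposure_fusion(bracket)
print(fused.shape)  # one image, same shape as each input frame
```

Real pipelines are far more sophisticated (alignment, noise modeling, tone mapping), but the principle is the same: several differently exposed captures are combined into one photo.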

At first glance, nothing seems wrong with the photograph of the woman standing in front of these two mirrors. Look more carefully, though, and you will notice that her arms are in a different position in each reflection. So how does this happen? How can a normal photograph, taken with a single press of the shutter button, show three different arm positions? This is not a Photoshop job; it is a Computational Photography error: the system does not recognize the mirrors and assumes there are three different people in the frame.

As noted above, a modern iPhone does not take just one photo when you press the shutter button once; it captures many more frames in the background. The same thing happened here. While the shutter was pressed, the woman quickly moved her arms as she looked at herself in the mirrors, and the Computational Photography pipeline picked what it judged to be the best moment for each of the three "people" and merged them into a single photo. Because the pipeline always prefers the sharpest and most accurately exposed frames, it can occasionally produce curious results like this one.
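The failure mode can be sketched in a few lines (again, a hypothetical toy model, not Apple's code): if a burst-fusion step picks, for each region of the image, the frame where that region looks sharpest, then a subject who moves between frames can end up stitched together from different moments in time:

```python
import numpy as np

def fuse_sharpest_tiles(burst, tile):
    """Toy burst fusion: for each vertical tile, keep the frame
    where that tile has the highest local variance (a crude
    stand-in for sharpness). Returns the fused image and a map
    of which frame each tile came from."""
    n, H, W = burst.shape
    out = np.zeros((H, W))
    sources = {}
    for x in range(0, W, tile):
        block = burst[:, :, x:x + tile]
        scores = block.var(axis=(1, 2))   # sharpness score per frame
        best = int(scores.argmax())
        out[:, x:x + tile] = block[best]
        sources[x // tile] = best
    return out, sources

# Burst of 3 frames: the moving "arm" (a high-contrast patch) is in
# a different third of the frame in each shot, blurred elsewhere.
rng = np.random.default_rng(0)
burst = rng.normal(0.5, 0.01, size=(3, 4, 9))
for i in range(3):
    burst[i, :, i * 3:(i + 1) * 3] = rng.normal(0.5, 0.3, size=(4, 3))

fused, sources = fuse_sharpest_tiles(burst, tile=3)
print(sources)  # each tile is taken from a different frame
```

Each of the three tiles in the fused image comes from a different frame of the burst, which is the same kind of stitching that put three arm positions into one photograph.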


