Google has just released a new night mode for its Pixel phones that aims to let users take good photos in poorly lit environments. It works through a set of techniques known as computational photography.
The new feature is called “Night Sight” and, according to the first experts who have tried it, it is one of the most advanced and effective technologies for capturing photos in dimly lit scenes, even at night.
This new technology, which updates the previous night mode, will be available - for now - only on Google's own devices, the Pixels. In other words, you can only enjoy its advantages if you carry a Google phone in your pocket.
iPhone XS with SmartHDR (left) and Pixel 3 with Night Sight (right)
How Night Sight Mode Works
As Google has explained on its blog, the Night Sight mode performs several calculations before taking a capture. The technology takes into account the movement of the hand holding the phone, the movement of the objects in the scene, and, naturally, the scene's lighting. From this, Night Sight calculates how many exposures to take and how long each one should last.
In dimly lit scenes, cameras normally compensate for the lack of light with longer exposure times and higher light sensitivity. But this has its downsides: although the results may improve a little, images often end up worse because of motion blur or noise.
While the flash can be useful for lighting the scene, it is often intrusive and produces unflattering images, and in some places it cannot be used at all. It is also useless for landscapes or subjects that are far away.
Before capturing the image, then, Night Sight assesses both the shaking of the hand and the movement in the scene. What does it do in each case? If there is no movement in the scene, the night mode can afford longer exposures to minimize noise; if, on the other hand, the device or something in the scene is moving, Night Sight uses shorter exposure times.
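The decision described above can be sketched in code. This is a hypothetical illustration, not Google's actual implementation: the function name, thresholds, and exposure values are all invented for the example; only the overall idea (more motion means shorter per-frame exposures) comes from the article.

```python
# Hypothetical sketch of Night Sight's exposure decision.
# Thresholds and timings are illustrative, not Google's real values.

def choose_exposure(handshake, scene_motion, max_total_ms=6000, max_frames=15):
    """Pick a per-frame exposure time (ms) and frame count from motion.

    handshake and scene_motion are normalized: 0.0 (still) .. 1.0 (fast).
    Less motion -> longer per-frame exposures (less noise);
    more motion -> shorter frames to avoid motion blur.
    """
    motion = max(handshake, scene_motion)
    # Illustrative bounds: ~1/3 s per frame when steady,
    # down to ~1/15 s when the phone or scene is moving.
    per_frame_ms = 333 if motion < 0.2 else (150 if motion < 0.5 else 66)
    frames = min(max_frames, max_total_ms // per_frame_ms)
    return frames, per_frame_ms
```

With these toy numbers, a phone held steady on a tripod gets long exposures, while a handheld shot of a moving subject falls back to many short frames within the same total capture budget.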
The system also ensures that a moving object in the scene does not ruin the snapshot. Instead of taking a single long, bright but blurry exposure, Night Sight captures a burst of many short, dark but sharp photos, preventing motion blur. It then merges them and brightens the result, producing a much clearer and better-lit image.
The camera can take up to 15 photos in just 6 seconds and, from there, uses artificial intelligence to combine the information from all of them into a single well-lit image.
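The merge step can be illustrated with a toy example. This is a minimal sketch under strong simplifying assumptions: images are flat lists of pixel values, the frames are already perfectly aligned, and brightening is a plain digital gain. Real burst pipelines align frames and operate on raw sensor data; the function and its numbers are invented for illustration.

```python
# Toy sketch of burst merging: averaging many short, dark frames
# suppresses random noise, and a digital gain then brightens the result.

def merge_and_brighten(frames, gain=4.0):
    """Average aligned frames pixel-wise, then apply a digital gain."""
    n = len(frames)
    # Averaging N frames cancels out random per-frame noise.
    merged = [sum(px) / n for px in zip(*frames)]
    # Brighten the dark average and clip to the 8-bit range.
    return [min(255, round(p * gain)) for p in merged]

# Example: three noisy, underexposed captures of the same scene.
burst = [[10, 41, 30], [12, 39, 31], [11, 40, 29]]
print(merge_and_brighten(burst))  # → [44, 160, 120]
```

Note how the pixel values of the three frames jitter slightly (noise), yet the averaged, brightened output is stable; that is the core reason many short exposures can beat one long one.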
When will this new feature be available?
As Google announced on its blog, the new Night Sight mode will reach the Pixel devices through an update to the Google Camera application, but it will only be available to owners of those phones. If you want to see what Night Sight can do, you can take a look at this album shared on Google Photos.