Google explains how the Live HDR+ feature on the Pixel 4 and 4a works

The new Pixel 4a supports Live HDR+ and Dual Exposure, features first introduced with the Pixel 4 (note that these won’t be backported to older devices). The tech giant published a detailed blog post explaining how the two features work.

First, what is Live HDR+? It shows a real-time preview of what the final HDR+ photo will look like. Note that this is just a preview, derived using a different algorithm. The real HDR+ captures between 3 and 15 underexposed photos, then aligns and merges them to reduce noise.
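
The viewfinder can't run that full pipeline, but the payoff of merging is easy to illustrate. Here's a minimal NumPy sketch of the averaging step, assuming perfectly aligned frames of a static scene (the real HDR+ merge also handles alignment and motion, which this skips entirely):

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of aligned, underexposed frames.

    Averaging N frames cuts random noise roughly by a factor of
    sqrt(N), which is why HDR+ can afford to underexpose (protecting
    highlights) and brighten the shadows later during tone mapping.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

# Toy usage: 8 noisy, pre-aligned captures of a static scene.
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 0.25, size=(480, 640))  # underexposed scene
burst = [scene + rng.normal(0.0, 0.02, scene.shape) for _ in range(8)]
merged = merge_burst(burst)
print(f"noise before: {np.std(burst[0] - scene):.4f}")
print(f"noise after:  {np.std(merged - scene):.4f}")
```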

After the merge, tone mapping is applied to produce the final photo, making details visible in both the highlights and the shadows. To achieve that, the phone computes a 2D histogram, which makes for an interesting visualization.
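
As a rough sketch of the idea — assuming the histogram pairs each pixel's input brightness with its tone-mapped output, and using a simple global log curve as a stand-in for HDR+'s actual local operator — it could be computed like this:

```python
import numpy as np

def tone_curve(x, strength=4.0):
    # A simple global curve that lifts shadows and compresses
    # highlights (a stand-in, not HDR+'s local tone mapping).
    return np.log1p(strength * x) / np.log1p(strength)

def tone_map_histogram(linear, mapped, bins=64):
    """2D histogram of (input brightness, output brightness) pairs.

    Each column shows where pixels of a given input brightness
    end up after tone mapping.
    """
    hist, _, _ = np.histogram2d(linear.ravel(), mapped.ravel(),
                                bins=bins, range=[[0, 1], [0, 1]])
    return hist

linear = np.random.default_rng(1).uniform(0.0, 1.0, (480, 640))
hist = tone_map_histogram(linear, tone_curve(linear))
print(hist.shape)  # (64, 64)
```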

However, current mobile chipsets don’t have the computational power to do that 30 times per second. Instead, a dash of AI is used. The image is sliced into small tiles and the AI predicts the tone mapping for each of them. Then every pixel on the viewfinder is computed as a combination of the tone maps from the nearest tiles.
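
In spirit, this is bilinear interpolation of per-tile lookup tables, similar to Google's published HDRnet work. A hedged NumPy sketch, with made-up per-tile gamma curves standing in for whatever the network actually predicts:

```python
import numpy as np

def apply_tiled_tone_maps(lum, tile_curves):
    """Blend per-tile tone curves bilinearly across the frame.

    lum         : HxW luminance in [0, 1]
    tile_curves : (ty, tx, 256) lookup tables, one per tile; these
                  stand in for the curves the network predicts.
    """
    H, W = lum.shape
    ty, tx, n = tile_curves.shape

    # Fractional tile coordinates of each pixel center.
    gy = (np.arange(H) + 0.5) / H * ty - 0.5
    gx = (np.arange(W) + 0.5) / W * tx - 0.5
    y0 = np.clip(np.floor(gy).astype(int), 0, ty - 1)
    x0 = np.clip(np.floor(gx).astype(int), 0, tx - 1)
    y1 = np.minimum(y0 + 1, ty - 1)
    x1 = np.minimum(x0 + 1, tx - 1)
    wy = np.clip(gy - y0, 0.0, 1.0)[:, None]  # vertical blend weight
    wx = np.clip(gx - x0, 0.0, 1.0)[None, :]  # horizontal blend weight

    # Each pixel looks up its own value in the four nearest tiles' curves.
    idx = np.clip((lum * (n - 1)).astype(int), 0, n - 1)
    c00 = tile_curves[y0[:, None], x0[None, :], idx]
    c01 = tile_curves[y0[:, None], x1[None, :], idx]
    c10 = tile_curves[y1[:, None], x0[None, :], idx]
    c11 = tile_curves[y1[:, None], x1[None, :], idx]

    top = (1 - wx) * c00 + wx * c01
    bottom = (1 - wx) * c10 + wx * c11
    return (1 - wy) * top + wy * bottom

# Toy usage: an 8x8 grid of tiles, each with its own gamma curve.
rng = np.random.default_rng(1)
lum = rng.uniform(0.0, 1.0, (480, 640))
ramp = np.linspace(0.0, 1.0, 256)
curves = ramp[None, None, :] ** rng.uniform(0.4, 0.8, (8, 8))[:, :, None]
print(apply_tiled_tone_maps(lum, curves).shape)  # (480, 640)
```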

Here’s a comparison between the predicted image and the actual HDR+ result. It doesn't get it quite right, but it looks pretty close (especially since you'll be viewing this on the phone's screen).

Predicted HDR image (seen on the viewfinder) vs. actual HDR+ result

Balancing highlights and shadows is done automatically by HDR+. The Dual Exposure sliders give you manual control over the process, so you can get the desired look for your photo in camera. Traditionally, this is something you would do afterwards by processing the RAW file.
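
Google hasn't published the slider math, but conceptually one slider scales overall capture brightness while the other reshapes only the shadows. A toy sketch under those assumptions (the names `brightness` and `shadows` are illustrative, not Google's API):

```python
import numpy as np

def dual_exposure(linear, brightness=0.0, shadows=0.0):
    """Toy model of two independent controls over one capture.

    brightness : overall exposure compensation, in stops
    shadows    : lifts (>0) or crushes (<0) the dark regions only
    """
    # Capture-brightness slider: plain exposure compensation.
    img = np.clip(linear * 2.0 ** brightness, 0.0, 1.0)
    # Shadows slider: a gamma adjustment weighted toward dark pixels,
    # so highlights keep their rendering while the shadows move.
    gamma = 2.0 ** -shadows
    weight = 1.0 - img
    return weight * img ** gamma + (1.0 - weight) * img

# Same scene, two different looks, no RAW editing required.
scene = np.random.default_rng(2).uniform(0.0, 1.0, (480, 640))
moody = dual_exposure(scene, brightness=-1.0, shadows=-0.5)
airy = dual_exposure(scene, brightness=0.5, shadows=1.0)
```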

Same scene, different adjustments to the Dual Exposure sliders

If you want a more detailed explanation of how all of this works, you can follow the Source link to Google's blog post for more.

Source
