Unlike earlier versions of high-dynamic-range (HDR) imaging, HDR+, also known as HDR+ on, uses computational photography techniques to achieve higher dynamic range. HDR+ takes continuous burst shots with short exposures. When the shutter is pressed, the last 5–15 frames are analysed to pick the sharpest shots (using lucky imaging), which are selectively aligned and combined with image averaging. HDR+ also uses semantic segmentation to detect faces, brightening them with synthetic fill flash, and to darken and denoise skies. HDR+ also reduces shot noise and improves colors, while avoiding blown-out highlights and motion blur. HDR+ was introduced on the Nexus 6 and brought back to the Nexus 5.

Unlike HDR+/HDR+ on, 'HDR+ enhanced' mode does not use zero shutter lag (ZSL). Like Night Sight, HDR+ enhanced features positive shutter lag (PSL): it captures images after the shutter is pressed. HDR+ enhanced is similar to HDR+ from the Nexus 5, Nexus 6, Nexus 5X and Nexus 6P. It is believed to use underexposed and overexposed frames, like Smart HDR from Apple. HDR+ enhanced captures increase the dynamic range compared to HDR+ on. HDR+ enhanced on the Pixel 3 uses the learning-based AWB algorithm from Night Sight.

Starting with the Pixel 4, Live HDR+ replaced HDR+ on, featuring a WYSIWYG viewfinder with a real-time preview of HDR+. Live HDR+ uses the learning-based AWB algorithm from Night Sight and averages up to nine underexposed pictures. 'Live HDR+' mode uses Dual Exposure Controls, with separate sliders for brightness (capture exposure) and for shadows (tone mapping). This feature was made available for the Pixel 4, and has not been retrofitted on older Pixel devices due to hardware limitations.

In April 2021, Google Camera v8.2 introduced HDR+ with Bracketing, Night Sight with Bracketing and Portrait Mode with Bracketing. Google updated its exposure bracketing algorithm so that HDR+ includes an additional long-exposure frame and Night Sight includes three long-exposure frames.
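The burst pipeline described above (rank frames by sharpness, keep the best, and average them to suppress shot noise) can be sketched as follows. This is a minimal illustration, not Google's implementation: the sharpness proxy and the `keep` parameter are assumptions, and the tile-based alignment step of the real pipeline is omitted.

```python
import numpy as np

def merge_burst(frames, keep=8):
    """Toy sketch of HDR+-style burst merging: rank frames by a crude
    sharpness measure (lucky imaging), keep the sharpest, and average.
    Averaging N frames reduces shot-noise standard deviation by ~sqrt(N)."""
    def sharpness(img):
        # Mean squared gradient magnitude as a simple focus/sharpness proxy.
        gy, gx = np.gradient(img.astype(np.float64))
        return float((gx ** 2 + gy ** 2).mean())

    ranked = sorted(frames, key=sharpness, reverse=True)
    return np.mean(ranked[:keep], axis=0)
```

With ten noisy captures of the same scene, the merged result is visibly cleaner than any single frame, which is the core benefit of averaging short exposures instead of taking one long one.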
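The split between the two Dual Exposure Controls sliders can be illustrated with a hedged sketch (not Google's code): "brightness" acts like a capture-exposure gain applied to the whole image, while "shadows" applies a tone-mapping curve that mostly lifts dark values. The stop-based gain and the gamma-style shadow curve are illustrative assumptions.

```python
import numpy as np

def dual_exposure(img, brightness=0.0, shadows=0.0):
    """Illustrative two-slider control on a [0, 1] image.
    brightness: overall exposure adjustment in stops (capture exposure).
    shadows: lifts dark tones via a gamma-style curve (tone mapping)."""
    out = img * (2.0 ** brightness)        # global exposure gain
    out = np.clip(out, 0.0, 1.0)
    gamma = 1.0 / (1.0 + shadows)          # shadows > 0 flattens the curve
    return np.clip(out ** gamma, 0.0, 1.0)
```

The key property is that the shadows slider raises a dark pixel proportionally more than a bright one, so shadow detail comes up without blowing out highlights.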
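The motivation for adding long-exposure frames to the bracket can be sketched in a few lines. This is a simplified assumption-laden model, not the published algorithm: it merges one short and one long frame, preferring the (cleaner, scaled-down) long exposure in shadows and falling back to the short exposure where the long frame is near clipping; the blending weights are invented for illustration.

```python
import numpy as np

def merge_bracketed(short, long_exp, ratio, clip=1.0):
    """Toy exposure-bracket merge on linear [0, 1] frames.
    ratio: exposure-time ratio between the long and short frames.
    Near-clipped long-frame pixels are replaced by the short frame."""
    long_scaled = long_exp / ratio          # bring long frame to short-frame scale
    # Weight toward the short frame wherever the long frame approaches clipping.
    w = np.clip((long_exp - 0.8 * clip) / (0.2 * clip), 0.0, 1.0)
    return w * short + (1.0 - w) * long_scaled
```

Shadow pixels inherit the long frame's better signal-to-noise ratio, while highlight pixels keep the short frame's unclipped values, which is what extends the usable dynamic range.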
Starting with Pixel devices, the camera app has been aided by hardware accelerators to perform its image processing. The first generation of Pixel phones used Qualcomm's Hexagon DSPs and Adreno GPUs to accelerate image processing. The Pixel 2 and Pixel 3 (but not the Pixel 3a) include the Pixel Visual Core to aid with image processing. The Pixel 4 introduced the Pixel Neural Core.