Willy Nolan

Project


Today, High Dynamic Range Images (HDRI) are commonplace. Many smartphones have built-in support for HDR images, and some even capture in HDR by default.

Fundamentally, dealing with HDR data is about capturing the radiance of the scene. A naive assumption would be that pixels appearing twice as bright in an image correspond to areas of the photographed scene with twice as much radiance.

The second featured image shows that this is not the case: a tremendous range of radiance values must be mapped to a far smaller range of pixel intensities. The image also shows the pipeline that produces this non-linear mapping.

The mapping is non-linear for many reasons, and one particularly challenging one is the non-linear response function of the camera itself. Camera makers do not usually publish this function, considering it a trade secret.
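As a toy illustration of why the naive assumption fails, consider a simple gamma-style response curve standing in for a real camera's (unpublished) response function; the `gamma=2.2` value and the `response` helper are assumptions for demonstration only:

```python
# Toy illustration: with a gamma-style response (a stand-in for a real
# camera curve), doubling scene radiance does NOT double pixel intensity.
def response(radiance, gamma=2.2):
    # Hypothetical response: normalized radiance in [0, 1] -> 8-bit pixel value.
    return round(255 * radiance ** (1 / gamma))

p1 = response(0.2)            # pixel value at some radiance
p2 = response(0.4)            # pixel value at twice that radiance
print(p1, p2, round(p2 / p1, 2))   # 123 168 1.37 -- about 1.37x, not 2x
```

Doubling the radiance raises the pixel value by only about 37% here, which is exactly the kind of distortion recovering the response function undoes.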

Recovering this function enables more than the common HDRI-style images (and their hyper-realistic tone-mapped counterparts); it also allows more realistic motion blur, color correction, and edge detection in images.

Given a stack of aligned images taken at different exposures, the non-linear response function can be recovered as follows.

$$Z_{ij} = f(E_i \Delta t_j)$$ where $i$ indexes over pixel locations and $j$ indexes over the exposures in the stack.

With some manipulation this simplifies to: $$g(Z_{ij}) = \ln E_i + \ln \Delta t_j$$ where $g = \ln f^{-1}$.
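The manipulation can be spelled out in one line, assuming $f$ is monotonic and therefore invertible:

$$
f^{-1}(Z_{ij}) = E_i \Delta t_j
\;\Longrightarrow\;
\ln f^{-1}(Z_{ij}) = \ln E_i + \ln \Delta t_j
\;\Longrightarrow\;
g(Z_{ij}) = \ln E_i + \ln \Delta t_j .
$$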

Written in this form, $g$ (and therefore eventually $f$) can be recovered up to scale by using the SVD to minimize a related least-squares objective.
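A minimal sketch of that least-squares system, in the style of Debevec and Malik: one row per pixel observation enforcing $g(Z_{ij}) - \ln E_i = \ln \Delta t_j$, one row fixing the scale ambiguity, and smoothness rows on $g''$. It is solved here with `np.linalg.lstsq` (which uses the SVD internally) rather than an explicit SVD, and the paper's hat-shaped weighting function is omitted for brevity:

```python
import numpy as np

def solve_response(Z, log_dt, lam=10.0, n_levels=256):
    """Recover g = ln f^{-1} (up to scale) from an exposure stack.

    Z      : (n_pixels, n_images) integer pixel values in [0, n_levels)
    log_dt : (n_images,) log exposure times ln(dt_j)
    lam    : weight on the smoothness term penalizing g''
    Returns (g, ln_E): the response curve samples and per-pixel log radiances.
    """
    n_px, n_im = Z.shape
    n_rows = n_px * n_im + 1 + (n_levels - 2)
    n_cols = n_levels + n_px          # unknowns: g(0..n_levels-1), then ln E_i
    A = np.zeros((n_rows, n_cols))
    b = np.zeros(n_rows)
    k = 0
    for i in range(n_px):             # data rows: g(Z_ij) - ln E_i = ln dt_j
        for j in range(n_im):
            A[k, Z[i, j]] = 1.0
            A[k, n_levels + i] = -1.0
            b[k] = log_dt[j]
            k += 1
    A[k, n_levels // 2] = 1.0         # fix the scale: g(mid-gray) = 0
    k += 1
    for z in range(1, n_levels - 1):  # smoothness rows: lam * g''(z) = 0
        A[k, z - 1] = lam
        A[k, z] = -2.0 * lam
        A[k, z + 1] = lam
        k += 1
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:n_levels], x[n_levels:]
```

In practice only a modest sample of pixel locations is needed, since each sampled pixel contributes one equation per exposure.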

With this function recovered, the radiance can be visualized in a number of ways, most simply by scaling the radiance to the range of the display device.
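A minimal sketch of that simplest visualization, assuming a recovered log-radiance map as input; the `percentile` white point and the 2.2 display gamma are illustrative choices, not part of the recovery method:

```python
import numpy as np

def radiance_to_display(ln_E, percentile=99.0):
    """Map a recovered log-radiance map to displayable 8-bit values.

    A minimal sketch: exponentiate, scale so that a high percentile of the
    radiance hits white, gamma-encode for the display, and clip. Real tone
    mappers are far more sophisticated than this linear scaling.
    """
    E = np.exp(ln_E)
    white = np.percentile(E, percentile)       # radiance mapped to full white
    scaled = np.clip(E / max(white, 1e-12), 0.0, 1.0)
    return (255 * scaled ** (1 / 2.2)).astype(np.uint8)
```

Clipping at a high percentile rather than the maximum keeps a few extreme highlights from crushing the rest of the image into darkness.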

This project implements techniques from the following academic papers: