using camera to accurately measure very low luminances


Postby spacediver » Sat Oct 11, 2014 9:02 pm

Hi all,

first post here, but this seemed like an appropriate place to share. I've also posted this on the display calibration forum at AVS Forum, and much of the learning I did for this project was inspired by websites dedicated to astrophotography.

One of the challenges in calibrating displays is being able to measure very deep black levels. Most instruments are not sensitive enough to capture a reliable signal at luminances below 0.01 or even 0.001 cd/m^2.

After brainstorming a number of ideas with a member of hardforum named flood, we had some good success using a camera with long exposure times to do the job. Instead of giving a play-by-play account of our learning experience (fully documented starting on page 10 of this thread), I'll just report the general methodology and some relevant data. This was, and continues to be, a fun experience - I knew absolutely nothing about cameras or photography before this - it turns out they are very cool instruments!

Hardware:

Sony GDM-FW900
i1Display Pro
Canon EOS 450D

General method:

The sensor of a camera comprises millions of photosites. In a Bayer filter, which is a common design, each photosite is covered with a red, green, or blue filter. There are twice as many green photosites as red or blue, for the same reason that chroma subsampling (e.g. 4:2:2) is used in video encoding: human vision is most sensitive to luminance, to which green contributes the most. A RAW image contains the data for each individual photosite, and is typically converted into a full-color image, where the color at each photosite is estimated using a demosaicing algorithm.

Our approach was to extract each channel independently, and measure its response to our display primaries, while concurrently measuring the trichromatic quantities (X Y Z) of these primaries with a colorimeter. By doing this, we could then create a 3x3 matrix that could convert the camera's filter responses into X Y Z values. See this post for the precise details of this approach. Basically, you are turning your camera into a colorimeter that is specialized for your own display (it will not be accurate on other displays).
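The matrix-building step above can be sketched in a few lines. The original analysis was done in Matlab; here's a Python/NumPy equivalent with made-up numbers standing in for the real camera and colorimeter measurements (the matrix names and values are purely illustrative):

```python
import numpy as np

# Hypothetical measurements, one column per display primary (red, green, blue).
# cam[:, i] = mean pixel values of the camera's R, G, B Bayer channels
# (offset-subtracted) while primary i was displayed.
cam = np.array([[1200.0,   40.0,   15.0],
                [  90.0, 1500.0,  120.0],
                [  10.0,  200.0,  900.0]])
# xyz[:, i] = colorimeter X, Y, Z readings for the same primary.
xyz = np.array([[20.0, 15.0,  8.0],
                [10.0, 35.0,  4.0],
                [ 1.0,  5.0, 40.0]])

# Solve xyz = M @ cam for the 3x3 calibration matrix.
M = xyz @ np.linalg.inv(cam)

# Any later camera reading of this same display can then be converted to XYZ:
reading = np.array([600.0, 800.0, 300.0])
XYZ = M @ reading
```

With only three primaries the system is exactly determined; averaging many captures per primary (as described above) is what keeps the three response vectors from being noise-limited.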

To analyze the RAW files, dcraw was used to extract 16-bit linear TIFF files, using the command "dcraw.exe -D -4 -T filename.cr2". The camera I was using is a second-hand Canon EOS 450D, which has a 14-bit analog-to-digital converter.

First order of business was to figure out which photosites corresponded to which channel. To do this, I examined the images of red, green, and blue test patterns.

Here are grayscale images of the red, green, and blue test patterns.

[Image: grayscale captures of the red, green, and blue test patterns]

Using this information, I was able to extract each channel with Matlab, calculate the mean pixel value (MPV) for each channel, and generate the 3x3 matrix. To reduce noise, multiple images were taken of each primary. The images were also screened on a channel-by-channel basis to ensure that the sensor was neither saturated nor "bottomed out". To illustrate these ceiling/floor effects, see the image below, where I've plotted mean pixel value (labeled APV) against a number of different shutter speeds (plotted nominally, not by actual duration), along with histograms. Note how only the highlighted values show no saturation effects.

You might also notice that the bottoming out occurs at around 1000. This is because Canon adds an offset to avoid clipping at the low end. I measured this offset as 1024.8 by taking multiple black frames (with the lens cap on, under a dark blanket, in a dark room), and subtracted it as an initial step in all the analysis. The high end saturates at around 14,500 or so (I haven't actually measured it precisely), so the actual range of digital numbers is around 13,500.

[Image: MPV (labeled APV) vs. shutter speed, with histograms showing saturation and bottoming-out]
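The channel-extraction and offset-subtraction steps can be sketched as follows. This is a Python stand-in for the Matlab code (the original was Matlab), and it assumes an RGGB mosaic layout; the actual layout was determined empirically from the test-pattern images, so treat the indexing here as illustrative:

```python
import numpy as np

BLACK_OFFSET = 1024.8  # dark-frame offset measured in the post

def channel_means(raw):
    """Mean pixel value per Bayer channel, offset-subtracted.
    Assumes an RGGB mosaic: R at (even, even), G at (even, odd) and
    (odd, even), B at (odd, odd)."""
    data = raw.astype(np.float64) - BLACK_OFFSET
    r = data[0::2, 0::2]
    g = np.concatenate([data[0::2, 1::2].ravel(),
                        data[1::2, 0::2].ravel()])  # both green sites
    b = data[1::2, 1::2]
    return r.mean(), g.mean(), b.mean()

# Tiny synthetic mosaic standing in for a dcraw-extracted linear TIFF:
mosaic = np.full((4, 4), BLACK_OFFSET)
mosaic[0::2, 0::2] += 100.0   # red photosites
mosaic[0::2, 1::2] += 50.0    # green photosites (even rows)
mosaic[1::2, 0::2] += 50.0    # green photosites (odd rows)
mosaic[1::2, 1::2] += 25.0    # blue photosites
r, g, b = channel_means(mosaic)   # ≈ (100.0, 50.0, 25.0)
```

In practice `raw` would be loaded from the 16-bit TIFF (e.g. with tifffile or imageio), and the per-channel screening for saturation/bottoming-out would inspect histograms of `r`, `g`, `b` before averaging.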

The calibration matrix was obtained with the following conditions:

Aperture: f/5.7
Focal length: 28 mm
ISO: 100
Shutter speed: 1 second.

The camera was placed so that the lens was flush with the display, and a dark blanket was used as a "hood" to enclose the area. Lights were turned off, and blackout cloth was used to block light from the windows. A laptop was used to control the camera remotely via the Canon EOS Utility. This proved very useful, as you could set the camera to take multiple images at specified intervals and leave the room. HCFR was used to record XYZ values from an adjacent patch of the screen with the colorimeter.

Once the calibration matrix was generated, all subsequent measurements were taken with the same parameters and conditions, with the exception of shutter speed, which needs to be varied depending on the luminance of the test pattern. If, for example, a shutter speed of 100 seconds was used, one would simply divide the resultant trichromatic values by 100.
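The exposure normalization above is just a linear rescaling; as a sketch (continuing the hypothetical Python names from earlier, with `M` the 3x3 calibration matrix):

```python
import numpy as np

def xyz_from_reading(M, mpv, exposure_s):
    """Convert offset-subtracted camera channel means (R, G, B) to XYZ.
    The calibration matrix M was built at a 1-second exposure, so readings
    taken at other shutter speeds scale linearly with exposure time."""
    return (M @ np.asarray(mpv)) / exposure_s

# With an identity matrix standing in for a real calibration matrix,
# a 100-second exposure simply divides the response by 100:
xyz = xyz_from_reading(np.eye(3), [600.0, 800.0, 300.0], 100.0)
# → [6., 8., 3.]
```

This linearity is what the fixed aperture, ISO, and focal length protect: change any of those and the 1-second calibration no longer transfers.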

The noise levels of this camera are quite impressive.

Here is the distribution of values representing the difference of two successive images taken of a patch of wall. Standard deviation (of pixel value) is about 20.

[Image: histogram of the pixel-value difference between two successive frames of a patch of wall]


To get a sense of how much MPV noise there is across multiple images, I recorded 20 successive images (1/8 second shutter speed) of a white diffuse material in a room lit by a warmed up incandescent bulb with diffusing cover. Here is a plot of MPV across the images. Standard Error of the Mean is about 0.081.

[Image: MPV across 20 successive frames of a uniform patch]
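Both noise figures quoted above (per-pixel standard deviation from a frame difference, and the standard error of the MPV across frames) can be estimated as sketched below. The frame stack here is simulated with parameters matching the quoted numbers; real code would load the captured TIFFs instead:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated stack of 20 frames of a uniform patch: true level 2000,
# per-pixel noise sigma = 20 (matching the figures quoted above).
frames = 2000.0 + 20.0 * rng.standard_normal((20, 200, 200))

# Per-pixel noise from the difference of two successive frames;
# differencing two independent frames doubles the variance, hence sqrt(2).
sigma_pixel = (frames[1] - frames[0]).std() / np.sqrt(2)

# Frame-to-frame stability of the mean pixel value (MPV):
mpvs = frames.mean(axis=(1, 2))
sem = mpvs.std(ddof=1) / np.sqrt(len(mpvs))
```

Dividing the frame-difference standard deviation by sqrt(2) recovers the single-frame noise, which is why the ~20 figure above is consistent with a much smaller MPV standard error: averaging over millions of photosites collapses the per-pixel noise.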

Having established the precision of these measurements, I verified that the colorimeter (i1Display Pro) and camera yielded similar trichromatic values. The concordance across multiple luminances was about 1-2%, and could likely be improved further by taking camera and colorimeter measurements from the same patch of screen (or finding two patches that are identical - something I tried to do but could have done a better job with). I also used a borrowed Konica Minolta LS-110 to verify the accuracy of my i1Display Pro from the max luminance of my display down to the lowest luminance the LS-110 could read (0.01 cd/m^2). The LS-110 matched the i1Display Pro across this entire range (it consistently recorded about 2% lower luminance than the i1Display Pro).


I then used a 304-second exposure to measure my black level (a 30-second exposure was too close to bottoming out for my liking, but still produced a very similar result).

The calculated luminance was 0.0006362 cd/m^2.

So all this seems to suggest that a camera can accurately measure luminance and chromaticity at luminance levels approaching dark adapted human vision (and likely beyond). This is what motivates and allows amateur astrophotography.


Some other interesting data: after warming up my display for a number of hours, I loaded a dark test pattern and took 500 successive images (18 seconds apart, with 16-second exposures). As this was done before I had turned my camera into a colorimeter and established the proper conditions, the following is plotted as MPV across all channels (and assuming the white point balance of the image didn't change significantly during the hours it was displayed, MPV is proportional to luminance). The image stabilizes almost exactly 1 hour after loading the pattern.

[Image: MPV over 500 successive frames after loading a dark test pattern, stabilizing after about 1 hour]

The overall amount of drift (relative to the stable period) is about ±4%, so it's not severe, but it is interesting nonetheless. Remember, this represents drift that occurs after the display has fully warmed up. I repeated the experiment with a brighter test pattern and got different results: the luminance started high, and then stabilized after about 30-45 minutes.

This approach of using a camera as a colorimeter can probably be used to do some pretty cool things. Two things I'm thinking about are using it to measure screen uniformity (each pixel of the camera is essentially a colorimeter), and measuring things like the line spread function of a display.

Anyway, thought a few of you might find this interesting. Any thoughts, critical or otherwise, are welcomed. And I'd also like to reiterate that this whole thing was flood's idea.