The Dual Scanning Laser Camera
The paper by Sen et al. describes a way to virtually swap a camera and a projector
in such a way that a photograph can be taken from the point of view of the
projector even though there is no camera there.
After developing a significant body of theory, their final tour de force is
to demonstrate how their method can be used to read the face of a playing card
even though the only light reaching the camera is reflected from diffuse surfaces
and so appears to contain little usable information about the face of the card.
This web page demonstrates how to reproduce that feat with about $50 of equipment
and a few dozen lines of code.
Firstly here is a photo of the apparatus:
The key things to observe are:
- The playing card
- A white tube of paper shielding a photoresistor from direct line of sight to the card
- A diffuse white surface, namely a notepad, in which the card will be observed
As the photocell receives light only from the notepad, reconstructing the card looks
like an impossible task.
Any kind of reflection in
the notepad would appear so hopelessly blurred that there is no chance of
recovering the identity of the card. But there is one more vital piece of equipment that makes it all work:
- A laser pointer mounted on a cheap pan-tilt head constructed from two servos.
Here is a diagram of the setup:
The idea is that when the laser illuminates a point on the playing card a glow will be
seen on the notepad. Although the glow is diffuse it does give an accurate reflection
of the region of the card where the laser is aiming. If the laser is pointing at
a light coloured part of the card then the glow will be brighter. The photocell
is directed at the notepad and registers the brightness of the glow. As the laser
scans across the card we rasterise an image by measuring the resistance of the
photoresistor. So even though the photocell is a single sensor and receives light
from many different directions, we can still measure the reflectivity of a small
area on the playing card because the illumination is so narrow.
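The rasterisation loop amounts to a very small program. Here is a toy sketch in Python; the card pattern, the geometric gain, and the noiseless diffuse bounce are all stand-ins for the real apparatus, not measurements from it:

```python
import numpy as np

# Toy "card": a 2D reflectivity map (0 = black ink, 1 = white paper).
card = np.zeros((8, 8))
card[2:6, 3] = 1.0          # a bright vertical stripe

# The photocell sees the whole notepad, so each reading integrates
# light from every direction -- but only one spot is lit at a time,
# so the reading is proportional to that spot's reflectivity.
def photocell_reading(lit_y, lit_x, gain=0.1):
    # gain is a made-up constant for the fixed bounce geometry.
    return gain * card[lit_y, lit_x]

# Rasterise: scan the laser over the card, one reading per position.
image = np.array([[photocell_reading(y, x)
                   for x in range(card.shape[1])]
                  for y in range(card.shape[0])])

# Up to an overall scale, the single-pixel sensor recovers the card.
assert np.allclose(image / image.max(), card)
```

The real loop is the same shape: step the servos, wait for them to settle, sample the ADC, and store one pixel.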
So much for the theory; does it work? Well, here's the result:
The Seven of Spades.
Some things to note: the laser beam was several millimetres wide, limiting the resolution
I could obtain. The servos were cheap hobby servos, not really suitable for high precision work, and some scan lines show signs of where the motors occasionally became stuck.
The line down the left-hand side is caused by the fact that the setup was recording
photoresistor values even while the laser was 'flying back' across the card, so
the image data to the left of the line is a horizontally squeezed and inverted
image of the entire card. Note that laser light is monochromatic; I'd need to use
focussed coloured beams, or white beams with filters, to obtain colour pictures.
The image was lightly processed to bring the black and white points to sensible values.
I was measuring voltages across a potential divider made of a resistor and the
photoresistor and this, combined with the nonlinearity of the photocell response, meant
I had to at least rescale the values.
I also resized the image using high quality interpolation. ("Computer: Enhance!")
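The post-processing can be sketched in a few lines of Python. The fixed resistor value, the wiring of the divider (photoresistor on the low side), and the percentile choices here are illustrative assumptions, not the values I actually used:

```python
import numpy as np

R_FIXED = 10_000   # hypothetical fixed resistor in the divider (ohms)
ADC_MAX = 1023     # 10-bit ADC full scale on the ATMega32

def adc_to_brightness(adc):
    # With the photoresistor on the low side of the divider,
    # Vout/Vcc = R_ldr / (R_FIXED + R_ldr), so invert the divider:
    adc = np.asarray(adc, dtype=float)
    r_ldr = R_FIXED * adc / np.maximum(ADC_MAX - adc, 1.0)
    # LDR resistance falls as light rises, so invert to get brightness.
    return 1.0 / r_ldr

def stretch(raw):
    # Bring black and white points to sensible values by mapping
    # the 1st/99th percentiles of the readings to 0 and 1.
    b = adc_to_brightness(raw)
    lo, hi = np.percentile(b, [1, 99])
    return np.clip((b - lo) / (hi - lo), 0.0, 1.0)
```

The stretched array can then be resized with any good resampler, for example Pillow's `Image.resize` with bicubic interpolation.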
The formatted raw values are here.
To control the servos I used an Atmel AVR ATMega32 microcontroller board from
ERE which has
a built in analogue to digital converter. Just about any microcontroller, such as
a BASIC Stamp, would do. In fact, it's probably possible to cut out the microcontroller
and use the parallel port directly with an ADC chip.
There's a more abstract way of looking at this. The Maxwell equations which govern
the evolution of electromagnetic fields are time-reversal invariant. This means that
if you have a sensor measuring how much light is falling on it from a light source then
you'll get the same reading if you swap the source and sensor. In a sense you can think
of the source as a sensor and vice versa as you'll get the same result. This is known as Helmholtz Reciprocity (apparently).
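In matrix terms: if T is the light transport matrix, a measurement with source pattern l and sensor sensitivity s is the number s·Tl, and reciprocity says the dual transport (with source and sensor swapped) is the transpose of T, so the swapped measurement l·Tᵀs comes out the same. A toy check, with a random matrix standing in for a real scene:

```python
import numpy as np

rng = np.random.default_rng(0)

# Light transport matrix: T[i, j] = light reaching sensor element i
# when source element j emits one unit.  Random values as a stand-in.
T = rng.random((5, 7))

source = rng.random(7)   # emission pattern of the light source
sensor = rng.random(5)   # sensitivity pattern of the sensor

forward = sensor @ T @ source    # measure with the real layout
dual    = source @ T.T @ sensor  # swap roles; reciprocity says the
                                 # dual transport is T transposed
assert np.isclose(forward, dual)
```

The identity itself is just linear algebra; the physical content of Helmholtz reciprocity is that the swapped experiment really is described by the transposed matrix.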
In this case it's
as if the photocell is a lamp and the laser collects light along a narrow beam. Looked at
from this perspective the notepad acts like a bounce card ensuring that the lighting on the
playing card is diffuse. At first it seems like insisting on pointing the photocell at the
notepad makes things harder. But actually it acts as a virtual bounce card making the
virtual lighting of the card more uniform. In fact, if you check the first
image I produced, with the sensor pointing straight at the subject, you can see a large
highlight spoiling the image. This is a virtual reflection of the photocell. Using the notepad
eliminates these artifacts.
Swapping camera and light source isn't unusual in computer graphics. Ray-tracing software typically
simulates rays travelling from the camera to the light source (except when using techniques like
photon mapping). This camera works just like a ray-tracer.
A Virtual Photo of a Rubber Duck
Reading a playing card that isn't in your field of view is actually quite easy
if you can control the lighting. It doesn't require fancy physics or algorithms
at all. Note also that this is nothing new. It's almost identical to the way a barcode
scanner builds up an image of a barcode by scanning a laser across it.
The value of the Sen et al. paper is
not that it allows seeing round corners
but that they show how to extract an N pixel image
using only log(N) steps by using a camera instead of a photocell.
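One way to see where the logarithm comes from is binary-coded illumination, a simplification of the paper's actual adaptive scheme. Over log2(N) frames, light exactly the projector pixels whose index has bit k set; a camera pixel then records, frame by frame, the bits of the index of the projector pixel it sees. This toy sketch assumes each camera pixel sees exactly one projector pixel, which is the idealised case:

```python
import numpy as np

N = 16                                  # projector pixels
bits = N.bit_length() - 1               # log2(N) = 4 binary patterns

# Unknown correspondence: which projector pixel each camera pixel sees
# (a random permutation standing in for real scene geometry).
truth = np.random.default_rng(1).permutation(N)

# Frame k: light only the projector pixels whose index has bit k set.
decoded = np.zeros(N, dtype=int)
for k in range(bits):
    pattern = (np.arange(N) >> k) & 1   # which projector pixels are lit
    camera = pattern[truth]             # what each camera pixel records
    decoded |= camera << k              # accumulate one bit per frame

assert np.array_equal(decoded, truth)   # recovered in log2(N) frames
```

With a single photocell there is only one "camera pixel", so this parallelism is lost and you are back to N sequential measurements, which is exactly what my laser scanner does.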
When the paper first appeared on Slashdot there
was much discussion about security implications. I really don't think there are any.
This method may allow you to see round corners, but only at the expense of placing another
piece of equipment where the camera would have been, and in this case the virtual
camera emits light
making it unsuitable for covert operations.
- Assembler source for ATMega32
- My first image. Used fewer steps for scanning. You might just recognise it as an Altoids tin.
- The raw data straight from the microcontroller's RS232 port is here.
- Video of the camera in action. The photocell is visible in this video. Note the
easy to see glow on the notepad and how it varies over time.
Write feedback here.
Feel free to email me (Dan Piponi) at stirfry (at) sigfpe.com.
Back to Home.