The achievement is likely to support studies of fragile biological materials, such as the human eye, that could be damaged or destroyed by higher levels of illumination. The development could also have applications for military surveillance, such as in a spy camera that records a scene with a minimum of illumination to elude detection.
To create detailed images using single photons, electrical engineer Ahmed Kirmani of the Massachusetts Institute of Technology in Cambridge and his colleagues developed an algorithm that takes into account correlations between neighbouring parts of an illuminated object as well as the physics of low-light measurements. The researchers describe their work online today in Science.
However, the algorithm developed by Kirmani and his colleagues provides that information using one-hundredth the number of photons required by existing light detection and ranging (LIDAR) techniques.
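The core idea can be illustrated with a toy sketch. This is not the authors' algorithm, just a minimal model of the setup it describes: each pixel is probed with laser pulses until a first photon arrives, the pulse count gives a noisy per-pixel reflectivity estimate, and correlations between neighbouring pixels are then used to clean up that estimate. All function names and parameters here are illustrative assumptions.

```python
import random

def simulate_first_photon_counts(reflectivity, seed=0):
    # Toy detector model (an assumption, not the paper's physics):
    # per pulse, a photon is detected with probability equal to the
    # pixel's reflectivity, so the number of pulses until the first
    # detection is geometrically distributed.
    rng = random.Random(seed)
    counts = []
    for row in reflectivity:
        crow = []
        for p in row:
            n = 1
            while rng.random() >= p:
                n += 1
            crow.append(n)
        counts.append(crow)
    return counts

def estimate_reflectivity(counts):
    # Pixelwise maximum-likelihood estimate for a geometric
    # distribution: p_hat = 1 / (pulses until first photon).
    return [[1.0 / n for n in row] for row in counts]

def smooth(est, weight=0.5):
    # Crude stand-in for the spatial regularization the article
    # alludes to: blend each pixel with the mean of its 4-neighbours,
    # exploiting the correlation between neighbouring parts of a scene.
    h, w = len(est), len(est[0])
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            nbrs = [est[x][y]
                    for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= x < h and 0 <= y < w]
            row.append((1 - weight) * est[i][j]
                       + weight * sum(nbrs) / len(nbrs))
        out.append(row)
    return out

if __name__ == "__main__":
    # Toy scene: a bright square (reflectivity 0.8) on a dim background (0.2).
    scene = [[0.8 if 1 <= i <= 2 and 1 <= j <= 2 else 0.2
              for j in range(4)] for i in range(4)]
    counts = simulate_first_photon_counts(scene)
    image = smooth(estimate_reflectivity(counts))
```

Even this crude sketch shows the trade-off at work: each pixel needs only one detected photon, and the neighbour smoothing compensates for the heavy noise that such a starved measurement leaves behind.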
I love when we humans advance science and technology.
A lot of the time though, I end up hating what it is used for.
And when I mention Deep Fields I always add my main complaint: NASA doesn't use Hubble to make more of them. At least devote a week each month to making Deep Fields. There've been what, three so far? More, more, and now with this new single-photon detecting tech, deeper, deeper (fields).