Using an ultra-slow-motion camera that records at 24,000 frames per second, researchers at the Swiss Federal Institute of Technology in Lausanne (EPFL) captured a pulse of light as it bounced between a series of aligned mirrors.
According to Edoardo Charbon, head of the Advanced Quantum Architecture Laboratory at EPFL's School of Engineering, the MegaX camera behind this new video is the product of around 15 years of research into single-photon avalanche diodes (SPADs).
Light in flight is not ordinarily visible, but photons scatter off particles in the air, and with the right hardware and software that scattered light can be captured, as in the video shown above. The light was recorded using MegaX, a camera that can produce 3D representations and ‘perform in-depth segmentation of those representations,’ EPFL explains.
The camera also has a very fast shutter speed, as short as 3.8 nanoseconds, plus a very large dynamic range. In addition, the MegaX's pixel size, at 9 µm, is around 10 times larger than that of a standard digital camera, though the team is working to reduce it to 2.2 µm.
When discussing MegaX earlier this year, Charbon explained that the camera works by converting photons into electrical signals. Notably, the camera can measure how long a photon takes to reach its sensor, which yields distance information; this capability is commonly known as time-of-flight.
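The time-of-flight principle described above reduces to simple arithmetic: a photon's travel time multiplied by the speed of light gives the path length. The sketch below is purely illustrative and is not part of any MegaX software; the function name and the direct-reflection setup (photon travels out to a surface and back) are assumptions for the example.

```python
# Minimal sketch of the time-of-flight principle, assuming a simple
# direct-reflection setup. Names here are illustrative only.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to a surface from a photon's round-trip travel time.

    The photon travels to the target and back, so the one-way
    distance is half the total path length.
    """
    return C * round_trip_time_s / 2.0

# A 10-nanosecond round trip corresponds to roughly 1.5 metres.
d = tof_distance(10e-9)
```

At nanosecond resolution, each timing tick corresponds to tens of centimetres of range, which is why such a fast shutter matters for this kind of imaging.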
By combining the typical three dimensions with time-of-flight, MegaX is something of a 4D camera, giving it capabilities beyond those of the average camera.
A new study published on July 18 builds on this past research, detailing the first time scientists have captured 4D light-in-flight imagery using time-gated megapixel SPAD camera technology. This is in contrast to 3D light-in-flight capture, which has been achieved using other varieties of camera hardware.
The study explains that to capture the bouncing pulse of light, a machine learning technique replaced processing steps that would otherwise have been needed, such as dark noise subtraction and interpolation. The team combined time-of-flight and trajectory data with machine learning algorithms to plot the light's 3D path.
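To make the idea of plotting a 3D path from noisy position estimates concrete, here is a hedged stand-in: the study itself uses machine learning, but a simple least-squares line fit (via SVD) illustrates the same goal of recovering a straight trajectory segment from per-frame measurements. Everything in this sketch, including the synthetic data, is an assumption for illustration and not the authors' method.

```python
# Illustrative stand-in for trajectory reconstruction: fit a 3D line
# to noisy per-frame position estimates of a light pulse. This is
# ordinary least-squares fitting, not the study's ML approach.

import numpy as np

def fit_line_3d(points: np.ndarray):
    """Fit a 3D line to an (N, 3) array of points.

    Returns (centroid, direction): the line passes through the
    centroid along the unit direction vector, which is the first
    principal axis of the point cloud.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[0]

# Synthetic pulse positions along a known direction, with noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 50)[:, None]
true_dir = np.array([1.0, 2.0, -1.0])
true_dir = true_dir / np.linalg.norm(true_dir)
pts = t * true_dir + rng.normal(scale=0.01, size=(50, 3))

center, direction = fit_line_3d(pts)
```

A pulse bouncing between mirrors would be reconstructed as a sequence of such straight segments, one per leg of the bounce.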
Charbon recently explained to Digital Trends that the new study details the use of machine learning and the 4D data to reconstruct the positions of the light pulses. Though this may seem like a novelty to the average person, the technology could eventually be used in everything from robotic vision to physics research and virtual reality systems.
Notably, the researcher explained that all of the processing involved in capturing the bouncing light pulse was done on the MegaX camera itself. An abstract of the study is available here; the public can also access the full PDF of the study here.