CMU depth-sensing camera works in full sunlight

Aug. 11, 2015
A new depth-sensing camera technology from CMU and U of T can capture 3D information even in brightly lit scenes.

IMAGE: A new depth-sensing camera technology developed by CMU and the University of Toronto can capture 3D information even in brightly lit scenes; a prototype is able to sense the shape of a lit CFL bulb (left) that would create blinding glare for a conventional camera (right). (Image credit: CMU)

A new imaging technology invented by Carnegie Mellon University (Pittsburgh, PA) and the University of Toronto (Toronto, ON, Canada) addresses a major shortcoming of other depth-sensing cameras such as Microsoft's Kinect controller for video games: the inability to work in bright light, especially sunlight.


The key to CMU's design is gathering only the bits of light the camera actually needs. The researchers created a mathematical model to help program these devices so that the camera and its light source work together efficiently, eliminating extraneous light, or "noise," that would otherwise wash out the signals needed to detect a scene's contours.

"We have a way of choosing the light rays we want to capture and only those rays," said Srinivasa Narasimhan, CMU associate professor of robotics. "We don't need new image-processing algorithms and we don’t need extra processing to eliminate the noise, because we don’t collect the noise. This is all done by the sensor."

One prototype based on this model synchronizes a laser projector with a common rolling-shutter camera--the type used in most smartphones--so that the camera detects light only from points being illuminated by the laser as it scans across the scene. This lets the camera work under extremely bright light or amid highly reflected or diffused light: it can capture the shape of a lightbulb that has been turned on, for instance, and see through smoke. It also makes the system extremely energy efficient. This combination of features could make the imaging technology suitable for many applications, including medical imaging, inspection of shiny parts, and sensing for robots used to explore the moon and planets. It also could be readily incorporated into most smartphones.

Depth cameras work by projecting a pattern of dots or lines over a scene. Depending on how these patterns are deformed or how much time it takes light to reflect back to the camera, it is possible to calculate the 3D contours of the scene.
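For the pattern-deformation (structured-light) case, depth follows from simple triangulation between the projector and the camera: the farther away a surface is, the less the projected pattern shifts in the image. The snippet below is a generic illustration of that relationship, not the researchers' code, and the focal length, baseline, and disparity values are hypothetical:

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classic structured-light/stereo triangulation: depth = f * b / d.
    disparity_px: apparent shift of the projected pattern, in pixels
    focal_px:     camera focal length, in pixels
    baseline_m:   projector-to-camera baseline, in meters"""
    disparity_px = np.asarray(disparity_px, dtype=float)
    return focal_px * baseline_m / np.maximum(disparity_px, 1e-6)

# Hypothetical setup: 600 px focal length, 7.5 cm baseline,
# measured pattern shifts of 30, 15, and 7.5 pixels.
print(depth_from_disparity([30.0, 15.0, 7.5], focal_px=600.0, baseline_m=0.075))
# -> [1.5 3.  6. ]  (depths in meters)
```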

The problem is that these devices use compact projectors that operate at low power, so their faint patterns are washed out and undetectable when the camera captures ambient light from a scene. But as a projector scans a laser across the scene, the spots illuminated by the laser beam are, if only briefly, brighter than the ambient light around them, noted Kyros Kutulakos, U of T professor of computer science.

In the prototype using a rolling-shutter camera, this is accomplished by synchronizing the projector so that as the laser scans a particular plane, the camera accepts light only from that plane. Alternatively, if other camera hardware is used, the mathematical framework developed by the team can compute energy-efficient codes that optimize the amount of energy that reaches the camera.
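A minimal sketch of that synchronization idea follows, under the simplifying assumption that the laser sweeps the scene top to bottom at a constant rate matched to the camera's row readout; it is an illustration, not the team's implementation:

```python
# Assumed/illustrative: compute when each camera row should be exposed so that
# it only collects light while the laser line is expected to cross it.

def exposure_window_for_row(row, num_rows, frame_time_s, exposure_fraction=0.001):
    """Return (start_s, end_s) within the frame during which `row` is exposed,
    assuming the laser sweep is aligned with the camera's rolling shutter."""
    t_center = (row + 0.5) / num_rows * frame_time_s      # laser crosses this row
    half_window = 0.5 * exposure_fraction * frame_time_s  # keep the shutter open briefly
    return max(0.0, t_center - half_window), min(frame_time_s, t_center + half_window)

# Example: row 240 of a 480-row sensor running at 30 frames per second.
print(exposure_window_for_row(240, num_rows=480, frame_time_s=1 / 30))
```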

In addition to enabling the use of Kinect-like devices to play video games outdoors, the new approach also could be used for medical imaging, such as imaging skin structures that otherwise would be obscured when light diffuses as it enters the skin. Likewise, the system can see through smoke despite the light scattering that usually makes smoke impenetrable to cameras. Manufacturers also could use the system to look for anomalies in shiny or mirrored components. Narasimhan said depth cameras that can operate outdoors could be useful in automotive applications, such as maintaining spacing between self-driving cars that are "platooned"--following each other at close intervals.

SOURCE: Carnegie Mellon University; http://www.cmu.edu/news/stories/archives/2015/august/depth-sensing-camera.html
