Lidar-equipped, light-carrying drones create optimum 'rim lighting' for moviemakers
To better achieve a type of subject lighting called "rim lighting," researchers at the Massachusetts Institute of Technology (MIT; Cambridge, MA) and Cornell University (Ithaca, NY) have created a photographer's and moviemaker's lighting system that includes a small quadcopter drone carrying a white-light source and a lidar system.
In rim lighting, only the edge of the photographer's subject is strongly lit; however, this type of lighting is normally very difficult to achieve and maintain, especially when the subject is moving. The MIT/Cornell system incorporates the moviemaker's video camera, an image-processing unit to determine the instantaneous proportion of rim lighting in the image, and the drone itself, which moves around (using its lidar for positioning) to maintain optimum rim lighting.
The camera-mounted interface for the system was made simple and intuitive for use by nonspecialists.
The group will present their prototype system at the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging (Vancouver, Canada; 8-10 August 2014; co-located with SIGGRAPH).
According to Manohar Srikanth, who worked on the system as a graduate student and postdoc at MIT and who is now a senior researcher at Nokia, he and his coauthors—MIT professor of computer science and engineering Frédo Durand and Cornell's Kavita Bala, who also did her doctorate at MIT—chose rim lighting for their initial experiments precisely because it's a difficult effect.
Easy operation
To use the lighting system, the photographer indicates the direction from which the rim light should come, and the drone flies to that side of the subject. The photographer then specifies the width of the rim as a percentage of its initial value, repeating that process until the desired effect is achieved.
Thereafter, the robot automatically maintains the specified rim width. "If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he's looking 90 degrees away from you, then he's exposing his chest to the light, which means that you'll see a much thicker rim light," Srikanth says. "So in order to compensate for the change in the body, the light has to change its position quite dramatically."
In the same way, Srikanth says, the system can compensate for the photographer's movements; the calculations occur at a rate of 20 per second.
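The workflow described above amounts to a closed feedback loop: at 20 updates per second, measure how wide the rim appears in the frame, compare it to the photographer's target, and move the light around the subject to compensate. The sketch below is not the authors' code; it is a minimal proportional-control illustration under a hypothetical measurement model (`measured_rim_fraction`) in which the rim thickens as the light swings from behind the subject toward the camera.

```python
import math

TICK_HZ = 20.0      # control rate cited in the article
TARGET_RIM = 0.10   # desired rim-lit fraction of the subject silhouette
GAIN = 2.0          # proportional gain (hypothetical tuning)

def measured_rim_fraction(light_angle_rad: float) -> float:
    """Hypothetical stand-in for the image-processing step: the rim is
    thinnest when the light sits directly behind the subject (angle 0)
    and thickens as the light swings toward the camera."""
    return 0.02 + 0.3 * (1.0 - math.cos(light_angle_rad)) / 2.0

def control_step(light_angle_rad: float) -> float:
    """One 20 Hz update: measure the rim, compute the error, and command
    a small angular move of the drone around the subject."""
    error = TARGET_RIM - measured_rim_fraction(light_angle_rad)
    return light_angle_rad + GAIN * error / TICK_HZ

angle = math.radians(90.0)  # light starts off to the subject's side
for _ in range(400):        # ~20 seconds of simulated flight
    angle = control_step(angle)

print(round(measured_rim_fraction(angle), 2))
```

In a real system the measurement would come from segmenting the subject in the video frame and counting rim-lit edge pixels, and the drone's lidar would supply the position feedback needed to execute each commanded move; the loop structure, however, is the same.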
The researchers tested their prototype in a motion-capture studio, which uses a bank of high-speed cameras to measure the position of specially designed light-reflecting tags with millimeter accuracy; several such tags were affixed to the drone.
The purpose of the tests was simply to evaluate the control algorithm, which performed well; rim lighting doesn't require the millimeter accuracy of the motion-capture studio, notes Srikanth. "We only need a resolution of two or three centimeters," he says.
Source: https://www.regaalo.com/collegenews/feed/mit-top-news-massachusetts-institute-of-technology-news
John Wallace | Senior Technical Editor (1998-2022)
John Wallace was with Laser Focus World for nearly 25 years, retiring in late June 2022. He obtained a bachelor's degree in mechanical engineering and physics at Rutgers University and a master's in optical engineering at the University of Rochester. Before becoming an editor, John worked as an engineer at RCA, Exxon, Eastman Kodak, and GCA Corporation.