In recent years, a host of Hollywood blockbusters, including "The Fast and the Furious 7," "Jurassic World," and "The Wolf of Wall Street," have included aerial tracking shots captured by camera-equipped drone helicopters. Those shots required separate operators for the drone and the camera, plus careful planning to avoid collisions.

A team of researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and ETH Zurich hopes to make drone cinematography more accessible, simpler, and more reliable. At the International Conference on Robotics and Automation later this month, the researchers will present a system that lets a director specify a shot's framing: which figures or faces appear where on screen, and at what distance. Then, on the fly, the system generates control signals for a camera-equipped autonomous drone that preserve that framing as the actors move. As long as the drone's information about its environment is accurate, the system also guarantees that it won't collide with either stationary or moving obstacles.
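To make the idea of the control loop concrete, here is a minimal sketch in Python of one way a framing-preserving, collision-aware controller could be structured. It is an illustrative assumption, not the researchers' actual method (which the article does not detail): the bearing-and-distance framing parameterization, the spherical obstacle model, and all function names are hypothetical.

```python
import numpy as np

# Hypothetical sketch: keep one subject framed at a desired viewing distance
# and bearing while steering the drone clear of known spherical obstacles.

def desired_camera_position(subject_pos, distance, bearing_deg, height):
    """Place the camera `distance` meters from the subject at a fixed
    horizontal bearing and height, so the subject keeps the same framing."""
    bearing = np.radians(bearing_deg)
    offset = np.array([np.cos(bearing), np.sin(bearing), 0.0]) * distance
    return subject_pos + offset + np.array([0.0, 0.0, height])

def avoid_obstacles(target_pos, obstacles, clearance):
    """Push the target position out of any obstacle's safety sphere.
    `obstacles` is a list of (center, radius) pairs."""
    pos = target_pos.copy()
    for center, radius in obstacles:
        to_pos = pos - center
        dist = np.linalg.norm(to_pos)
        min_dist = radius + clearance
        if dist < min_dist:                      # inside the safety sphere
            pos = center + to_pos / max(dist, 1e-6) * min_dist
    return pos

def velocity_command(drone_pos, target_pos, gain=1.0, v_max=3.0):
    """Proportional controller: fly toward the collision-checked target
    position, with speed capped at v_max m/s."""
    v = gain * (target_pos - drone_pos)
    speed = np.linalg.norm(v)
    return v if speed <= v_max else v / speed * v_max

# One control step: the actor has moved, so recompute where the drone should
# be to preserve the shot, adjust for obstacles, and command a velocity.
subject = np.array([10.0, 4.0, 0.0])            # actor position (m)
drone = np.array([6.0, 1.0, 3.0])               # current drone position (m)
obstacles = [(np.array([8.0, 3.0, 2.0]), 1.0)]  # one spherical obstacle

target = desired_camera_position(subject, distance=5.0, bearing_deg=210, height=3.0)
target = avoid_obstacles(target, obstacles, clearance=1.5)
print(velocity_command(drone, target))
```

Rerunning this step as the actor moves would keep the relative camera geometry, and hence the framing, roughly constant; the real system additionally reasons about moving obstacles and multiple on-screen subjects.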