In recent years, a host of Hollywood blockbusters, including "The Fast and the Furious 7," "Jurassic World," and "The Wolf of Wall Street," have included aerial tracking shots provided by drone helicopters outfitted with cameras. Those shots required separate operators for the drones and the cameras, and careful planning to avoid collisions. But a team of researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and ETH Zurich hopes to make drone cinematography more accessible, simple, and reliable. At the International Conference on Robotics and Automation later this month, the researchers will present a system that allows a director to specify a shot's framing: which figures or faces appear where, and at what distance. Then, on the fly, it generates control signals for a camera-equipped autonomous drone that preserve that framing as the actors move. As long as the drone's information about its environment is accurate, the system also guarantees that it won't collide with either stationary or moving obstacles.
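The core idea of framing-preserving control can be illustrated with a minimal sketch. This is a hypothetical illustration, not the CSAIL/ETH Zurich system: it assumes a perception pipeline already reports where a subject appears on screen (normalized image coordinates and apparent size), and uses a simple proportional controller to map the gap between the director's desired framing and the observed framing into drone velocity commands.

```python
# Hypothetical sketch: keep a subject at a director-specified screen
# position and apparent size using a proportional controller.
# All image quantities are normalized to [0, 1]; this is NOT the
# actual CSAIL/ETH Zurich controller, just an illustration of the idea.

def framing_error(subject_pos, subject_size,
                  target_pos=(0.5, 0.5), target_size=0.3):
    """Image-space error between the desired and observed framing."""
    ex = target_pos[0] - subject_pos[0]  # horizontal error (re-center left/right)
    ey = target_pos[1] - subject_pos[1]  # vertical error (adjust altitude/tilt)
    es = target_size - subject_size      # size error (move closer or farther)
    return ex, ey, es

def control_signal(subject_pos, subject_size, gain=1.5):
    """Map framing error to drone velocity commands (strafe, climb, approach)."""
    ex, ey, es = framing_error(subject_pos, subject_size)
    return (gain * ex, gain * ey, gain * es)

# The subject has drifted off-center and appears too small on screen,
# so the controller commands a lateral correction and a move closer.
vx, vy, vz = control_signal(subject_pos=(0.6, 0.5), subject_size=0.2)
```

In the real system, commands like these would additionally be filtered through a motion planner that rejects any trajectory intersecting stationary or moving obstacles, which is what provides the collision guarantee described above.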