Affordable consumer drones have made capturing aerial footage more convenient and accessible. However, shooting cinematic motion videos with a drone is challenging because it requires users to analyze dynamic scenes while operating the controller. Our goal is to develop an autonomous drone cinematography system that captures cinematic videos of human motion. To this end, we designed two autonomous drone cinematography systems, one heuristic-based and one learning-based.
Watching a large number of video clips captured by professional filmmakers is an effective way for beginners to learn video-shooting skills and ultimately develop their own creative work. We therefore designed an imitation-based system that learns the artistic intent of professional camera operators by watching their aerial videos. At its core is a camera planner that analyzes the video content and the previous camera motion to predict the future camera motion. Furthermore, we propose a planning framework that can imitate a filming style after "seeing" only a single demonstration video of that style; we call this "one-shot imitation filming." To the best of our knowledge, this is the first work to extend imitation learning to autonomous filming. Experimental results in both simulation and field tests show significant improvements over existing techniques, and our approach helped inexperienced pilots capture cinematic videos.
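The planner described above maps recent camera motion and scene observations to a future camera pose. As a hedged illustration only (the actual system uses a learned model trained on professional footage; the functions and numbers below are hypothetical), the core prediction-plus-framing loop can be sketched as a constant-velocity extrapolation of the camera trajectory combined with a look-at constraint that keeps the subject centered:

```python
import math

# Hypothetical sketch, NOT the paper's learned planner: predict the next
# camera position by extrapolating recent camera motion, then orient the
# camera toward the subject (a simple "look-at" framing constraint).

def predict_next_position(past_positions):
    """Constant-velocity extrapolation from the last two camera positions."""
    (x0, y0, z0), (x1, y1, z1) = past_positions[-2], past_positions[-1]
    return (2 * x1 - x0, 2 * y1 - y0, 2 * z1 - z0)

def look_at_yaw_pitch(cam, subject):
    """Yaw/pitch (radians) that point the camera at the subject."""
    dx, dy, dz = (s - c for s, c in zip(subject, cam))
    yaw = math.atan2(dy, dx)
    pitch = math.atan2(dz, math.hypot(dx, dy))
    return yaw, pitch

past = [(0.0, 0.0, 2.0), (0.5, 0.0, 2.0)]   # camera moving along +x at 2 m altitude
subject = (3.0, 0.0, 0.5)                   # tracked person

next_pos = predict_next_position(past)      # -> (1.0, 0.0, 2.0)
yaw, pitch = look_at_yaw_pitch(next_pos, subject)
```

In the actual system, the extrapolation step is replaced by a network that conditions on both the video content and the demonstration video's style, but the output has the same role: the next camera pose to execute.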
 Chong Huang, Yuanjie Dang, Peng Chen, Xin Yang and Kwang-Ting (Tim) Cheng, One-Shot Imitation Filming of Human Motion Videos, arXiv, 2019
 Chong Huang, Chuan-En Lin, Zhenyu Yang, Yan Kong, Peng Chen, Xin Yang and Kwang-Ting (Tim) Cheng, Learning to Film from Professional Human Motion Videos, CVPR 2019
 Chong Huang, Zhenyu Yang, Yan Kong, Peng Chen, Xin Yang and Kwang-Ting (Tim) Cheng, Learning to Capture a Film-Look Video with a Camera Drone, ICRA 2019
The heuristic-based method employs heuristic aesthetic objectives to automate filming. We designed an Autonomous CinemaTography system, "ACT," built around a viewpoint quality metric that focuses on the visibility of the subject's 3D skeleton. We further expanded the application of human motion analysis and simplified manual control by assisting viewpoint selection with a through-the-lens method.
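To make the idea of a skeleton-visibility metric concrete, here is a minimal sketch (an assumption for illustration, not the exact ACT metric): score a candidate viewpoint by the fraction of 3D skeleton joints that project inside the image and lie in front of the camera, using a simple pinhole model with camera rotation omitted.

```python
# Illustrative sketch only (not the exact ACT metric): score a candidate
# viewpoint by how many 3D skeleton joints project inside the image and
# in front of the camera -- a simple proxy for skeleton visibility.

def project(joint, cam, f=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of a world-frame joint into a camera at `cam`
    looking along +z (camera rotation omitted for brevity)."""
    x, y, z = (j - c for j, c in zip(joint, cam))
    if z <= 0:            # joint is behind the camera
        return None
    return (f * x / z + cx, f * y / z + cy)

def viewpoint_quality(skeleton, cam, width=640, height=480):
    """Fraction of joints that project inside the image bounds."""
    visible = 0
    for joint in skeleton:
        uv = project(joint, cam)
        if uv and 0 <= uv[0] < width and 0 <= uv[1] < height:
            visible += 1
    return visible / len(skeleton)

# Toy skeleton (head, shoulders, hips, feet) of a standing person at the origin;
# y points down, so the head has negative y.
skeleton = [(0.0, -0.9, 0.0), (-0.2, -0.7, 0.0), (0.2, -0.7, 0.0),
            (-0.1, 0.0, 0.0), (0.1, 0.0, 0.0),
            (-0.1, 0.9, 0.0), (0.1, 0.9, 0.0)]

frontal = viewpoint_quality(skeleton, cam=(0.0, 0.0, -3.0))  # -> 1.0, all joints visible
```

A planner built on such a score would evaluate candidate viewpoints around the subject and steer the drone toward a high-scoring one; the actual ACT metric additionally accounts for joint self-occlusion and aesthetic framing.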
 Chong Huang, Fei Gao, Jie Pan, Zhenyu Yang, Weihao Qiu, Peng Chen, Xin Yang, Shaojie Shen, and Kwang-Ting (Tim) Cheng, ACT: An Autonomous Drone Cinematography System for Action Scenes, ICRA 2018
 Chong Huang, Zhenyu Yang, Weihao Qiu, Peng Chen, Xin Yang, Shaojie Shen, and Kwang-Ting (Tim) Cheng, Through-the-Lens Drone Filming, IROS 2018