Drone-based Cinematography

Filming human motion with a camera drone is a challenging task, because the cameraman must manipulate a remote controller while designing the desired image composition in real time. To help inexperienced pilots capture cinematic videos, we propose several methods for automated aerial filming from three perspectives: learning-based, heuristic-based, and interactive-based.

Watching a large number of video clips captured by professional filmmakers is an effective way for beginners to learn video shooting skills and ultimately develop their own creative work. We therefore apply a "watching-learning-imitating" strategy to camera planning in automated filming tasks. We model the cameraman's decision-making process in two steps: 1) we train a network to predict the future image composition and camera position, and 2) our system then generates control commands to achieve the desired shot framing.

Learning-based Method

[1] Chong Huang, Yuanjie Dang, Peng Chen, Xin Yang and Kwang-Ting (Tim) Cheng, One-Shot Imitation Filming of Human Motion Videos, arXiv, 2019

[2] Chong Huang, Chuan-En Lin, Zhenyu Yang, Yan Kong, Peng Chen, Xin Yang and Kwang-Ting (Tim) Cheng, Learning to Film from Professional Human Motion Videos, CVPR 2019

[3] Chong Huang, Zhenyu Yang, Yan Kong, Peng Chen, Xin Yang and Kwang-Ting (Tim) Cheng, Learning to Capture a Film-Look Video with a Camera Drone, ICRA 2019
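The two-step pipeline described above (predict the next framing, then steer toward it) can be sketched as follows. This is a minimal illustration, not the published system: the linear extrapolator stands in for the trained prediction network, and the proportional controller, gains, and function names are all assumptions.

```python
def predict_next_framing(history):
    """Step 1 (placeholder for the learned predictor): given a short
    history of (subject_x, subject_size) screen observations, forecast
    the next desired composition by linear extrapolation."""
    (x0, s0), (x1, s1) = history[-2], history[-1]
    return (x1 + (x1 - x0), s1 + (s1 - s0))

def framing_to_command(current, desired, gain=0.5):
    """Step 2: convert the framing error into drone commands with a
    simple proportional controller. Horizontal offset drives yaw rate;
    subject size drives forward speed (subject too small -> fly closer)."""
    yaw_rate = gain * (desired[0] - current[0])
    forward_speed = gain * (desired[1] - current[1])
    return yaw_rate, forward_speed
```

In the actual system, the learned network replaces the extrapolator and its output feeds a full flight controller; the split shown here (prediction, then control) is the part the papers describe.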

The heuristic-based method employs heuristic aesthetic objectives to automate filming. The commonly used metrics (e.g., the subject's visibility) rely on accurate human-pose information. Existing methods obtain the human pose from wearable motion-capture sensors, but their reliance on infrared sensing restricts them to indoor environments. Our proposed method can autonomously capture cinematic shots of action scenes based on limb movements in both indoor and outdoor environments.

Heuristic-based Method

[1] Chong Huang, Fei Gao, Jie Pan, Zhenyu Yang, Weihao Qiu, Peng Chen, Xin Yang, Shaojie Shen, and Kwang-Ting (Tim) Cheng, ACT: An Autonomous Drone Cinematography System for Action Scenes, ICRA 2018
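To make the idea of a heuristic aesthetic objective concrete, here is a minimal sketch of a subject-visibility term combined with a centering penalty. The weights, function names, and the specific cost form are illustrative assumptions, not the published ACT objective.

```python
def visibility_score(joints_2d, width=1.0, height=1.0):
    """Fraction of the subject's projected skeleton joints that fall
    inside the image frame (1.0 means no limbs are clipped)."""
    inside = sum(1 for (u, v) in joints_2d
                 if 0.0 <= u <= width and 0.0 <= v <= height)
    return inside / len(joints_2d)

def aesthetic_cost(joints_2d, subject_offset, w_vis=1.0, w_center=0.5):
    """Cost of a candidate viewpoint: penalize clipped limbs and
    off-center framing. A planner would evaluate candidate viewpoints
    and fly the drone toward the minimum-cost one."""
    return w_vis * (1.0 - visibility_score(joints_2d)) \
        + w_center * abs(subject_offset)
```

Because such metrics score the projected limbs, their quality depends directly on accurate pose estimation, which is why removing the indoor motion-capture requirement matters.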

Interactive-based methods (i.e., through-the-lens methods) control the drone's position through interaction with the locations of on-screen targets. Compared with previous interactive-based methods, our proposed method estimates the subject's 3D pose and allows a cameraman to conveniently control the drone by manipulating a 3D model (created from the subject's pose) in the preview, which closes the gap between flight control and viewpoint design.

Interactive-based Method

[1] Chong Huang, Zhenyu Yang, Weihao Qiu, Peng Chen, Xin Yang, Shaojie Shen, and Kwang-Ting (Tim) Cheng, Through-the-Lens Drone Filming, IROS 2018
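The through-the-lens mapping can be sketched as follows: from the viewpoint the user composes on the 3D proxy in the preview, recover a camera placement that reproduces it on the real drone. The yaw-plus-distance parameterization and the function name are hypothetical simplifications of the actual interface.

```python
import math

def camera_pose_from_preview(subject_pos, view_yaw, distance):
    """Given the viewing direction (yaw, radians) and apparent distance
    the user composed by rotating/scaling the subject's 3D model in the
    preview, place the camera on the corresponding orbit around the
    subject and aim it back at the subject."""
    sx, sy, sz = subject_pos
    cam_x = sx - distance * math.cos(view_yaw)
    cam_y = sy - distance * math.sin(view_yaw)
    # Heading that points the camera from its new position at the subject.
    look_yaw = math.atan2(sy - cam_y, sx - cam_x)
    return (cam_x, cam_y, sz), look_yaw
```

The drone's flight controller then treats the returned pose as a setpoint, so the user reasons only about the composition on screen, never about stick inputs.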