Detection-Aware Trajectory Generation for a Drone Cinematographer

Filming video with drones is now common in both personal use and industrial inspection. A flying agent has to detect the target, localize it, and plan a chasing motion. However, several challenges can arise, such as occlusion by obstacles, motion blur, or color ambiguity between the target and the background.

Image credit: pixel2013 via Pixabay (free Pixabay licence)

A recent paper addresses the last of these problems: color ambiguity between the target and the background. This matters when detected targets must be classified, and it also improves the aesthetics of the footage. The paper proposes a detectability score metric and uses it to generate a trajectory for chasing a dynamic object.
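As a rough intuition for such a metric (the paper's exact formulation differs; `detectability_score` and its color-distance definition here are illustrative assumptions), one can score a candidate viewpoint by how far the target's color lies from the background colors that would appear behind it:

```python
import numpy as np

def detectability_score(target_rgb, background_pixels):
    """Illustrative detectability score (not the paper's exact metric):
    mean Euclidean RGB distance between the target color and the
    background pixels behind it. Larger means the target stands out more."""
    target = np.asarray(target_rgb, dtype=float)
    bg = np.asarray(background_pixels, dtype=float).reshape(-1, 3)
    dists = np.linalg.norm(bg - target, axis=1)  # per-pixel color distance
    return float(dists.mean())

# A white target against snow scores low; against red-brown brick it scores high.
snow  = [[250, 250, 250], [240, 245, 250]]
brick = [[150, 60, 40], [130, 55, 45]]
white_target = [255, 255, 255]
assert detectability_score(white_target, brick) > detectability_score(white_target, snow)
```

A viewpoint-selection scheme would evaluate this score over candidate camera positions and prefer those whose projected background contrasts with the target.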

To validate the algorithm, a walking actor in white clothes was filmed among piles of snow. The traveled distance was longer than with a plain chasing strategy, but the drone kept the actor in front of brick walls rather than snow backgrounds, so the detectability of the target was higher.

This work investigates efficient trajectory generation for chasing a dynamic target, incorporating a detectability objective. The proposed method actively guides the motion of a cinematographer drone so that the color of the target is well distinguished from the colors of the background in the drone's view. To this end, we define a measure of color detectability given a chasing path. After computing a discrete path optimized for this metric, we generate a dynamically feasible trajectory. The whole pipeline can be updated on the fly to respond to the motion of the target. For efficient discrete path generation, we construct a directed acyclic graph (DAG) for which a topological ordering can be determined analytically, without a depth-first search. The smooth path is then obtained within a quadratic programming (QP) framework. We validate the enhanced performance of state-of-the-art object detection and tracking algorithms when the camera drone executes the trajectory obtained from the proposed method.
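The discrete search can be pictured as dynamic programming over a time-layered DAG of candidate viewpoints: since every edge goes from layer t to layer t+1, the layers themselves give a topological order for free, which is why no depth-first search is needed. The sketch below is a minimal illustration of that idea, not the paper's implementation; `score` and `motion_cost` are assumed inputs.

```python
import numpy as np

def best_viewpoint_path(score, motion_cost):
    """DP over a time-layered DAG of candidate viewpoints.
    score[t, k]: detectability of viewpoint k at step t (higher is better).
    motion_cost[t, j, k]: cost of moving from viewpoint j at t to k at t+1.
    Edges only connect layer t to t+1, so visiting layers in time order
    is already a topological order -- no graph traversal is required."""
    T, K = score.shape
    value = np.full((T, K), -np.inf)
    parent = np.zeros((T, K), dtype=int)
    value[0] = score[0]
    for t in range(1, T):
        # value[t, k] = max_j (value[t-1, j] - motion_cost[t-1, j, k]) + score[t, k]
        cand = value[t - 1][:, None] - motion_cost[t - 1]  # shape (K, K)
        parent[t] = cand.argmax(axis=0)
        value[t] = cand.max(axis=0) + score[t]
    # Backtrack the best viewpoint index per time step.
    path = [int(value[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(parent[t][path[-1]]))
    return path[::-1]

# With zero motion cost, the path simply follows the highest score per layer.
score = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
motion_cost = np.zeros((2, 2, 2))
print(best_viewpoint_path(score, motion_cost))  # → [0, 1, 1]
```

The resulting discrete path would then serve as the reference that the QP stage smooths into a dynamically feasible trajectory.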
