Predictive control, internal models, and neural circuitry underlying dragonfly interception steering


February 2, 2016 - 12:00pm
Northwest 243
About the Speaker
Anthony Leonardo (Janelia)

Sophisticated sensorimotor behaviors often rely on model-driven control. In vertebrates, such as primates, even simple targeted actions use forward models to predict the sensory consequences of self-movement, inverse models to generate the motor commands needed to attain desired sensory states, and physical models to predict target properties. In contrast, whether internal models are used by invertebrates has remained unresolved. Even high-precision behaviors like dragonfly prey capture are thought to be largely reflexive, with steering based on responses to prey angular velocity. However, having to perform brief, high-acceleration flights accurately without prediction would impose extreme demands on the dragonfly's reaction time and maneuverability. Here we explore the use of predictive control in dragonfly interception steering by developing an experimental system capable of simultaneously tracking the three-dimensional orientation of the dragonfly's head and body, along with the flight paths of pursuer and prey. We use this system to demonstrate that the dragonfly's head and body movements are consistent with being driven by forward, inverse, and target models. Movements of the head predict and null rotational and translational motion to hold the prey image fixed on the eye. This encodes the prey's predicted angular position as the angles between the dragonfly's head and body, and the prey's unexpected motion as the residual visual drift of the prey image. These variables are used to position and align the dragonfly's body directly below the prey at an orientation that facilitates capture. The dragonfly's steering strategy thus combines multi-sensory integration with predictive and reactive control to construct an interception trajectory that complies with body constraints.
A surprising consequence of this decomposition is that, if correct, the role of vision in this highly visual behavior is only to correct occasional errors rather than to provide continuous steering control. Behavioral data and a first look at components of the underlying neural circuitry for interception steering will be discussed.
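As a rough illustration of the predictive/reactive decomposition the abstract describes, the sketch below simulates a one-dimensional angular pursuit. This is not the authors' model; all parameters (time step, drift threshold, velocities, the timing of the prey maneuver) are hypothetical, chosen only to show the qualitative idea: a forward model drives the head to the predicted prey angle, so retinal drift stays near zero until the prey maneuvers unexpectedly, at which point a reactive visual correction re-aligns the head and refreshes the internal estimate.

```python
def simulate_interception(steps=200, dt=0.01, drift_threshold=0.02):
    """Toy 1-D sketch (hypothetical parameters) of predictive steering.

    The prey's angular position advances at a constant angular velocity,
    with one unexpected velocity change midway. The head is driven by
    the internal-model estimate of that velocity (the "forward model").
    Vision is used only when the residual image drift exceeds a
    threshold: the head snaps back to the prey and the velocity
    estimate is re-derived from the prey's last observed displacement.
    """
    prey = head = prev_prey = 0.0
    prey_vel = 1.0   # true prey angular velocity (rad/s), hypothetical
    est_vel = 1.0    # internal-model estimate, initially correct
    corrections = 0
    max_drift = 0.0
    for t in range(steps):
        if t == 100:
            prey_vel = -0.5      # unmodeled prey maneuver
        prev_prey = prey
        prey += prey_vel * dt    # true prey angle
        head += est_vel * dt     # predictive head movement
        drift = prey - head      # residual image drift on the eye
        max_drift = max(max_drift, abs(drift))
        if abs(drift) > drift_threshold:
            corrections += 1
            head = prey                          # reactive correction
            est_vel = (prey - prev_prey) / dt    # refresh the model
    return corrections, max_drift, abs(prey - head)
```

In this toy run, vision intervenes exactly once (at the prey's maneuver); the rest of the pursuit is carried by the prediction, mirroring the claim that vision serves mainly to correct occasional errors.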