Shaping the Dynamics of a Bidirectional Neural Interface

Abstract
Progress in decoding neural signals has enabled the development of interfaces that translate cortical activity into commands for operating robotic arms and other devices. Electrical stimulation of sensory areas provides a means to create artificial sensory information about the state of a device. Taken together, neural activity recording and microstimulation techniques allow us to embed a portion of the central nervous system within a closed-loop system, whose behavior emerges from the combined dynamical properties of its neural and artificial components. In this study we asked whether it is possible to concurrently regulate this bidirectional brain-machine interaction so as to shape a desired dynamical behavior of the combined system. To this end, we followed a well-known biological pathway. In vertebrates, communication between the brain and the limb mechanics is mediated by the spinal cord, which combines brain instructions with sensory information and organizes coordinated patterns of muscle forces that drive the limbs along dynamically stable trajectories. We report the creation and testing of the first neural interface that emulates this sensory-motor interaction. The interface establishes bidirectional communication between sensory and motor areas of the brain of anesthetized rats and an external dynamical object with programmable properties. The system includes (a) a motor interface that decodes signals from a motor cortical area, and (b) a sensory interface that encodes the state of the external object into electrical stimuli delivered to a somatosensory area. The interaction between brain activity and the state of the external object generates a family of trajectories converging upon a selected equilibrium point from arbitrary starting locations. Thus, the bidirectional interface makes it possible to specify not only a particular movement trajectory but an entire family of motions, including the prescribed reactions to unexpected perturbations.

Author Summary

Brain-machine interfaces establish new communication channels between the brain and the external world, with the goal of restoring sensory and motor functions to people with severe paralysis or sensory impairments. Current methodologies are based on decoding motor intent from recorded neural activity and transforming the extracted information into commands for controlling external devices such as robotic arms. We developed a novel computational approach based on the concept of programming dynamical behaviors through the bidirectional sensory-motor interaction between the brain and the connected external device. This approach emulates some of the control features of a biological interface, the spinal cord. The first prototype of our interface controls the state of motion of a simulated point mass in a viscous medium. The position of the point mass is encoded into a stimulus delivered to the somatosensory cortex of an anesthetized rat. The evoked activity of a population of motor cortical neurons is decoded into a force vector applied to the point mass. The parameters of the encoder and the decoder are set to approximate a desired force field. In the first test of the interface, we obtained a family of trajectories that converged upon a stable attractor.
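
To make the closed-loop scheme concrete, the following Python sketch simulates the kind of interaction described above: a point mass moving in a viscous medium, whose position is quantized by a simple encoder and mapped by a decoder to a force drawn from a field that converges on a chosen equilibrium point. The encode and decode functions here are placeholders for the cortical stimulation and decoding stages, and all numerical values (mass, viscosity, field gain, cell size, time step) are hypothetical choices for illustration rather than parameters of the actual experiments.

import numpy as np

MASS = 1.0            # simulated point mass (arbitrary units; hypothetical value)
VISCOSITY = 2.0       # drag coefficient of the viscous medium
STIFFNESS = 4.0       # gain of the desired convergent force field
CELL_SIZE = 0.25      # spatial quantization used by the placeholder encoder
EQUILIBRIUM = np.array([0.0, 0.0])   # selected equilibrium point
DT = 0.01             # integration time step

def encode(position):
    # Sensory interface (placeholder): quantize the position of the point mass
    # into a discrete cell, standing in for the stimulation pattern delivered
    # to the somatosensory area.
    return tuple(np.round((position - EQUILIBRIUM) / CELL_SIZE).astype(int))

def decode(cell):
    # Motor interface (placeholder): map each discrete sensory state to a fixed
    # force vector; together, the forces approximate a field converging on the
    # equilibrium point.
    cell_center = EQUILIBRIUM + np.array(cell) * CELL_SIZE
    return -STIFFNESS * (cell_center - EQUILIBRIUM)

def simulate(start, steps=3000):
    # Integrate m*dv/dt = F_decoded - b*v with forward Euler steps.
    position = np.array(start, dtype=float)
    velocity = np.zeros(2)
    for _ in range(steps):
        force = decode(encode(position))
        velocity += (force - VISCOSITY * velocity) / MASS * DT
        position += velocity * DT
    return position

# Trajectories started from arbitrary locations settle near the equilibrium.
for start in [(1.0, 1.0), (-1.5, 0.5), (0.5, -2.0)]:
    print(start, "->", np.round(simulate(start), 2))

Running the loop from several starting positions shows the trajectories settling near the selected equilibrium point, the convergent behavior that the bidirectional interface is designed to produce.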