
Review Article • DOI: 10.2478/s13380-014-0212-z • Translational Neuroscience • 5(1) • 2014 • 99-110


BRAIN-MACHINE INTERFACES: AN OVERVIEW

Abstract

Brain-machine interfaces (BMIs) hold promise to treat neurological disabilities by linking intact brain circuitry to assistive devices, such as limb prostheses, wheelchairs, artificial sensors, and computers. BMIs have experienced very rapid development in recent years, facilitated by advances in neural recordings, computer technologies and robots. BMIs are commonly classified into three types: motor, sensory and bidirectional, which subserve motor, sensory and sensorimotor functions, respectively. Additionally, cognitive BMIs have emerged in the domain of higher brain functions. BMIs are also classified as noninvasive or invasive according to the degree of their interference with the biological tissue. Although noninvasive BMIs are safe and easy to implement, their information bandwidth is limited. Invasive BMIs hold promise to improve the bandwidth by utilizing multichannel recordings from ensembles of brain neurons. BMIs have a broad range of clinical goals, as well as the goal to enhance normal brain functions.

Keywords

• Brain • Decoding • Interface • Microstimulation • Monkey • Multielectrode • Neuroprosthetic • Paralysis • Plasticity • Robot

© Versita Sp. z o.o.

Mikhail Lebedev

Duke University, Durham, North Carolina, U.S.A.

E-mail: lebedev@neuro.duke.edu

Received 15 March 2014; accepted 15 March 2014

When neural function goes wrong

All forms of mental activity have to be eventually expressed as muscle contractions and relaxations for us to be able to interact with the external world and to communicate with each other. Muscles control movements of the limbs and eyes, facial expression and speech production. Muscle contractions are also essential for many sensory functions, such as active tactile exploration. As our body parts move, their displacements are monitored by numerous sensory receptors. Ongoing streams of sensory and motor signals are processed by highly distributed brain networks. Much of this immense neural processing escapes our conscious awareness, and we take for granted our ability to effortlessly perform very complex tasks, such as dexterous hand movements, speech, bipedal walking, balance control and many others.

Unfortunately, there are cases when neural function goes wrong. Millions of people worldwide suffer from motor and sensory deficits caused by neurological conditions, such as spinal cord injury (SCI), stroke, Parkinson's disease and amyotrophic lateral sclerosis (ALS). Strikingly, many higher brain functions remain intact even in severe neurological cases where the patient is unable to move, speak and feel.

There is no efficient treatment for many motor and sensory disabilities. Patients remain bound to their beds or wheelchairs for the rest of their lives. Clearly, the development of efficient treatments for these people is one of the most important goals for medical science today.

Artificial parts for the brain

BMIs hold promise to revolutionize the treatment of severe neurological conditions by establishing direct functional connections between intact brain areas and assistive devices that restore motor and sensory functions [1-5] (Figure 1). For example, patients paralyzed by SCI could regain mobility owing to a motor BMI that connects neural circuits of the motor cortex to a robotic limb or an exoskeleton. Furthermore, somatosensory sensation could be restored to such patients using a sensory BMI that connects the somatosensory cortex to prosthetic sensors of touch and position.

In addition to medical applications, BMIs could aid healthy people in certain tasks. Examples include BMIs for computer gaming [7] and neurofeedback systems that alert long-distance drivers after signs of drowsiness are detected in their electroencephalograms (EEGs) [8].

In addition to the term "BMI", systems that interface the brain to artificial devices are called neural prostheses, brain-computer interfaces (BCIs), neural interfaces, mind-machine interfaces and brain implants. Although some of these names may be preferred over others in specific contexts, the terminology is commonly used interchangeably. The most futuristic authors would prefer the names "cyborg" and "avatar" to describe biological organisms with artificial parts or even entirely artificial bodies connected to the brain.

This review focuses on BMI systems that operate in the motor and sensory domains; BMIs for cognitive functions are not covered here. Accordingly, three BMI types are considered: 1) motor, 2) sensory, and 3) bidirectional (sensorimotor). This classification resembles a simplified model of the nervous system that labels areas as "sensory" or "motor". In reality, very few brain areas can be considered purely motor or purely sensory. Sensory and motor functions are intermixed in the brain, so the term "sensorimotor" is more appropriate for most areas [9,10]. Recently developed bidirectional BMIs can be viewed as a model of such conjoint sensorimotor processing [11].

Figure 1. Brain-machine interface (BMI) for reaching and grasping by a robotic arm. Extracellular activity of cortical neurons was sampled with multielectrode arrays implanted in multiple cortical areas of a rhesus monkey. These neuronal signals were decoded and redirected to the controller of a robotic arm equipped with a gripper. The position of the robotic arm was displayed to the monkey as a computer cursor. The cursor size conveyed information about the gripping force. The behavioral task required reaching and grasping virtual targets. In a manual version of the task, the monkey operated the robot by moving a two-dimensional joystick. The gripping force was produced when the monkey squeezed the joystick handle. To enact direct brain control, the joystick was disconnected from the robot, and the robot was operated by the motor commands extracted from the cortical activity. Adapted from [6].

The development of BMIs has been nothing short of spectacular in recent years. We have seen an exponential growth in BMI publications. Many of the published results would have been considered science fiction not so long ago, and nobody expected that they would become real so soon. Notwithstanding the success of proof-of-concept BMI demonstrations, a large number of these projects are at the stage of laboratory testing, and years of research will be required for them to translate to the clinical arena and to consumer applications. One notable exception is the cochlear implant, which has been extremely successful [12,13].

The prospect of BMIs intruding into the content of our minds, reading out thoughts and altering neural processing brings about ethical issues [14,15]. For instance, how can unwanted interactions of BMIs with the representation of self and free will be prevented? Such questions are particularly pertinent to BMIs that operate in the cognitive domain, such as BMIs that decode decisions [16] and handle memories [17]. Although many of the ethical issues related to BMIs seem far-fetched today, these questions will certainly become much more important in the future, when the development of artificial parts for the brain accelerates.

Brief history of BMI research

Multielectrode brain implants date back to the experiments conducted by Lilly in the 1950s [9]. He implanted several hundred electrodes in monkey cortical areas and used these implants to apply electrical stimulation to different cortical locations. Lilly observed that electrical stimulation of both motor and somatosensory areas evoked movements of body parts.

In the 1960s and 1970s, scientists started to experiment with neurophysiological recordings as a method to provide subjects with biofeedback of their own brain activity. Nowlis, Kamiya, Black and Sterman gave their subjects (animals and humans) biofeedback of their EEG rhythms [1]. The subjects learned to control cortical rhythms voluntarily.

In 1963, Walter conducted an experiment that can be considered the first demonstration of a BMI. He recorded readiness potentials in the motor cortex of patients undergoing neural surgery [18]. The patients were instructed to push a button in order to advance a slide projector. Readiness potentials occurred shortly before the button presses and were prominent enough to serve as a trigger for the projector. Accordingly, Walter connected the motor cortex to the projector directly. The subjects still continued to press the button, but the readiness potentials often completed the job before the subjects moved.

A few years later, scientists at the National Institutes of Health (NIH) announced that the development of neurally controlled devices was their research goal. Group leader Frank wrote, "We will be engaged in the development of principles and techniques by which information from the nervous system can be used to control external devices such as prosthetic devices, communications equipment, teleoperators... and ultimately perhaps even computers" [19]. To accomplish this goal, the scientists implanted several electrodes in monkey motor cortex and recorded from a small population of neurons [20]. The monkeys performed a wrist movement task. In an offline analysis of these recordings, multiple linear regression was applied to reconstruct the movement traces from the neuronal activity with good accuracy. Next, the reconstruction algorithm was implemented online, and the NIH monkeys were able to move a cursor on an LED display with their cortical modulations [21].

At about the same time, Fetz and his colleagues conducted pioneering studies on volitional control of single cortical neurons. Monkeys were operantly conditioned to voluntarily modulate firing rates of single cortical neurons [22]. Such control was possible even in the absence of overt limb movements.

In parallel to these studies on motor BMIs, research started on sensory BMIs. The cochlear implant was the most notable development [13]. In 1957, Djourno and Eyries evoked auditory sensations in deaf subjects using a single-channel implant that electrically stimulated the auditory nerve. In 1964, Simmons introduced the first multichannel cochlear stimulator, and in the 1970s House and Urban developed the first FDA-approved device.

Research also started on sensory BMIs for the restoration of vision. The groups of Brindley and Dobelle [23,24] applied electrical stimulation to the primary visual cortex in totally blind individuals. Their subjects perceived light spots, called phosphenes, and were able to recognize simple patterns composed of such phosphenes.

BMI research markedly accelerated in the late 1990s - early 2000s. Nicolelis and Chapin pioneered neural control of robotic limbs. They demonstrated a BMI that converted cortical and thalamic activity recorded in awake, behaving rats into the movements of a one-dimensional robot [25].

Nicolelis then started a research program in nonhuman primates. This work resulted in a number of key demonstrations, such as robotic arms controlled by monkey cortical ensembles [6,26,27], BMIs with artificial tactile feedback loops [11], BMIs for decoding locomotion patterns [28], and BMIs for bimanual movements [29].

Kennedy and his colleagues developed a neurotrophic electrode that induces growth of myelinated fibers into the recording tip. An ALS patient implanted with this electrode was able to perform neural on/off control [30].

The group led by Donoghue conducted a series of BMI studies in monkeys and humans. After being implanted with invasive multielectrode arrays in the motor cortex, human subjects attained BMI control of a computer cursor [31] and a robotic manipulator [32].

Schwartz and his colleagues employed intracortical recordings in monkeys to enable three-dimensional (3D) BMI control of a cursor [33] and a robotic arm [34]. Next, they involved human subjects and achieved control of an anthropomorphic robotic arm through an invasive BMI [35].

Important contributions came from the groups of Andersen, Shenoy and Vaadia who explored different cortical areas as sources of BMI signals. They also introduced a number of novel decoding algorithms.

In parallel to the studies on invasive BMIs, many groups have worked on noninvasive BMI systems. This research has produced a number of useful practical applications, such as EEG-based spellers and wheelchair controllers. Considerable contributions to noninvasive BMIs have been made by the laboratories of Birbaumer, Pfurtscheller, Wolpaw, Müller, Schalk, Neuper, Kübler, Millan and others.

Currently, many laboratories around the world work on different aspects of BMIs, and the BMI field is expanding. Additionally, commercialization of BMI systems has started, with several companies already developing and distributing medical and consumer BMIs.

Neuronal tuning and neural decoding

Motor BMIs extract motor parameters from the firing of brain neurons. How is this possible? Numerous neurophysiological studies have shown that the firing rates of individual neurons are correlated with behavioral parameters. For example, the firing of motor cortical neurons is correlated with arm position, arm velocity and joint torque. BMI developers utilize such correlations to decode behavioral parameters from neural signals.

Repeatability of neuronal patterns is the key factor that enables accurate BMI decoding. Consistent and recognizable patterns of neuronal activity that match specific behaviors are referred to as neuronal tuning. It should be noted that neurons are tuned to behavioral parameters in a noisy way. There is considerable randomness in the firing of each neuron (or what appears to us as randomness, but may represent some kind of brain signalling). The random component of neuronal activity makes the task of decoding more difficult.

Important studies of neuronal tuning were pioneered in the late 1950s and 1960s by Mountcastle in the somatosensory system [36], by Hubel and Wiesel in the visual system [37], and by Evarts in the motor system [10]. These founding fathers of neurophysiology employed sharp-tip electrodes to record extracellular activity of single neurons in various brain areas. They observed that single neurons produced fairly consistent patterns of discharges that represented a variety of sensory and motor parameters.

The methodology of single-unit recordings was employed in many subsequent studies. Wise and his colleagues observed that neurons in premotor cortex were tuned to movement direction several seconds before movements started [38]. In these experiments, monkeys were instructed about the movement direction, but had to withhold movement until a trigger stimulus occurred. Kalaska and his coworkers employed single-unit recordings and similar instructed-delay tasks to investigate the transformation of visual stimuli into the direction of voluntary movements [39].

Georgopoulos and his colleagues examined discharge patterns of single motor cortical neurons during arm reaching movements in different directions [40]. They found that neuronal rates changed gradually with changes in movement direction. Such broad tuning to direction was well approximated by a cosine function. To explain the representation of movement direction by neuronal populations, Georgopoulos introduced the concept of population vector, which was composed of the contributions from individual neurons. The population vector was found to match not only the ongoing movements, but also mental transformations, for example a 90° mental rotation from the spatial location of the instruction stimulus to the location of the movement target [41].
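To make cosine tuning concrete, the short sketch below fits a cosine tuning curve to simulated firing rates of a single hypothetical neuron; the parameter values and variable names are illustrative, not data from the studies cited above.

```python
import numpy as np

# Cosine tuning model: rate(theta) = b0 + m * cos(theta - theta_pd).
# Expanding the cosine makes the model linear in its parameters:
#   rate(theta) = b0 + a * cos(theta) + b * sin(theta)
rng = np.random.default_rng(0)
directions = np.deg2rad(np.arange(0, 360, 45))            # 8 reach directions
true_pd, b0, m = np.deg2rad(120), 20.0, 10.0              # hypothetical neuron
rates = b0 + m * np.cos(directions - true_pd) + rng.normal(0, 1, directions.size)

# Least-squares fit of the linearized model
X = np.column_stack([np.ones_like(directions), np.cos(directions), np.sin(directions)])
b0_hat, a, b = np.linalg.lstsq(X, rates, rcond=None)[0]

modulation_depth = np.hypot(a, b)                         # estimate of m
preferred_direction = np.arctan2(b, a)                    # estimate of theta_pd
print(np.rad2deg(preferred_direction), modulation_depth)  # approx 120 deg, 10
```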

Overall, these neurophysiological studies have shown that the firing rates of single neurons carry information about behavioral parameters. As such, single neurons can be used for BMI decoding. The following example illustrates how decoding works. Neurophysiologists often use a speaker to monitor the discharges of single neurons. An experienced neurophysiologist can tell what the monkey is doing just by listening to the speaker. For example, when the monkey reaches in the neuron's preferred direction, the sound gets louder, and the sound weakens when the monkey reaches in the opposite direction. A BMI decoder also "listens" to neuronal firing and tries to guess what kind of behavior or intention the "sound" represents. Importantly, the decoding is more accurate when the decoder "listens" to a "symphony" played by a large number of neurons simultaneously.

Extraction of information from neuronal ensembles

The larger the neuronal ensemble recorded, the better the accuracy of BMI decoding. The decoding improves because adding neurons increases the amount of information, and because noisy fluctuations in the firing of individual neurons cancel each other when they are grouped together for decoding [1,2].
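A toy simulation, under entirely made-up tuning and noise assumptions, illustrating why pooling neurons helps: independent noise partially cancels, so a simple least-squares decoder recovers a hidden behavioral variable with smaller error as the ensemble grows.

```python
import numpy as np

rng = np.random.default_rng(1)
noise_sd = 2.0
x_train = rng.uniform(-1, 1, 200)      # hidden behavioral variable (training)
x_test = rng.uniform(-1, 1, 200)       # held-out data for testing the decoder

def decode_error(n_neurons):
    # Each simulated "neuron" is linearly tuned to x with independent noise.
    gains = rng.uniform(0.5, 1.5, n_neurons)
    def simulate(xs):
        return np.outer(xs, gains) + rng.normal(0, noise_sd, (xs.size, n_neurons))
    r_train, r_test = simulate(x_train), simulate(x_test)
    # Ordinary least-squares decoder from ensemble rates back to x
    w, *_ = np.linalg.lstsq(r_train, x_train, rcond=None)
    return np.sqrt(np.mean((r_test @ w - x_test) ** 2))

for n in (1, 10, 100):
    print(n, "neurons -> test RMSE", round(decode_error(n), 3))
```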

This is not to say that small neuronal samples are useless for BMIs. In certain cases, a few neurons may be sufficient to operate a BMI [33,42]. There exist neurons with highly specialized response properties. Such neurons are often called grandmother neurons or Jennifer Aniston neurons because they are highly selective for one stimulus, such as an image of a grandmother or of Jennifer Aniston [43]. If the purpose of a BMI is to detect the presence of Jennifer Aniston, then recording from such a neuron can be handy.

Although the idea of having a specialized neuron for each parameter of interest may seem appealing, this is generally not the way the brain represents information. Neural representation of information is highly distributed, with individual neurons encoding multiple behavioral parameters and large populations of neurons providing the most accurate representations of those parameters [44]. This is why large neuronal ensembles and large-scale recordings are preferable for BMI control. In particular, recordings from large ensembles are critical when multiple behavioral variables have to be decoded simultaneously [28]. Additionally, ensemble recordings assure stability of BMI control [1]. As to Jennifer Aniston neurons, they are more likely to be discovered when there is a large neuronal sample to select from.

Decoding algorithms

BMI decoders employ statistical and machine-learning methods to optimally extract behavioral parameters from neural activity. The initial settings of a decoder are computed using a sample dataset, also called the training dataset. For example, in monkey experiments, training data are often derived from 5 to 10 minutes of performance during which the monkeys execute their motor task manually, for example using a joystick [6,11,27]. From this interval, the decoder is trained to extract motor parameters (position, velocity, force) from neuronal activity. The mode of operation is then switched to brain control, during which task performance (e.g., reaching targets with a computer cursor) is controlled by the decoder instead of the monkey's hand.

In addition to manual tasks, training data can be obtained by having a subject (monkey or human) passively observe movements performed by a prosthetic limb, or by asking the subject (human only) to imagine limb movements without producing them. These approaches to decoder training are particularly important for paralyzed patients who are unable to execute manual tasks.

Many BMI decoding algorithms have been proposed during the last several decades. The choice of algorithm in each particular case is dictated by the behavioral parameters of interest, the characteristics of the neural signal (single-unit recordings, field potentials, etc.), the number of recording channels, and the requirements of the behavioral task (e.g., continuous control of cursor position versus making discrete choices).

The population vector has been an influential approach to neural decoding [41]. In this method, the training dataset is derived from a center-out task during which the subject performs arm movements from a central location to different targets. The direction of movement for which a neuron exhibits the highest firing rate is called its preferred direction. Decoding of movement direction is performed by calculating the population vector, which is a vector sum of individual-neuron vectors pointing in each neuron's preferred direction and weighted by the firing rates of the corresponding neurons. Notwithstanding the many benefits of the population vector approach, this method is suboptimal because it does not include procedures that would minimize the decoding errors.
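A minimal sketch of population-vector decoding under simple assumptions (preferred directions, baseline rates and modulation depths are already known from a center-out training set); the simulated neurons and parameter values are illustrative.

```python
import numpy as np

def population_vector(rates, preferred_dirs, baselines, depths):
    """Decode a 2D movement direction as a weighted sum of unit vectors
    pointing in each neuron's preferred direction.

    rates             - firing rates of N neurons on one trial
    preferred_dirs    - preferred direction of each neuron (radians)
    baselines, depths - baseline rate and modulation depth used to normalize
    """
    weights = (rates - baselines) / depths                  # normalized modulation
    vx = np.sum(weights * np.cos(preferred_dirs))
    vy = np.sum(weights * np.sin(preferred_dirs))
    return np.arctan2(vy, vx)                               # decoded direction

# Illustrative use with simulated cosine-tuned neurons
rng = np.random.default_rng(2)
n = 50
pd = rng.uniform(0, 2 * np.pi, n)
b0 = rng.uniform(10, 20, n)
m = rng.uniform(5, 10, n)
true_dir = np.deg2rad(60)
rates = b0 + m * np.cos(true_dir - pd) + rng.normal(0, 1, n)
print(np.rad2deg(population_vector(rates, pd, b0, m)))      # approx 60 degrees
```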

The Wiener filter works in a very similar way to the population vector, but has improved accuracy because it minimizes the mean-square error [45]. The output of the Wiener filter for time t represents a weighted sum of neuronal rates measured at several points (lags or taps) in the past. In a typical BMI implementation, 5-10 lags span a 1 s interval preceding time t. The weights are computed for each neuron and each lag using matrix transforms of the training data.
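A minimal sketch of a Wiener-filter-style decoder as described above: firing rates at the current and preceding time bins form a lagged design matrix, and least squares maps it onto the recorded kinematics. The array shapes, bin size and the absence of regularization are simplifying assumptions.

```python
import numpy as np

def build_lagged_matrix(rates, n_lags):
    """Stack firing rates at the current and n_lags-1 preceding time bins.
    rates: (T, N) array of binned firing rates."""
    T, N = rates.shape
    X = np.zeros((T, N * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * N:(lag + 1) * N] = rates[:T - lag]
    return np.hstack([np.ones((T, 1)), X])                  # add intercept column

def train_wiener(rates, kinematics, n_lags=10):
    """Least-squares weights mapping lagged rates to kinematics (T, D)."""
    X = build_lagged_matrix(rates, n_lags)
    W, *_ = np.linalg.lstsq(X, kinematics, rcond=None)
    return W

def decode_wiener(rates, W, n_lags=10):
    return build_lagged_matrix(rates, n_lags) @ W

# Usage (with 100 ms bins, 10 lags span the preceding 1 s):
#   W = train_wiener(train_rates, train_hand_position)
#   decoded = decode_wiener(test_rates, W)
```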

The Kalman filter works better than the Wiener filter in many cases, particularly for behaviors that consist of several stereotyped patterns. The Kalman filter segregates the variables into state variables (e.g., limb position and velocity) and observable variables (neuronal firing rates). During decoding, states are updated at discrete time steps (typically 50-100 ms). The update consists of two computations. First, the next state is estimated from the current state. Second, this estimate is corrected using the values of neuronal rates. This latter correction utilizes a model of neuronal tuning that describes the relationship between the states and neuronal patterns.
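A minimal sketch of one decoding step of a Kalman-filter decoder, assuming the state-transition model (A, Q) and the neuronal tuning model (H, R) have already been fit from training data; all matrices here are placeholders rather than values from any published BMI.

```python
import numpy as np

def kalman_step(x, P, z, A, Q, H, R):
    """One predict/correct cycle of a Kalman-filter decoder.

    x, P - current state estimate (e.g., position, velocity) and covariance
    z    - observed neuronal firing rates for this time bin
    A, Q - linear state-transition model and its noise covariance
    H, R - tuning model (rates ~ H @ state) and its noise covariance
    """
    # 1) Predict the next state from the current state.
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    # 2) Correct the prediction using the observed firing rates.
    S = H @ P_pred @ H.T + R                       # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```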

The unscented Kalman filter (UKF) offers several improvements over the classical Kalman filter [46]. It adds nonlinearity to the neuronal tuning model to better describe the relationship between neuronal modulations and movements, and it also adds a time history of neuronal rates.

Artificial neural networks (ANNs) have been utilized as BMI decoders with good results [25,26]. Several ANN types are applicable to BMIs, for example the Gaussian classifier, multilayer perceptron, Bayesian logistic regression network, adaptive logic network and learning vector quantization network. Shenoy and his colleagues proposed a dynamical ANN, called a recurrent neural network (RNN), in which neuronal activity is treated as a function of its history [47].

Discrete classifiers are of interest for BMIs that require discrete choices instead of continuous decoding. For example, EEG spellers decode font characters from cortical potentials [48,49]. Several discrete classifiers have been utilized in BMIs: linear discriminant analysis, support vector machines, hidden Markov models, the k-nearest neighbors algorithm, and fuzzy logic decoders.
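A minimal sketch of a discrete BMI classifier, assuming scikit-learn is available: linear discriminant analysis is trained on placeholder feature vectors (e.g., EEG epochs following speller flashes) and applied to held-out epochs. With random placeholder data the accuracy is near chance; real data and labels would come from the speller paradigm.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder data: 200 epochs x 64 features (e.g., averaged EEG samples),
# labeled 1 for "attended stimulus" and 0 for "unattended".
rng = np.random.default_rng(3)
X = rng.normal(size=(200, 64))
y = rng.integers(0, 2, size=200)

clf = LinearDiscriminantAnalysis()
clf.fit(X[:150], y[:150])                  # train on the first 150 epochs
predicted = clf.predict(X[150:])           # classify the held-out epochs
print("accuracy:", np.mean(predicted == y[150:]))
```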

Motor BMIs and theories of motor control

Several theories of motor control were originally developed to explain the neuronal mechanisms of movement, but they are also relevant for the development of better BMIs.

Motor circuitry of the nervous system consists of many areas interconnected in a hierarchical order. Cortical areas are at the top of this hierarchy. They handle advanced motor functions, for example dexterous hand movements. The brainstem and the spinal cord control simpler, automated motor patterns, such as spinal reflexes [50] and rhythmic movements generated by central pattern generators (CPGs) [51].

The concept of the reflex arc [50] has been very influential in the field of motor control. A reflex is a fast, automated motor response evoked by a sensory stimulus. The circuitry of simple reflexes (e.g., the knee-jerk reflex) resides in the spinal cord. In contrast to reflexes, voluntary movements are programmed centrally in the brain, principally in the cortex, and are organized around a goal. Typical motor activities include both voluntary and reflex components [52]. Accordingly, BMI designs could benefit from enabling both reflex-like controls and voluntary movements. Such BMIs, called shared control BMIs, let the user control higher-order parameters (e.g., starting, stopping, or selecting the target of a movement), while delegating lower-order controls to a robotic controller (e.g., stabilization of a prosthetic arm against external forces), as sketched below.
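A minimal sketch of the shared-control idea under stated assumptions: the decoded command contributes only the high-level decisions (whether to move and toward which target), while a hypothetical low-level proportional controller handles the trajectory and stabilization. The command format and gain are placeholders.

```python
import numpy as np

def shared_control_step(decoded_command, arm_state, target_positions):
    """Combine a high-level decoded command with low-level robotic control.

    decoded_command  - dict with 'move' (bool) and 'target_id' (int),
                       assumed to come from the BMI decoder
    arm_state        - current end-effector position (3-vector)
    target_positions - candidate target positions, indexed by target_id
    """
    if not decoded_command["move"]:
        # Low-level controller: hold the current position against perturbations.
        return np.zeros(3)
    # High-level choice from the user; trajectory details from the controller.
    target = target_positions[decoded_command["target_id"]]
    gain = 0.1                                    # simple proportional controller
    return gain * (target - arm_state)            # velocity command toward target
```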

Many modern theories of motor control build around the idea that the brain uses an internal model of the body to organize motor activities. This concept was first introduced by Head and Holmes who proposed the term "body schema" to explain how the brain keeps track of the body configuration and continuously updates the representation of the body based on sensory inputs [53]. The concept of the body schema is highly relevant to modern BMI research. Indeed, BMI developers strive to build neurally controlled limbs that gradually become accepted by the brain as natural appendages of the body and eventually become incorporated in the brain representation of the body [1].

Stemming from the ideas of Head and Holmes, the internal model theory [54] considers two components of the motor system: the plant (e.g., an arm with its joints and muscles) and the controller (the neural circuitry that controls the arm). The controller builds a mechanical model of the plant, which is used to program future motor states. These programmed motor states are compared with the sensory feedback that the controller receives from the plant. If a discrepancy is detected, the controller issues a correction command. The equilibrium-point hypothesis [55] describes one possible implementation of the controller. According to this hypothesis, higher-order motor regions set a new equilibrium point for the plant, after which a spinal servo mechanism brings the plant to the equilibrium point.

BMIs that enable arm movements

Reaching and grasping movements are vital for our everyday activities. Unsurprisingly, many BMI developers have focused primarily on this type of movement.

Figure 1 shows the main components of a BMI for reaching and grasping implemented in rhesus monkeys [6,27]. In this study, monkeys received multielectrode implants in several cortical areas that are known to be involved in the control of arm movements: primary motor cortex, primary somatosensory cortex, premotor cortex, supplementary motor area and posterior parietal cortex. The animals learned to use their cortical modulations to control reaching and grasping movements performed by a robotic arm. The monkeys faced a computer screen where the robot position was represented by a circular cursor, the cursor diameter indicated the gripping force, and targets were represented by circles of variable size. The behavioral task required grasping virtual targets that appeared at different spatial locations. The monkeys first learned to move the robot with a joystick. They displaced the joystick to change the robot position and squeezed the joystick handle to exert a gripping force. Multiple Wiener filters [45] decoded the parameters of robot movements and gripping force from cortical modulations. During brain control, the robot was driven by the outputs of the filters. Two modes of brain control were tested. During brain control with hand movements, the monkeys continued to manipulate the joystick. During brain control without hand movements, the joystick was removed from the setup.

Schwartz and his colleagues enacted BMI control of reaching movements in 3D. Their monkeys viewed a virtual-reality cursor through stereoscopic goggles [33]. The cursor was initially positioned in the center of the workspace. On each trial, a spherical target appeared at a peripheral 3D location, after which the monkey had to reach to it with the cursor. During manual control, the monkeys positioned the cursor by waving their arm in the air. During brain control, the cursor was controlled by a population vector decoder. The performance markedly improved after a co-adaptive algorithm was introduced that minimized the deviation of the BMI-generated trajectories from the ideal trajectories connecting the center and the target. Several years later, the same group demonstrated a BMI where monkeys fed themselves with a multi-jointed robotic arm controlled by motor cortical activity [34].

Following these demonstrations in monkeys, the laboratories of Donoghue [32] and Schwartz [35] implanted paralyzed patients with invasive cortical arrays and implemented BMIs that control reaching and grasping performed by sophisticated robotic arms.

Meanwhile, the group of Nicolelis developed a BMI that controlled two virtual arms simultaneously [29]. Several hundred neurons were sampled simultaneously in multiple cortical areas. Training data were provided by sessions of passive observation. During brain control, the two virtual arms were driven by a UKF that represented both arms conjointly. Marked improvements in performance were observed with training. These improvements were accompanied by changes in correlation between neurons, which decreased as the monkeys perfected their performance.

Functional electrical stimulation

An alternative to using robotic arms and exoskeletons is to connect cortical output to the subject's own muscles and use functional electrical stimulation (FES) to produce movements. Several BMIs of this kind have been reported.

Pfurtscheller and his colleagues attached an FES device to the forearm of a tetraplegic patient [56]. BMI control was driven by EEG beta rhythms, which the subject produced by imagining foot movements. The subject learned to grasp objects using this BMI.

Fetz and his colleagues demonstrated a similar BMI control using invasive recordings from monkey motor cortex [42]. After a monkey's hand was temporarily paralyzed with an anesthetic, activity of motor cortical neurons was converted into FES patterns that evoked torques around the wrist joint.

The group led by Miller developed a more advanced cortical control of an FES device [57,58]. Monkey hands were paralyzed by applying an anesthetic to the median and ulnar nerves at the elbow. Approximately one hundred neurons were recorded in the motor cortex. Their activity drove FES applied to several forearm muscles. Aided by this BMI, the monkeys learned to grasp objects.

BMI for bipedal walking

For many years, BMI research focused on the upper limbs, while the functionality of the lower limbs was practically neglected. Yet, lower limb paralysis presents a significant problem. Millions of people worldwide suffer from this type of deficit caused by SCI and neurodegenerative diseases. Additionally, there is a sizeable population of lower limb amputees who may benefit from a leg prosthesis controlled through a BMI.

The researchers at the Nicolelis laboratory were the first to test the possibility that kinematics of bipedal locomotion can be extracted from the primate cortex [28]. Figure 2 illustrates the settings of this experiment. Rhesus monkeys were trained to walk bipedally on a treadmill. While the monkeys performed this task, neuronal ensemble activity was recorded from cortical sensorimotor areas representing the lower limbs. Monkey leg movements were tracked with video cameras. A BMI decoder was set to extract leg kinematics from the cortical modulations. The decoding worked well for both forward and backward walking.

Inspired by these results, Nicolelis and his colleagues founded the Walk Again Project, an international consortium for the development of the first cortically driven exoskeleton [2]. Nicolelis expects that such an exoskeleton will restore mobility to patients suffering from various degrees of leg paralysis. A similar endeavor, called Mindwalker, started in Europe [59]. Additionally, Contreras-Vidal and his colleagues proposed to drive a leg exoskeleton with EEGs. In support of this idea, they decoded gait kinematics from the EEGs recorded in human subjects walking on a treadmill [60].

In addition to cortically controlled BMIs for the restoration of walking, there are alternative strategies based on the idea that locomotion can be induced by reactivation of the spinal CPG. Such reactivation was implemented in rats [61]: locomotion was triggered in rats with spinal cord transections using a combination of epidural electrical stimulation and pharmacological activation with serotonergic agonists.

Perhaps in the future, a hybrid, cortico-spinal BMI will be tested with voluntary aspects of locomotion controlled by cortical activity and automated patterns generated by the spinal CPG.

Neuronal plasticity associated with BMI usage

Many studies have provided evidence of brain plasticity associated with learning to control a BMI. It has been suggested that such plasticity could eventually assimilate prosthetic limbs in the brain representation of the body [1,44].

BMI control of external devices has much in common with using a tool. In a famous experiment on cortical plasticity associated with tool usage, Iriki and his colleagues trained monkeys to reach toward distant objects with a rake [62]. After the monkeys practiced with this tool, neurons in posterior parietal cortex acquired visual receptive fields extending along the length of the rake.

Long-term usage of BMIs could produce similar brain remapping, and several papers have already provided evidence for such plasticity. Neurons engaged in BMI control exhibit stronger modulations [63], changes in correlation with each other [6,29], and changes in directional tuning [27]. Additionally, cortical neurons have been shown to adapt to rotational transformation applied to a subset of neurons engaged in BMI control [64].

As a note of caution, quantification of neuronal adaptations to BMI control is not an easy problem, because the tuning of a single neuron to motor parameters depends on the BMI decoder parameters and on the activity patterns of both the neuron in question and the other neurons in the population. This analysis is further complicated by the complexities of BMI decoders. For example, linear BMI decoders typically use more than twenty parameters per neuron, making the relationship between the decoder settings and conventional directional tuning curves very complicated. To complicate the situation even further, improvements in BMI-generated trajectories with training have an effect by themselves on the estimates of directional tuning.

Figure 2. Extraction of locomotion kinematics from cortical ensemble activity. Neuronal ensembles were recorded in the sensorimotor cortex while monkeys walked bipedally on a treadmill. Blue curves represent actual movements tracked with a video tracking system. Red curves represent decoded movements. Adapted from Fitzsimmons et al. [28].

Noninvasive BMIs

Safety is an important consideration when choosing a BMI system. The safest, noninvasive BMIs utilize sensors that sample neural signals without penetrating the biological tissue. Many practical noninvasive BMIs have been developed, such as BMIs for communication, prosthetic control and wheelchair navigation [65-68]. Remarkably, severely impaired 'locked-in' patients were able to communicate with the outside world using EEG-based spelling devices [48,49].

EEG is the most common noninvasive method. EEG-based BMIs are categorized into two classes: independent (endogenous) and dependent (exogenous). In an independent BMI, subjects voluntarily modulate their brain potentials, for example by imagining their hands moving. Several EEG rhythms are useful for this purpose: slow cortical potentials, mu (8-12 Hz), beta (18-30 Hz) and gamma (30-70 Hz) waves [4]. Adaptive decoding algorithms have been shown to improve the accuracy of independent BMIs [69].
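A short sketch of how mu- and beta-band power might be estimated from a single EEG channel using Welch's method from scipy; the sampling rate, window length and placeholder signal are assumptions, while the band edges follow the values quoted above.

```python
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, band):
    """Average spectral power of one EEG channel within a frequency band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # 2 s analysis windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

fs = 256                                             # assumed sampling rate, Hz
eeg = np.random.default_rng(4).normal(size=fs * 10)  # placeholder 10 s recording
mu_power = band_power(eeg, fs, (8, 12))
beta_power = band_power(eeg, fs, (18, 30))
print(mu_power, beta_power)
```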

A dependent BMI requires an external source of sensory stimulation. Typically, stimuli appear on a computer screen. Such a BMI reads out subjects' intentions from the differences in the brain response to attended versus unattended stimuli. For example, a BMI based on visual evoked potentials (VEPs) detects stronger VEPs over the visual cortex when subjects look at a stimulus and attend to it. BMIs based on steady-state visual evoked potentials (SSVEPs) employ visual responses to rapid periodic stimuli for this purpose [70]. Several objects appear on the screen, each flickering at its own frequency. Subjects make selections by looking at one of the objects. P300 evoked potentials are employed in a similar way [71].
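A minimal sketch of SSVEP-based target selection: spectral power at each object's flicker frequency is compared and the largest one wins. The flicker frequencies and the simulated EEG are placeholders, not parameters from any particular study.

```python
import numpy as np

def detect_ssvep_target(eeg, fs, flicker_freqs):
    """Pick the flickering object whose frequency dominates the EEG spectrum."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in flicker_freqs]
    return int(np.argmax(powers))                    # index of the attended object

# Placeholder: 4 s of EEG dominated by a 15 Hz response plus noise
fs = 256
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 15 * t) + np.random.default_rng(5).normal(0, 1, t.size)
print(detect_ssvep_target(eeg, fs, [8.0, 10.0, 12.0, 15.0]))   # -> 3
```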

EEG-based BMIs have achieved many important milestones, such as control of a robot [72], wheelchair navigation [65], and control of a hand orthosis [56,73].

A note of caution should be made about EEG artifacts. A survey of papers on EEG-based BMIs revealed that EEG artifacts were not handled adequately in many studies [74]. This is troublesome for BMI applications because artifacts can be confused with neural signals and can even end up being used as informative control signals. Dependent BMIs fare better in this respect because they are less sensitive to artifacts than independent BMIs.

Electrocorticographic (ECoG) BMIs work similarly to the ones based on EEGs while offering a neural signal of better quality. This invasive method has better temporal and spatial resolution compared to EEGs, and is less sensitive to artifacts.

In addition to electrical potentials, magnetoencephalography (MEG) has been utilized in BMIs [75]. Weak magnetic fields generated by the brain are detected by very sensitive magnetometers, superconducting quantum interference devices (SQUIDs). The measurements are conducted in a magnetically shielded facility. MEG offers better spatial and temporal resolution than EEG, but requires specialized recording settings.

Near-infrared spectroscopy (NIRS) employs infrared light that penetrates the skull to detect changes in oxyhemoglobin and deoxyhemoglobin concentrations in the brain's blood supply. NIRS-based BMIs have gained popularity [76]. This method allows measuring metabolic activity of the cortex with a spatial resolution of approximately 1 cm and a temporal resolution on the order of 100 ms. The drawback is a considerable response delay of several seconds.

Functional magnetic resonance imaging (fMRI) is a powerful method for detecting hemodynamic responses in the brain. fMRI-based BMIs have been implemented [76]. Their temporal resolution is 1-2 s, and the response lag is several seconds. The spatial resolution is excellent throughout the entire brain, which makes this method quite unique compared to other noninvasive approaches.

Sensory BMIs

Sensory BMIs strive to repair neural circuitry that handles sensory information. Here, many sensory modalities could be approached with BMI methods: hearing, sight, touch, smell, taste, vestibular sensation, and proprioception.

Sensory systems consist of hierarchically organized areas that process information originating from peripheral receptors. The signals generated by sensory receptors are transmitted to the spinal cord and brainstem. From there, they flow to higher-order areas: the thalamus, cerebellum, cortex and basal ganglia. The neural representation of sensory information in cortical areas is often called a map or a homunculus - a little person with large hands and face, a small trunk and tiny legs [77].

Sensory impairments can range from a complete loss of sensation after damage to the sensory periphery (e.g., blindness, deafness) to deficits that do not remove sensations completely, but rather affect higher-order components of sensory processing. For example, patients with lesions to the visual cortex may have blindsight [78]. They do not consciously perceive visual stimuli (cannot see), but can still utilize visual information subconsciously.

Current sensory BMIs do not go as far as trying to handle such higher-order deficits (and how would one repair blindsight?). Their major focus is damage to peripheral sensors. Sensory BMIs attempt to substitute artificial sensors for damaged physiological sensors. The artificial sensors are interfaced to intact sensory areas, usually using electrical stimulation as the method to evoke sensations [11,79,80]. Recently, optogenetic stimulation methods have been gaining popularity [81].

The sensory BMI approach is somewhat similar to the method called sensory substitution. In sensory substitution, an impaired sensory channel is substituted by an intact physiological sensation, for example sensation from a different body part or a different sensory modality. In this manner, surrogate vision can be produced by a video camera connected to a tactile display on the skin [82]. Note that sensory substitution relies on intact physiological receptors, whereas sensory BMIs employ artificial methods to evoke sensations. In some cases there is no clear-cut distinction between these approaches, such as in the case of vision generated by electrical stimulation of the tongue [83,84].

Auditory prosthesis

The cochlear implant is the most successful sensory BMI developed to date [12,13]. Subjects who have received these implants can recognize speech, distinguish between male and female voices and even perceive melodies. Bilateral cochlear implants can restore directional hearing. The implant consists of six parts: (1) an external microphone; (2) a speech processor that transforms the microphone signal into a stimulation pattern; (3) a transmitter that transmits the stimulation pattern through the skin; (4) a receiver/stimulator attached to the bone under the skin; (5) a cable running from the stimulator to the electrodes; and (6) an array of stimulating electrodes implanted in the cochlea.

The implant applies patterned electrical stimulation to the parts of the auditory nerve that remain undamaged. The stimulating array serves to activate different parts of the nerve with different patterns. This stimulation strategy utilizes from 4 to 22 channels. Several types of stimulation patterns have been proposed. In the continuous interleaved sampling (CIS) strategy, the microphone signal is split into frequency bands, the envelope of each band is extracted, and a nonlinear function is applied to compress the wide range of sound intensities into the narrow range of usable stimulation levels, which are delivered to the electrodes as interleaved pulse trains. Several other processing schemes have been implemented based on algorithms that continuously analyze the microphone signal and select the electrodes for each frame of stimulation.
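A simplified, illustrative sketch of CIS-style processing: a band-pass filter bank, crude envelope extraction and logarithmic compression map an audio frame onto per-electrode stimulation levels. The filter orders, band edges and compression curve are assumptions, not device specifications; actual implants interleave biphasic pulses across electrodes in time.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def cis_levels(audio, fs, band_edges, out_range=(0.0, 1.0)):
    """Map an audio frame to per-channel stimulation levels (CIS-style sketch).

    band_edges - list of (low, high) Hz tuples, one per electrode channel
    """
    levels = []
    for low, high in band_edges:
        sos = butter(4, [low, high], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, audio)
        envelope = np.mean(np.abs(band))                       # crude envelope
        compressed = np.log1p(100 * envelope) / np.log1p(100)  # compress dynamics
        lo, hi = out_range
        levels.append(lo + compressed * (hi - lo))
    return np.array(levels)   # one level per electrode; pulses are interleaved in time

# Placeholder: 20 ms of a 1 kHz tone analyzed with 4 bands
fs = 16000
t = np.arange(0, 0.02, 1 / fs)
audio = np.sin(2 * np.pi * 1000 * t)
print(cis_levels(audio, fs, [(200, 700), (700, 1500), (1500, 3000), (3000, 6000)]))
```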

A brainstem implant has been developed for patients with severe damage to the cochlea [12]. This device stimulates the cochlear nucleus through surface or penetrating electrodes. Several patients with these implants had low-quality sound recognition; in others, performance reached the benchmarks of cochlear implants.

Visual prosthesis

Current visual prostheses can restore simple visual sensations [85]. There are two main classes of such prostheses: 1) retinal and 2) non-retinal prostheses. Retinal prostheses are intended for eye pathologies that spare parts of the optic nerve. Non-retinal prostheses are utilized when there is severe damage to the optic nerve, so the stimulation has to be applied to visual areas of the brain.

Depending on the state of retinal degeneration, different types of retinal prostheses have been proposed. In the epiretinal implant, stimulation is applied to the layer of nerve fibers of retinal ganglion cells. The stimulation is produced by an intraocular electrode array (up to 60 stimulation channels). The array presents images captured by a video camera. The developers of these prostheses expect that eventually the camera and all other components will be implanted inside the eye. Patients implanted with this prosthesis perceive object shape, brightness and motion direction.
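A minimal sketch of how a camera frame might be reduced to a coarse grid of stimulation amplitudes for an electrode array of roughly 60 channels; the grid layout, the brightness-to-current mapping and the current range are assumptions, not parameters of any actual implant.

```python
import numpy as np

def frame_to_stimulation(frame, grid_shape=(6, 10), max_current_ua=100.0):
    """Downsample a grayscale camera frame to per-electrode current amplitudes.

    frame      - 2D array of pixel intensities in [0, 255]
    grid_shape - assumed electrode grid, e.g. 6 x 10 = 60 channels
    """
    h, w = frame.shape
    gh, gw = grid_shape
    # Average pixel intensity within each electrode's patch of the image.
    patches = frame[:h - h % gh, :w - w % gw].reshape(gh, h // gh, gw, w // gw)
    brightness = patches.mean(axis=(1, 3)) / 255.0
    return brightness * max_current_ua            # brighter region -> stronger pulse

frame = np.random.default_rng(6).integers(0, 256, size=(120, 160)).astype(float)
print(frame_to_stimulation(frame).shape)          # (6, 10)
```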

The subretinal prosthesis applies electrical stimulation to the ganglion cells and bipolar cells. It incorporates an array of several thousand microphotodiodes, which sense visual stimuli and map them to an array of stimulation electrodes. This system is at an early stage of testing.

In the transchoroidal prosthesis, about fifty stimulating electrodes are placed in the suprachoroidal space. The implantation surgery is simple compared to the other methods. Patients perceive phosphenes and can discriminate simple images.

Non-retinal prostheses most commonly employ electrical stimulation of the visual cortex. Dobelle restored rudimentary vision in blind patients using an array of 64 surface electrodes placed over the visual cortex [24]. It seems reasonable to expect that better results will be obtained with intracortical microelectrode arrays.

Bidirectional BMIs

Sensorized, or bidirectional, BMIs simultaneously decode brain activity and deliver artificial sensory feedback to the brain. Figure 3 depicts a pioneering demonstration of a bidirectional BMI by researchers from the Nicolelis laboratory [86,87]. Monkeys were implanted with microelectrode arrays in the motor and somatosensory cortices. The motor cortex implants served to extract motor intentions. Artificial tactile sensations were produced by intracortical microstimulation (ICMS) delivered through the somatosensory cortex implants.

Figure 3. Schematic of the bidirectional BMI. In the motor loop, cursor position is extracted from motor cortical neurons. In the sensory loop, artificial tactile feedback is delivered to the somatosensory cortex as spatiotemporal patterns of ICMS. Adapted from O'Doherty et al. [86].

The authors called this bidirectional interface brain-machine-brain interface (BMBI). Aided by the BMBI, monkeys actively explored virtual objects with a cursor or a realistic image of the monkey arm (avatar arm). The virtual objects looked identical, but were associated with different artificial textures signalled by ICMS every time the avatar hand touched an object.
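A minimal sketch of the sensory loop of such a system: when the avatar hand contacts a virtual object, a texture-specific packet structure of ICMS pulses is generated. The pulse rates and packet parameters below are invented placeholders, not the published stimulation parameters.

```python
import numpy as np

# Hypothetical texture codes: pulses per packet and packet rate (Hz)
TEXTURE_PATTERNS = {
    "none":      (0, 0.0),
    "texture_a": (10, 5.0),   # dense packets
    "texture_b": (5, 10.0),   # sparser pulses, faster packets
}

def icms_pulse_times(texture, duration_s, pulse_rate_hz=300.0):
    """Return pulse onset times (s) for one contact event with a virtual object."""
    pulses_per_packet, packet_rate = TEXTURE_PATTERNS[texture]
    if pulses_per_packet == 0:
        return np.array([])
    packet_starts = np.arange(0, duration_s, 1.0 / packet_rate)
    within_packet = np.arange(pulses_per_packet) / pulse_rate_hz
    return (packet_starts[:, None] + within_packet[None, :]).ravel()

print(icms_pulse_times("texture_a", duration_s=0.5)[:5])
```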

Conclusion

In recent years, we have witnessed an explosive development of BMIs. The list of achievements in this research field is steadily expanding. BMI methods have been applied to arm movements and bipedal locomotion, somatosensory sensation, hearing and vision.

Studies suggest that BMI-controlled prosthetic limbs may become incorporated into the brain's representation of the body. Although many BMI projects are still at the stage of laboratory research, scientists are confident that BMIs will enter the clinical world and revolutionize the treatment of neurological conditions that are otherwise incurable.

References

[1] Lebedev M.A., Nicolelis M.A., Brain-machine interfaces: past, present and future, Trends Neurosci., 2006, 29, 536-546

[2] Nicolelis M.A., Lebedev M.A., Principles of neural ensemble physiology underlying the operation of brain-machine interfaces, Nat. Rev. Neurosci., 2009, 10, 530-540

[3] Schwartz A.B., Cui X.T., Weber D.J., Moran D.W., Brain-controlled interfaces: movement restoration with neural prosthetics, Neuron, 2006, 52, 205-220

[4] McFarland D.J., Krusienski D.J., Wolpaw J.R., Brain-computer interface signal processing at the Wadsworth Center: mu and sensorimotor beta rhythms, Prog. Brain Res., 2006, 159, 411-419

[5] Hatsopoulos N.G., Donoghue J.P., The science of neural interface systems, Annu. Rev. Neurosci., 2009, 32, 249-266

[6] Carmena J.M., Lebedev M.A., Crist R.E., O'Doherty J.E., Santucci D.M., Dimitrov D.F., et al., Learning to control a brain-machine interface for reaching and grasping by primates, PLoS Biol., 2003, 1, E42

[7] Tangermann M., Krauledat M., Grzeska K., Sagebaum M., Blankertz B., Vidaurre C., et al., Playing pinball with non-invasive BCI, Adv. Neural Inf. Process. Syst., 2009, 21, 1641-1648

[8] Lin C.T., Chang C.J., Lin B.S., Hung S.H., Chao C.F., Wang I.J., A real-time wireless brain-computer interface system for drowsiness detection, IEEE Trans. Biomed. Circuits Syst., 2010, 4, 214-222

[9] Lilly J.C., Distribution of 'motor' functions in the cerebral cortex in the conscious, intact monkey, Science, 1956, 124, 937

[10] Evarts E.V., Motor cortex reflexes associated with learned movement, Science, 1973, 179, 501-503

[11] O'Doherty J.E., Lebedev M.A., Ifft P.J., Zhuang K.Z., Shokur S., Bleuler H., et al., Active tactile exploration using a brain-machine-brain interface, Nature, 2011, 479, 228-231

[12] Shannon R.V., Advances in auditory prostheses, Curr. Opin. Neurol., 2012, 25, 61-66

[13] Wilson B.S., Dorman M.F., Cochlear implants: a remarkable past and a brilliant future, Hear. Res., 2008, 242, 3-21

[14] Farah M.J., Emerging ethical issues in neuroscience, Nat. Neurosci., 2002, 5, 1123-1129

[15] Vlek R.J., Steines D., Szibbo D., Kubler A., Schneider M.J., Haselager P., et al., Ethical issues in brain-computer interface research, development, and dissemination, J. Neurol. Phys. Ther., 2012, 36, 94-99

[16] Andersen R.A., Hwang E.J., Mulliken G.H., Cognitive neural prosthetics. Annu. Rev. Psychol., 2010, 61, 169-190

[17] Berger T.W., Ahuja A., Courellis S.H., Deadwyler S.A., Erinjippurath G., Gerhardt G.A., et al., IEEE Eng. Med. Biol. Mag., 2005, 24, 30-44

[18] Dennett D.C., Consciousness explained, Back Bay Books, New York, NY, USA, 1992 [This book contains a description of the pioneering demonstration of a brain-machine interface by Grey Walter]

[19] Frank K., Some approaches to the technical problem of chronic excitation of peripheral nerve, Ann. Otol. Rhinol. Laryngol., 1968, 77, 761-771

[20] Humphrey D.R., Schmidt E.M., Thompson W.D., Predicting measures of motor performance from multiple cortical spike trains, Science, 1970, 170, 758-762

[21] Schmidt E.M., Single neuron recording from motor cortex as a possible source of signals for control of external devices, Ann. Biomed. Eng., 1980, 8, 339-349

[22] Fetz E.E., Operant conditioning of cortical unit activity, Science, 1969, 163, 955-958

[23] Brindley G.S., Lewin W.S., The sensations produced by electrical stimulation of the visual cortex, J. Physiol., 1968, 196, 479-493

[24] Dobelle W.H., Mladejovsky M.G., Girvin J.P., Artificial vision for the blind: electrical stimulation of visual cortex offers hope for a functional prosthesis, Science, 1974, 183, 440-444

[25] Chapin J.K., Moxon K.A., Markowitz R.S., Nicolelis M.A., Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex, Nat. Neurosci., 1999, 2, 664-670

[26] Wessberg J., Stambaugh C.R., Kralik J.D., Beck P.D., Laubach M., Chapin J.K., et al., Real-time prediction of hand trajectory by ensembles of cortical neurons in primates, Nature, 2000, 408, 361-365

[27] Lebedev M.A., Carmena J.M., O'Doherty J.E., Zacksenhouse M., Henriquez C.S., Principe J.C., et al., Cortical ensemble adaptation to represent velocity of an artificial actuator controlled by a brain-machine interface, J. Neurosci., 2005, 25, 4681-4693

[28] Fitzsimmons N.A., Lebedev M.A., Peikon I.D., Nicolelis M.A., Extracting kinematic parameters for monkey bipedal walking from cortical neuronal ensemble activity, Front. Integr. Neurosci., 2009, 3, 3

[29] Ifft P.J., Shokur S., Li Z., Lebedev M.A., Nicolelis M.A., A brain-machine interface enables bimanual arm movements in monkeys, Sci. Transl. Med., 2013, 5, 210ra154

[30] Kennedy P.R., Bakay R.A., Restoration of neural output from a paralyzed patient by a direct brain connection, Neuroreport, 1998, 9, 1707-1711

[31] Hochberg L.R., Serruya M.D., Friehs G.M., Mukand J.A., Saleh M., Caplan A.H., et al., Neuronal ensemble control of prosthetic devices by a human with tetraplegia, Nature, 2006, 442, 164-171

[32] Hochberg L.R., Bacher D., Jarosiewicz B., Masse N.Y., Simeral J.D., Vogel J., et al., Reach and grasp by people with tetraplegia using a neurally controlled robotic arm, Nature, 2012, 485, 372-375

[33] Taylor D.M., Tillery S.I., Schwartz A.B., Direct cortical control of 3D neuroprosthetic devices, Science, 2002, 296, 1829-1832

[34] Velliste M., Perel S., Spalding M.C., Whitford A.S., Schwartz A.B., Cortical control of a prosthetic arm for self-feeding, Nature, 2008, 453, 1098-1101

[35] Collinger J.L., Wodlinger B., Downey J.E., Wang W., Tyler-Kabara E.C., Weber D.J., et al., High-performance neuroprosthetic control by an individual with tetraplegia, Lancet, 2013, 381, 557-564

[36] Mountcastle V.B., The sensory hand: neural mechanisms of somatic sensation, Harvard University Press, Cambridge, MA, USA, 2005

[37] Hubel D.H., Wiesel T.N., Brain and visual perception: the story of a 25-year collaboration, Oxford University Press, Oxford, UK, 2005

[38] Wise S.P., The primate premotor cortex: past, present, and preparatory, Annu. Rev. Neurosci., 1985, 8, 1-19

[39] Kalaska J.F., Scott S.H., Cisek P., Sergio L.E., Cortical control of reaching movements, Curr. Opin. Neurobiol., 1997, 7, 849-859

[40] Georgopoulos A.P., Kalaska J.F., Caminiti R., Massey J.T., On the relations between the direction of two-dimensional arm movements and cell discharge in primate motor cortex, J. Neurosci., 1982, 2, 1527-1537

[41] Georgopoulos A.P., Lurito J.T., Petrides M., Schwartz A.B., Massey J.T., Mental rotation of the neuronal population vector, Science, 1989, 243, 234-236

[42] Moritz C.T., Perlmutter S.I., Fetz E.E., Direct control of paralysed muscles by cortical neurons, Nature, 2008, 456, 639-642

[43] Quiroga R.Q., Reddy L., Kreiman G., Koch C., Fried I., Invariant visual representation by single neurons in the human brain, Nature, 2005, 435, 1102-1107

[44] Nicolelis M.A., Beyond boundaries: the new neuroscience of connecting brains with machines - and how it will change our lives, Times Books, New York, NY, USA, 2011

[45] Haykin S., Adaptive filter theory (4th Ed.), Prentice Hall, Upper Saddle River, New Jersey, 2001

[46] Li Z., O'Doherty J.E., Hanson T.L., Lebedev M.A., Henriquez C.S., Nicolelis M.A., Unscented Kalman filter for brain-machine interfaces, PLoS One, 2009, 4, e6243

[47] Sussillo D., Nuyujukian P., Fan J.M., Kao J.C., Stavisky S.D., Ryu S., et al., A recurrent neural network for closed-loop intracortical brain-machine interface decoders, J. Neural Eng., 2012, 9, 026027

[48] Birbaumer N., Ghanayim N., Hinterberger T., Iversen I., Kotchoubey B., Kubler A., et al., A spelling device for the paralysed, Nature, 1999, 398, 297-298

[49] Birbaumer N., Murguialday A.R., Cohen L., Brain-computer interface in paralysis, Curr. Opin. Neurol., 2008, 21, 634-638

[50] Sherrington C.S., The integrative action of the nervous system, Charles Scribner's Sons, New York, NY, USA, 1906

[51] Guertin P.A., The mammalian central pattern generator for locomotion, Brain Res. Rev., 2009, 62, 45-56

[52] Cordo P.J., Gurfinkel V.S., Motor coordination can be fully understood only by studying complex movements, Prog. Brain Res., 2004, 143, 29-38

[53] Head H., Holmes G., Sensory disturbances from cerebral lesions, Brain, 1911, 34, 102-254

[54] Kawato M., Internal models for motor control and trajectory planning, Curr. Opin. Neurobiol., 1999, 9, 718-727

[55] Feldman A.G., Ostry D.J., Levin M.F., Gribble P.L., Mitnitski A.B., Recent tests of the equilibrium-point hypothesis (lambda model), Motor Control, 1998, 2, 189-205

[56] Pfurtscheller G., Müller G.R., Pfurtscheller J., Gerner H.J., Rupp R., 'Thought'--control of functional electrical stimulation to restore hand grasp in a patient with tetraplegia, Neurosci. Lett., 2003, 351, 33-36

[57] Ethier C., Oby E.R., Bauman M.J., Miller L.E., Restoration of grasp following paralysis through brain-controlled stimulation of muscles, Nature, 2012, 485, 368-371

[58] Pohlmeyer E.A., Oby E.R., Perreault E.J., Solla S.A., Kilgore K.L., Kirsch R.F., et al., Toward the restoration of hand use to a paralyzed monkey: brain-controlled functional electrical stimulation of forearm muscles, PLoS One, 2009, 4, e5924

[59] Cheron G., Duvinage M., De Saedeleer C., Castermans T., Bengoetxea A., Petieau M., et al., From spinal central pattern generators to cortical network: integrated BCI for walking rehabilitation, Neural Plast., 2012, 375148

[60] Presacco A., Forrester L.W., Contreras-Vidal J.L., Decoding intra-limb and inter-limb kinematics during treadmill walking from scalp electroencephalographic (EEG) signals, IEEE Trans. Neural Syst. Rehabil. Eng., 2012, 20, 212-219

[61] Courtine G., Gerasimenko Y., van den Brand R., Yew A., Musienko P., Zhong H., et al., Transformation of nonfunctional spinal circuits into functional states after the loss of brain input, Nat. Neurosci., 2009, 12, 1333-1342

[62] Iriki A., Tanaka M., Iwamura Y., Coding of modified body schema during tool use by macaque postcentral neurones, Neuroreport, 1996, 7, 2325-2330

[63] Zacksenhouse M., Lebedev M.A., Carmena J.M., O'Doherty J.E., Henriquez C., Nicolelis M.A., Cortical modulations increase in early sessions with brain-machine interface, PLoS One, 2007, 2, e619

[64] Chase S.M., Kass R.E., Schwartz A.B., Behavioral and neural correlates of visuomotor adaptation observed through a brain-computer interface in primary motor cortex, J. Neurophysiol., 2012, 108, 624-644

[65] Galán F., Nuttin M., Lew E., Ferrez P.W., Vanacker G., Philips J., et al., A brain-actuated wheelchair: asynchronous and non-invasive brain-computer interfaces for continuous control of robots, Clin. Neurophysiol., 2008, 119, 2159-2169

[66] Muller-Putz G.R., Pfurtscheller G., Control of an electrical prosthesis with an SSVEP-based BCI, IEEE Trans. Biomed. Eng., 2008, 55, 361-364

[67] Nicolas-Alonso L.F., Gomez-Gil J., Brain computer interfaces, a review, Sensors (Basel), 2012, 12, 1211-1279

[68] Sellers E.W., Vaughan T.M., Wolpaw J.R., A brain-computer interface for long-term independent home use, Amyotroph. Lateral Scler., 2010, 11, 449-455

[69] Wolpaw J.R., McFarland D.J., Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans, Proc. Natl. Acad. Sci. USA, 2004, 101, 17849-17854

[70] Vialatte F.B., Maurice M., Dauwels J., Cichocki A., Steady-state visually evoked potentials: focus on essential paradigms and future perspectives, Prog. Neurobiol., 2010, 90, 418-438

[71] Farwell L.A., Donchin E., Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials, Electroencephalogr. Clin. Neurophysiol., 1988, 70, 510-523

[72] Millan J.R., Renkens F., Mourino J., Gerstner W., Noninvasive brain-actuated control of a mobile robot by human EEG, IEEE Trans. Biomed. Eng., 2004, 51, 1026-1033

[73] Tavella M., Leeb R., Rupp R., Millan J. del R., Towards natural non-invasive hand neuroprostheses for daily living, Conf. Proc. IEEE Eng. Med. Biol. Soc., 2010, 126-129

[74] Fatourechi M., Bashashati A., Ward R.K., Birch G.E., EMG and EOG artifacts in brain computer interface systems: a survey, Clin. Neurophysiol., 2007, 118, 480-494

[75] Mellinger J., Schalk G., Braun C., Preissl H., Rosenstiel W., Birbaumer N., et al., An MEG-based brain-computer interface (BCI), Neuroimage, 2007, 36, 581-593

[76] Sitaram R., Caria A., Birbaumer N., Hemodynamic brain-computer interfaces for communication and rehabilitation, Neural Netw., 2009, 22, 1320-1328

[77] Schott G.D., Penfield's homunculus: a note on cerebral cartography, J. Neurol. Neurosurg. Psychiatry, 1993, 56, 329-333

[78] Barton J.J., Disorder of higher visual function, Curr. Opin. Neurol., 2011, 24, 1-5

[79] Romo R., Hernández A., Zainos A., Brody C.D., Lemus L., Sensing without touching: psychophysical performance based on cortical microstimulation, Neuron, 2000, 26, 273-278

[80] Fitzsimmons N.A., Drake W., Hanson T.L., Lebedev M.A., Nicolelis M.A., Primate reaching cued by multichannel spatiotemporal cortical microstimulation, J. Neurosci., 2007, 27, 5593-5602

[81] Zhang F., Aravanis A.M., Adamantidis A., de Lecea L., Deisseroth K., Circuit-breakers: optical technologies for probing neural signals and systems, Nat. Rev. Neurosci., 2007, 8, 577-581

[82] Jones L.A., Tactile communication systems optimizing the display of information, Prog. Brain Res., 2011, 192, 113-128

[83] Sampaio E., Maris S., Bach-y-Rita P., Brain plasticity: 'visual' acuity of blind persons via the tongue, Brain Res., 2001, 908, 204-207

[84] Bach-y-Rita P., Kercel S., Sensory substitution and the human-machine interface, Trends Cogn. Sci., 2003, 7, 541-546

[85] Fernandes R.A., Diniz B., Ribeiro R., Humayun M., Artificial vision through neuronal stimulation, Neurosci. Lett., 2012, 519, 122-128

[86] O'Doherty J.E., Lebedev M.A., Hanson T.L., Fitzsimmons N.A., Nicolelis M.A., A brain-machine interface instructed by direct intracortical microstimulation, Front. Integr. Neurosci., 2009, 3, 20

[87] O'Doherty J.E., Lebedev M.A., Li Z., Nicolelis M.A., Virtual active touch using randomly patterned intracortical microstimulation, IEEE Trans. Neural Syst. Rehabil. Eng., 2012, 20, 85-93