DARPA, the Defense Advanced Research Projects Agency, has awarded funding to multiple institutions to work on the Hand Proprioception and Touch Interfaces (HAPTIX) program. The aim is to create a prosthetic hand that can be moved by thought and can provide a variety of sensations to the brain controlling it.
There are a number of breakthrough technologies involved in achieving this goal.
You need motors that are light enough to be used in a prosthetic hand, and you need quite a number of them, since every degree of movement should be supported by a "motor". And our hand can move in many, many ways! To be precise, our hand has 40 muscles (what we might call biological motors) providing 23 degrees of movement (or degrees of freedom, as they are called). The best robotic hand (not prosthetic) created so far, to my knowledge, has 19 motors with 19 degrees of freedom. A prosthetic hand needs to be lighter than a robotic hand and should consume a limited amount of power, so that it remains operational for several hours without being tethered to the mains. We are still quite far from having a prosthetic hand satisfying these requirements (23 degrees of freedom, light, and low power).
Additionally, you need sensors and "intelligence" that can tell the motors how much pressure to apply; they need to understand the context and what you are planning to do. Picking up an egg is different, in terms of pressure, from picking up a glass (obviously), but picking up an empty glass is also different from picking up a full one. If you want to caress your cat, you want a kind of pressure that is different from the one you use when straightening the bed linen and duvet. All of this comes naturally to you, but it requires a lot of knowledge and computation. We are still far from a general solution to this challenge.
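To give a flavour of what "knowing how hard to squeeze" means in practice, here is one classic strategy from robotic grasping research (not necessarily the one the HAPTIX teams use): slip-triggered grip control. The hand squeezes only as hard as needed to hold the object, and tightens whenever tactile sensors detect the object starting to slip. A minimal sketch, with purely illustrative force values:

```python
def adjust_grip(force, slipping, force_min=0.5, force_max=12.0, step=0.4):
    """Slip-triggered grip control: tighten when the object slips,
    otherwise relax slowly toward the minimum holding force.
    All values are illustrative, in newtons."""
    if slipping:
        force += step          # object is slipping: squeeze harder
    else:
        force -= 0.1 * step    # no slip: gently relax to save effort
    return max(force_min, min(force_max, force))

# Simulated episode: the object slips for the first few control cycles
# (think of lifting a full glass), then the grip stabilises.
force = 0.5
for cycle in range(10):
    force = adjust_grip(force, slipping=(cycle < 4))
```

The appeal of this scheme is that the controller never needs to know in advance whether the glass is empty or full: the slip signal tells it, cycle by cycle, whether the current force is sufficient.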
Then you want to feel what your prosthetic hand is doing, hence you need a mechanism for force feedback, as well as the variety of subtle touch sensations (including temperature, wetness, softness, …) that a real hand can provide. And we are very, very far from this today, although several researchers have been working on it and have even succeeded in providing some sort of force feedback.
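One common approach in prosthetics research is sensory substitution: a pressure reading at the artificial fingertip is mapped to a stimulus the wearer can feel, such as the amplitude of a small vibration motor on the residual limb. A minimal sketch, assuming a logarithmic mapping (perceived intensity tends to follow the logarithm of the physical stimulus) and made-up pressure thresholds:

```python
import math

def pressure_to_vibration(pressure_kpa, p_min=1.0, p_max=80.0):
    """Map a fingertip pressure reading to a vibrotactile amplitude
    in [0, 1] on a logarithmic scale. The thresholds p_min and p_max
    are illustrative, not taken from any specific device."""
    if pressure_kpa <= p_min:
        return 0.0                      # below perception threshold
    clamped = min(pressure_kpa, p_max)  # saturate at maximum stimulus
    return math.log(clamped / p_min) / math.log(p_max / p_min)
```

This covers only a single crude channel of feedback; conveying temperature, wetness, or softness would each need its own sensors and its own mapping, which is part of why the full problem remains so hard.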
Finally, and I left it for last because it seems to me the most daunting, you want to control your prosthetic hand by thought. We have seen amazing progress in this area, but that progress basically points out the difficulties and the gap that still exists in controlling a prosthetic hand (or limb) with our mind. Yes, we have seen amazing demonstrations of a person pouring water from a bottle into a glass (see clip), but that person had to train significantly and, most importantly, has to focus on the task. This is not something we do when pouring water into a glass (yes, sometimes that means we spill the water; "why don't you think about what you are doing" is a standard phrase you'd hear my wife telling me…); it just comes naturally to us. To pick up your thoughts, scientists have to connect the activity going on in your brain's motor area with the computer controlling the prosthetic hand. And what happens is that, through training, you manage to tell the computer what you are after. We haven't reached the point where a computer can "read" our thoughts and understand our intentions. More than that: how many times have you thought about, imagined, doing something but never actually did it because it would not have been "appropriate" (don't let me go into details…)? A computer reading our mind needs to be able to distinguish a wish that should remain a wish from one that has to be turned into an action. And we are very, very far from this.
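The "training" mentioned above usually means calibrating a decoder: the firing rates of motor-cortex neurons are recorded while hand movements (real or imagined) are known, and a mapping from rates to intended movement is fitted, often with something as simple as linear least squares. The sketch below illustrates the idea on entirely synthetic data; it is a toy of the standard linear-decoder technique, not any specific HAPTIX system:

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_samples = 20, 500

# Synthetic "ground truth": each neuron is linearly tuned to the
# hand's 2D velocity (vx, vy), plus recording noise.
true_weights = rng.normal(size=(n_neurons, 2))
velocity = rng.normal(size=(n_samples, 2))          # known training movements
rates = velocity @ true_weights.T + 0.1 * rng.normal(size=(n_samples, n_neurons))

# Calibration: least-squares fit of a decoding matrix rates -> velocity.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decoding: a new burst of neural activity is turned into an intended velocity.
intended = np.array([0.5, -0.2])
new_rates = intended @ true_weights.T
decoded = new_rates @ decoder
```

Note what this toy makes obvious: the computer never "understands" an intention; it only maps patterns of activity it was trained on to movements it was told about, which is exactly why distinguishing an idle wish from a genuine command is so far out of reach.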
Hence the great interest in HAPTIX, a program that stimulates breakthrough research in many different fields.