Introduction: Brain-Controlled Wheelchair Robot

Our Independence Project submission expands our previous Brain-Controlled Wheelchair project (1st-prize winner of the Humana Health Challenge, built in coordination with jerkey) into a new design combining EEG-based P300 detection, which lets the user automatically maneuver a wheelchair around a home or office, with augmented reality for interacting with everyday objects: items in a refrigerator, a dishwasher to be loaded, mail to be fetched. This is a significant improvement over requiring the user to maintain constant concentration on a single thought while remaining aware of and reactive to their surroundings.
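For readers unfamiliar with the technique: a P300 is an involuntary voltage deflection that appears roughly 300 ms after a stimulus the user is attending to, so flashing menu items one at a time and watching for that deflection reveals which item the user intends. Below is a minimal sketch of the scoring step, assuming a single-channel 250 Hz EEG stream and hypothetical epoch arrays; our actual signal chain will differ.

```python
import numpy as np

# Assumed parameters: a single-channel 250 Hz stream and a 600 ms
# analysis window after each stimulus flash. Real values depend on
# the headset and stimulus timing we end up using.
FS = 250                      # samples per second
EPOCH = int(0.6 * FS)         # samples per 600 ms epoch

def p300_score(epochs):
    """Average the epochs time-locked to one menu item's flashes and
    return the mean amplitude 250-450 ms post-stimulus, where the
    P300 deflection is expected. `epochs` has shape (n_flashes, EPOCH)."""
    erp = epochs.mean(axis=0)             # averaging suppresses background EEG
    lo, hi = int(0.25 * FS), int(0.45 * FS)
    return erp[lo:hi].mean()

def select_item(epochs_by_item):
    """Pick the menu item whose flashes evoked the strongest P300."""
    scores = {item: p300_score(e) for item, e in epochs_by_item.items()}
    return max(scores, key=scores.get)
```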

In a 2004 survey of 681 individuals with spinal cord injuries, when asked in what ways their quality of life could be most improved, increased mobility did not top the list; it ranked fifth among quadriplegics and fourth among paraplegics. Instead, the most desired improvements centered on increasing personal independence and restoring control of bodily functions. By incorporating a robot arm and telepresence control, we work directly toward this goal.

For example, if someone reclining in bed wants a drink from the kitchen, they may need to enlist a caregiver to transfer them into the wheelchair, at which point the caregiver may as well have retrieved the drink in the first place. Beyond that, if navigating between rooms and around obstacles demands constant, careful concentration, the solution is not very convenient to use. Nor should users be required to retrieve items in person at all.

Our updated design incorporates current computer vision and robotics engineering to provide automated pathfinding and interaction with simple objects using a custom arm. Instead of navigating manually, the user is presented with a floorplan or map of their current location. A single "kitchen" selection from a P300-based menu is enough; the robot plans the best route there on its own. The user can be sitting in the wheelchair at the time or watching remotely through a live video feed.
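We have not committed to a particular planner, but grid-based A* over an occupancy-grid floorplan is one natural fit. The sketch below is illustrative only; the grid representation and the ROOMS table mapping menu labels to map cells are assumptions.

```python
import heapq

def astar(grid, start, goal):
    """Plan a path on an occupancy-grid floorplan (0 = free, 1 = wall)
    using A* with a Manhattan-distance heuristic."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]   # (f, g, cell, path)
    seen = set()
    while frontier:
        _, cost, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                heapq.heappush(frontier,
                               (cost + 1 + h((nr, nc)), cost + 1,
                                (nr, nc), path + [(nr, nc)]))
    return None  # no route found

# A single P300 selection ("kitchen") then maps to a goal cell, e.g.:
# route = astar(floorplan, current_cell, ROOMS["kitchen"])
```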

Upon reaching the kitchen, the system produces an augmented reality overlay in which recognized objects are highlighted and can be selected (again via P300). The user can then issue a second command to retrieve the drink and a final one to return to the original room of the house.
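As one illustration of the overlay step, the sketch below draws labeled highlight boxes over a live video frame with OpenCV, brightening whichever box is currently flashing for P300 elicitation. The detections themselves are assumed to come from an off-the-shelf recognizer, and the function names are ours, not a finished API.

```python
import cv2

def draw_overlay(frame, detections, flashing=None):
    """Highlight recognized objects on the live feed. Each detection
    is (label, (x, y, w, h)); the item currently flashing for P300
    elicitation is drawn in a brighter color."""
    for label, (x, y, w, h) in detections:
        color = (0, 255, 255) if label == flashing else (0, 180, 0)
        cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)
        cv2.putText(frame, label, (x, y - 6),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, color, 1)
    return frame

# Flash each highlight in turn; the select_item() scorer above then
# tells us which object (e.g. the drink) the user was attending to.
```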

The entire sequence can thus be reduced from a constant strain to a handful of simple selections, which makes for a far more pleasant and practical user experience.
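Put together, the whole errand reduces to a short fixed sequence in which only two steps ask anything of the user. A hypothetical top-level controller, assuming a robot interface we have yet to build:

```python
# Only the two *select* steps require the user's attention;
# everything else runs autonomously.
STEPS = [
    "select_destination",  # one P300 choice from the floorplan menu
    "navigate",            # autonomous pathfinding to the room
    "select_object",       # one P300 choice from the AR overlay
    "retrieve",            # robot arm grasps the chosen item
    "return_home",         # autonomous drive back to the start
]

def run_errand(robot):
    """Execute each step in order on the assumed robot interface."""
    for step in STEPS:
        getattr(robot, step)()
```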

This design also focuses on a robot that can be built using common tools and equipment found in hackerspaces such as Noisebridge (operated as a 501(c)(3) non-profit organization), where our group normally meets. If we win, we will order and donate a new laser cutter and 3D printer for use with this and other projects, as well as purchase the materials we need for our first prototype, such as servo motors and a laser rangefinder.

Our past experience in this space includes the DORA Opensource Robotic Assistant, Brain-Controlled Helicopters, the Brainstorms education initiative (example one and two), and even a Guinness World Record for Brain-Computer Interface.