This project contains two human-computer interfaces. The first is an interface that allows blind people to perceive visual sensations through the tongue. Images from a webcam are processed by artificial-intelligence software and delivered to a sensory electrode matrix placed on the tongue; the current matrix has a resolution of about 8x8 pixels. Because sight and taste are processed in overlapping areas of the cortex, a blind person can adapt very quickly to the image sent to the tongue as a matrix of electrical pulses. The taste buds form the second-densest sensory matrix after the eyes (in terms of resolution). The interface is based on the same principle as the Braille code, but the information is received through the tongue and is proportional to the live webcam image, so the person can receive more information.

The second interface detects the motion intent of a person with disabilities. It is based on acquiring neural signals from the brain with a handmade electroencephalograph and processing them with artificial-intelligence software on a computer. The project contains both hardware and software. It aims to show that human-computer interfaces can be built to support people with disabilities.
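The webcam-to-tongue pipeline described above can be sketched in a few lines. This is a minimal illustration, not the project's actual software: it assumes a grayscale frame is already available as a NumPy array, block-averages it down to the 8x8 resolution mentioned in the text, and normalizes the result to pulse amplitudes for the electrode matrix. The function name and the 0..1 amplitude scale are illustrative assumptions.

```python
import numpy as np

GRID = 8  # 8x8 electrode matrix resolution mentioned in the text

def frame_to_electrode_matrix(frame: np.ndarray, grid: int = GRID) -> np.ndarray:
    """Downsample a grayscale webcam frame (2-D array, 0..255) to a
    grid x grid matrix of pulse amplitudes in 0..1 (illustrative scale)."""
    h, w = frame.shape
    h, w = h - h % grid, w - w % grid            # crop so sides divide evenly
    blocks = frame[:h, :w].reshape(grid, h // grid, grid, w // grid)
    return blocks.mean(axis=(1, 3)) / 255.0      # average each block, normalize

# Example: a synthetic 480x640 frame, dark on the left, bright on the right.
frame = np.zeros((480, 640), dtype=np.uint8)
frame[:, 320:] = 255
matrix = frame_to_electrode_matrix(frame)
print(matrix.shape)  # (8, 8)
```

Each cell of the resulting matrix would drive one electrode, so bright regions of the scene map to stronger stimulation on the corresponding part of the tongue.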
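The second interface's processing stage can also be sketched. The text only says that EEG signals are classified by artificial-intelligence software, so everything concrete here is an assumption: the 250 Hz sampling rate, the choice of mu (8-12 Hz) and beta (13-30 Hz) band power as features, and a nearest-centroid classifier standing in for the AI component.

```python
import numpy as np

FS = 250  # assumed sampling rate (Hz) of the handmade encephalograph

def bandpower(signal: np.ndarray, lo: float, hi: float, fs: int = FS) -> float:
    """Power of `signal` in the [lo, hi] Hz band, computed via the FFT."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return float(spectrum[band].sum())

def features(epoch: np.ndarray) -> np.ndarray:
    # mu and beta band power: common motor-intent features (assumption)
    return np.array([bandpower(epoch, 8, 12), bandpower(epoch, 13, 30)])

class NearestCentroid:
    """Minimal stand-in for the AI stage: an epoch is assigned to the
    class whose mean feature vector is closest."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return self.classes_[d.argmin(axis=1)]

# Train on synthetic 1-second epochs: "move" dominated by 10 Hz activity,
# "rest" dominated by 20 Hz activity.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
def epoch(f):
    return np.sin(2 * np.pi * f * t) + 0.1 * rng.standard_normal(FS)

X = np.array([features(epoch(10)) for _ in range(10)] +
             [features(epoch(20)) for _ in range(10)])
y = np.array(["move"] * 10 + ["rest"] * 10)
clf = NearestCentroid().fit(X, y)
pred = clf.predict(np.array([features(epoch(10))]))
print(pred)
```

A real system would replace the synthetic epochs with amplified, filtered electrode readings, and could swap the nearest-centroid stand-in for whatever classifier the project's AI software actually uses.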