On November 12, the Jerusalem Post broke the news (later reposted by the Digital Journal) that the team at the Advanced Virtuality Lab (AVL) is working on several brain-computer interfaces. The VERE project (Virtual Embodiment and Robotic Re-embodiment) is funded by the European Union. The team recently reported the successful use of a brain scanner to control a computer application interactively in real time. Friedman, the team leader, commented on the potential applications of the recent achievement: “You could control an avatar just by thinking about it and activating the correct areas in the brain.”
BCI research has already produced some (relatively) mainstream applications, as you may remember from the demonstrations on June 5 and 6, 2010 at a global aerospace convention in Ontario (ref. https://sabinereljic.wordpress.com/2011/05/31/technology-that-can-help-physically-challenged-humans-communicate/).
The Israeli team is also interested in telepresence and what they call intelligent transformation, concerning avatar behaviors and cultural variables. The BEAMING project (Being in Augmented Multi-modal Naturally-networked Gatherings) aims to develop new approaches to producing lifelike interactions using “mediated technologies such as surround video conference, virtual and augmented reality, virtual sense of touch (haptics) and spatialized audio and robotics.”
As seen in the Second Life recording, a BEAMING proxy is a bot programmed to answer questions and to reproduce the characteristic mannerisms and body language of the human it duplicates.