Demos

Visual motion onset BCIs (projects by Jair PEREIRA Junior & Caio TEIXEIRA)

An efferent (away from the display center) visual motion onset BCI paradigm was developed by Jair PEREIRA Junior, allowing for single-trial operation of an eight-command application (here a simple 1-8 digit speller).

An afferent (toward the display center) visual motion onset BCI paradigm was developed by Caio TEIXEIRA, allowing for single-trial operation of a six-command application (here a simple 1-6 digit speller).

Sixteen-command and 40 Hz carrier frequency cVEP BCI (project by Daiki AMINAKA)

This video presents our latest development: a 16-command, 40 Hz carrier frequency (getting closer to the magic 50 Hz) cVEP BCI by Daiki AMINAKA.
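
A minimal sketch of how cVEP stimulus codes are typically constructed: a pseudorandom m-sequence is circularly shifted to give each command its own template, and the code bits gate the flicker at the carrier rate. The LFSR taps, sequence length, and shift spacing below are illustrative assumptions, not the parameters of Daiki's system.

    import numpy as np

    def m_sequence(taps=(6, 5), length=63):
        # Simple LFSR for the primitive polynomial x^6 + x^5 + 1 (illustrative).
        register = [1] * max(taps)
        seq = []
        for _ in range(length):
            seq.append(register[-1])                   # output the last bit
            feedback = register[taps[0] - 1] ^ register[taps[1] - 1]
            register = [feedback] + register[:-1]      # shift the register
        return np.array(seq)

    # Each command gets a circular shift of the same 63-bit code; the decoder
    # correlates the measured EEG with every shifted template and picks the
    # best match.
    base_code = m_sequence()
    n_commands = 16
    shift = len(base_code) // n_commands               # ~3 bits apart (assumed)
    codes = [np.roll(base_code, i * shift) for i in range(n_commands)]

    # With a 40 Hz carrier each code bit gates one flicker cycle, so a full
    # 63-bit presentation takes roughly 63 / 40 ~ 1.6 s.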

Chinese PinYin BCI speller (project by Zhou JUNJIE)

This video shows the result of an undergraduate project by Zhou JUNJIE. The Chinese PinYin BCI speller was realized as a two-step spatial auditory BCI utilizing P300 responses.
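
As background, a two-step P300 selection can be sketched as follows: average the epochs recorded for each spatial sound direction, score the P300 window, pick a symbol group in step one, and pick a symbol within that group in step two. The groups, sampling rate, and scoring window below are illustrative assumptions, not Zhou's actual design.

    import numpy as np

    # Hypothetical symbol groups for the two-step selection.
    groups = [["b", "p", "m"], ["d", "t", "n"], ["g", "k", "h"]]

    def p300_scores(epochs, fs=256):
        # epochs: (n_classes, n_trials, n_samples), time-locked to each
        # spatial sound direction; fs is an assumed sampling rate.
        window = slice(int(0.25 * fs), int(0.50 * fs))  # ~250-500 ms window
        erp = epochs.mean(axis=1)            # average trials into an ERP
        return erp[:, window].mean(axis=1)   # larger P300 -> larger score

    def two_step_select(step1_epochs, step2_epochs):
        group = groups[int(np.argmax(p300_scores(step1_epochs)))]
        return group[int(np.argmax(p300_scores(step2_epochs)))]

    # Example with random data standing in for real EEG epochs:
    rng = np.random.default_rng(0)
    print(two_step_select(rng.standard_normal((3, 10, 256)),
                          rng.standard_normal((3, 10, 256))))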

The videos are available in China on bilibili and youku.

Yet another successful example of direct brain-robot control of two NAO humanoids using two eight-command cVEP BCIs (project by Daiki AMINAKA)

This video shows direct brain control of two robots with visual cVEP BCIs. The project is a collaboration with the Cichocki Lab at RIKEN BSI.

Direct brain-robot control of two NAO humanoids using spatial auditory (Chisaki NAKAIZUMI's project) and vibrotactile (Hiroki YAJIMA's project) BCIs

This video shows direct brain control of two robots with auditory and tactile BCIs. The project is a collaboration with the Cichocki Lab at RIKEN BSI.

Eight-command cVEP BCI direct brain-robot control by Daiki AMINAKA

This video shows robot control with an eight-command cVEP BCI. In the demo the user steers the robot with the cVEP BCI. The project is a collaboration with the Cichocki Lab at RIKEN BSI.

Our attempt at "direct brain-robot" control of two NAO humanoids using cVEP (Daiki AMINAKA's project) and tactile pin-pressure (Kensuke SHIMIZU's project) BCIs

These videos show control of two robots with cVEP and tactile BCIs. In each demo the left user controls his robot with the tactile BCI (note the push-pin tactile stimulator under his right palm), while the right user controls his robot with the cVEP BCI (see the second frame on the floor displaying the cVEP patterns). The project is a collaboration with the Cichocki Lab at RIKEN BSI, and there is definitely more to come soon! Stay tuned ;)

Our first attempts to "brain-control" the NAO humanoid robot, by Kensuke

These videos present our first attempt to control a NAO humanoid robot using the tactile-pressure BCI developed by Kensuke SHIMIZU & Tomek. The paradigm still averages seven oddball sequences per command, which is why it remains slow; see the sketch below. The click noises are a "side effect" of the tactile pressure stimulus generator (also to be improved soon). The project is a collaboration with the Cichocki Lab at RIKEN BSI. More to come soon! Stay tuned ;)
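
To see why sequence averaging makes the interface slow: each averaged sequence adds stimulation time linearly, while the ERP signal-to-noise ratio improves only as the square root of the number of sequences. The command count and stimulus timing below are illustrative assumptions, not measured lab values.

    import math

    n_commands = 6   # assumed number of oddball stimulus classes
    soa = 0.4        # assumed stimulus onset asynchrony [s]

    for n_seq in (1, 4, 7):
        selection_time = n_seq * n_commands * soa  # time grows linearly with N
        snr_gain = math.sqrt(n_seq)                # noise shrinks as 1/sqrt(N)
        print(f"{n_seq} sequences: ~{selection_time:.1f} s per command, "
              f"SNR gain x{snr_gain:.2f}")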

Chromatic SSVEP BCI-based NAO robot control by Daiki

This video shows chromatic SSVEP BCI-based NAO robot control, the result of a collaboration between the BCI-lab group at the University of Tsukuba (Daiki AMINAKA, Kensuke SHIMIZU & Dr. Tomasz "Tomek" M. RUTKOWSKI) and the Cichocki Lab at RIKEN BSI (Peter JURICA and Dr. Andrzej Cichocki). This time we managed to control the NAO robot online using the chromatic SSVEP BCI developed by Daiki AMINAKA and Tomasz M. RUTKOWSKI.

Tactile-body BCI-based NAO robot control by Takumi

This video shows tactile-body BCI-based NAO robot control, the result of a collaboration between the BCI-lab group at the University of Tsukuba (Takumi KODAMA, Kensuke SHIMIZU & Dr. Tomasz "Tomek" M. RUTKOWSKI) and the Cichocki Lab at RIKEN BSI (Peter JURICA and Dr. Andrzej Cichocki). This time we managed to control the NAO robot online using the tactile-body BCI developed by Takumi KODAMA, under the supervision and advice of Dr. Tomasz M. RUTKOWSKI, using only four trials (sequences) for averaging, which matched the robot's command execution.

Japanese kana character spatial auditory BCI speller - a graduation demo by Moonjeong, MSc'15

This video presents the spatial auditory BCI speller developed by Moonjeong CHANG as her graduation project. It is a proof of concept of a Japanese kana (45-character) speller realized as a two-step input spatial auditory BCI.

The autdBCI and robot control (the winning project of The BCI Annual Research Award 2014)

This video presents the autdBCI controlling a small robot. We present results of a study in which contactless, airborne ultrasonic tactile display (AUTD) stimuli delivered to a user's palms serve as the platform for a brain-computer interface (autdBCI) paradigm. Six palm positions are used to evoke somatosensory brain responses, defining a novel contactless tactile autdBCI. The autdBCI won The BCI Research Award 2014 (http://www.bci-award.com/).
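
A minimal sketch of the final dispatch step, assuming a classifier that outputs one of six palm-position labels; the labels and robot commands below are hypothetical, not the actual autdBCI interface.

    # Hypothetical mapping from the six classified palm positions to commands
    # for the small robot; the real autdBCI mapping may differ.
    PALM_POSITION_TO_COMMAND = {
        "left_top": "turn_left",
        "left_middle": "forward",
        "left_bottom": "backward",
        "right_top": "turn_right",
        "right_middle": "stop",
        "right_bottom": "beep",
    }

    def dispatch(classified_position: str) -> str:
        # Fall back to "stop" for an unrecognized or low-confidence label.
        return PALM_POSITION_TO_COMMAND.get(classified_position, "stop")

    print(dispatch("left_middle"))  # -> forward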

Wheelchair control with a chromatic and higher-frequency ssvepBCI by Daiki AMINAKA

This demo was made in collaboration with the Cichocki Lab at the RIKEN Brain Science Institute, Japan.

In this video we show a preliminary trial of a research project in progress using chromatic and higher-frequency SSVEP stimuli.

Tactile-body BCI (tbBCI) control of a robot hand

"Spatial Tactile Brain-Computer Interface Paradigm Applying Vibration Stimuli to Large Areas of User's Back" - demo video accompanying a BCI Conference 2014 paper available at http://arxiv.org/abs/1404.4226. The video demonstrates the online tbBCI application with healthy users.

Virtual reality walk trial #2: a collaboration of Hiromu's tbaBCI, Kensuke's tfBCI & Waldir's VR app, tested and inspired by Tomek

The second collaborative output of our research group, combining Waldir's virtual reality (neurogaming) app with Hiromu's tbaBCI and Kensuke's tfBCI paradigms under Tomek's project leadership.

Virtual reality walk (by Waldir) using a tactile bone-conduction auditory BCI (by Hiromu), tested by Tomek

A perfect example of collaboration: Waldir's virtual reality (neurogaming) app combined with Hiromu's tbaBCI paradigm, developed in the BCI-lab group at the University of Tsukuba, Japan, under Tomek's project leadership.

Tactile-force BCI (tfBCI) 2013

A tactile-force BCI demo with a small vehicular robot by Shota & Tomek (please note that it is the joystick that moves, causing the tactile-force stimulus, not the hand, as can be seen at the end of the demo movie).

Chest Tactile BCI (ctBCI) 2013

Chest Tactile BCI (ctBCI) for vehicle robot navigation by Hiromu & Tomek (please note the vibrotactile transducers attached to the user's chest, which also generate acoustic noise as a side effect).

"Brain dreams Music" Performances 2011-2013

The "Brain dreams Music" project conducts the development of the new musical instrument which can be played by brain waves, the musical performance using this instrument, and the other related research which bridges music and neuroscience. Our instrument uses the most advanced Brain-Computer Interface (BCI) technology, that directly transforms the imagination of music into the realization of the imagined music. The audiovisual representation of the brain wave classification analysis facilitates the communication among the brain player, musicians, and audiences. This project is a joint research project between JST, ERATO, Okanoya Emotional Information Project, Tokyo University of the Arts, and BCI-lab-group at University of Tsukuba.

http://www.brain-dreams-music.net