Science of Human and Communication
Exhibition Program 24
Reading minds from unconscious eye movements
Decoding the implicit mind from fixational eye movements
Abstract
The eye is known as "a part of the brain" and reflects various emotional and perceptual states. Instead of using brain-imaging methods such as electroencephalography (EEG), we used eye-movement and pupil-size data, which can be measured with a high signal-to-noise ratio using low-cost equipment, to decode mental states. However, eye-movement data have been thought to convey less information than EEG data, owing to their signal properties and the smaller number of recording channels. Here, we developed new feature extraction methods for microsaccades (small, rapid, involuntary eye movements) and pupillary responses, based on a control-theoretic model. Computing these novel features as additional information enables us to decode various emotional and perceptual states (e.g., preference, attention, and drowsiness) from eye measurements. Using this technology, we aim to create an AI whose emotion-recognition ability is superior to that of human beings.
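The features above start from detecting microsaccades in a fixational gaze trace. As a rough illustration of what such a detector looks like, here is a minimal sketch of a widely used velocity-threshold approach (in the style of Engbert & Kliegl), not the feature-extraction method developed in this work; the function name, parameter values, and thresholds are illustrative assumptions.

```python
import numpy as np

def detect_microsaccades(x, y, fs=1000.0, lam=6.0, min_samples=3):
    """Sketch of velocity-threshold microsaccade detection.

    x, y : 1-D gaze-position traces (degrees of visual angle).
    fs   : sampling rate in Hz.
    lam  : multiplier on the robust velocity SD (illustrative value).
    Returns a list of (onset_index, offset_index) pairs.
    """
    # Smoothed velocity estimate over a 5-sample moving window.
    vx = fs * np.convolve(x, [1, 1, 0, -1, -1], mode="same") / 6.0
    vy = fs * np.convolve(y, [1, 1, 0, -1, -1], mode="same") / 6.0

    # Median-based (outlier-robust) estimate of the velocity spread.
    sx = np.sqrt(np.median(vx**2) - np.median(vx)**2)
    sy = np.sqrt(np.median(vy**2) - np.median(vy)**2)

    # Samples outside an elliptical threshold in 2-D velocity space.
    above = (vx / (lam * sx))**2 + (vy / (lam * sy))**2 > 1.0

    # Group supra-threshold samples into events of sufficient duration.
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_samples:
                events.append((start, i - 1))
            start = None
    if start is not None and len(above) - start >= min_samples:
        events.append((start, len(above) - 1))
    return events
```

Because the threshold scales with each recording's own velocity noise, the same code can be applied across observers and equipment without hand-tuning a fixed cutoff.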
Presenters
Makoto Yoneya
Human Information Science Laboratory
Hsin-I Liao
Human Information Science Laboratory
Yuuki Oishi
Human Information Science Laboratory