Quantified reading and learning for sharing experiences

Koichi Kise, Olivier Augereau, Yuzuko Utsumi, Masakazu Iwamura, Kai Steven Kunze, Shoya Ishimaru, Andreas Dengel

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

This paper presents two topics. The first is an overview of our recently started project called "experiential supplement", which aims to transfer human experiences by recording and processing them into a form acceptable to others. The second is sensing technologies for producing experiential supplements in the context of learning. Because reading is a basic activity of learning, we also deal with sensing of reading. Methods for quantifying reading in terms of the number of words read, the period of reading, and the type of documents read, as well as for identifying read words, are shown with experimental results. As for learning, we propose methods for estimating English ability, confidence in answers to English questions, and unknown words. The above are sensed by various sensors, including eye trackers, EOG, EEG, and first-person vision. Copyright held by the owner/author(s).

Original language: English
Title of host publication: UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers
Publisher: Association for Computing Machinery, Inc
Pages: 724-731
Number of pages: 8
ISBN (Electronic): 9781450351904
DOIs: 10.1145/3123024.3129274
Publication status: Published - 2017 Sep 11
Event: 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and ACM International Symposium on Wearable Computers, UbiComp/ISWC 2017 - Maui, United States
Duration: 2017 Sep 11 - 2017 Sep 15



Keywords

  • Confidence
  • E-learning
  • EOG
  • Eye-tracking
  • Human experience
  • Read word identification
  • Reading detection
  • TOEIC
  • Unknown words
  • Wordometer

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture
  • Computer Networks and Communications

Cite this

Kise, K., Augereau, O., Utsumi, Y., Iwamura, M., Kunze, K. S., Ishimaru, S., & Dengel, A. (2017). Quantified reading and learning for sharing experiences. In UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers (pp. 724-731). Association for Computing Machinery, Inc. https://doi.org/10.1145/3123024.3129274

@inproceedings{fa716f8ea956445888852126efed67d7,
title = "Quantified reading and learning for sharing experiences",
abstract = "This paper presents two topics. The first is an overview of our recently started project called {"}experiential supplement{"}, which aims to transfer human experiences by recording and processing them into a form acceptable to others. The second is sensing technologies for producing experiential supplements in the context of learning. Because reading is a basic activity of learning, we also deal with sensing of reading. Methods for quantifying reading in terms of the number of words read, the period of reading, and the type of documents read, as well as for identifying read words, are shown with experimental results. As for learning, we propose methods for estimating English ability, confidence in answers to English questions, and unknown words. The above are sensed by various sensors, including eye trackers, EOG, EEG, and first-person vision. Copyright held by the owner/author(s).",
keywords = "Confidence, E-learning, EOG, Eye-tracking, Human experience, Read word identification, Reading detection, TOEIC, Unknown words, Wordometer",
author = "Koichi Kise and Olivier Augereau and Yuzuko Utsumi and Masakazu Iwamura and Kunze, {Kai Steven} and Shoya Ishimaru and Andreas Dengel",
year = "2017",
month = "9",
day = "11",
doi = "10.1145/3123024.3129274",
language = "English",
pages = "724--731",
booktitle = "UbiComp/ISWC 2017 - Adjunct Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers",
publisher = "Association for Computing Machinery, Inc",

}
