A context-based emotion-analyzer for teaching tonality in music courses

Aya Ichinose, Shuichi Kurabayashi, Yasushi Kiyoki

Research output: Conference contribution

Abstract

This paper presents a context-based emotion-analyzer for supporting the learning of tonality in music courses. The emotion-analyzer provides a new music retrieval environment for finding and visualizing music items while taking genre-dependent perceptual preferences into account. The system generates emotive annotations for music by analyzing tonality along a timeline, helping users identify the tonality of a piece from the viewpoint of the emotions it evokes. Tonality is a musical system constructed from sound elements such as harmonies and melodies, and changes in tonality cause changes in impression. The system includes an automatic time-duration selector that detects repetitions and bridges by analyzing physical and structural music features such as pitch and tonality, and it lets users submit emotive keywords as queries to retrieve music according to the changes of impression within each piece. The paper describes a prototype system that analyzes and searches MIDI music files automatically, and presents several experimental results that clarify the feasibility of the system.
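
As an illustration only, and not the method described in the paper, the sketch below shows one conventional way to derive a tonality timeline from a MIDI file and attach a rough impression word to each segment: pitch-class histograms are computed per fixed time window, correlated against the standard Krumhansl-Kessler major/minor key profiles to estimate a key, and the estimated mode is mapped to a crude emotive label. The mido library, the five-second window, the example file name, and the mode-to-emotion mapping are all assumptions made for this sketch; the paper's analyzer additionally models genre-dependent preferences and detects repetitions and bridges, which this sketch does not attempt.

# Illustrative sketch only (not the paper's analyzer): estimate a key for each
# fixed-length window of a MIDI file and map the mode to a crude impression word.
# Assumptions: the mido library, a 5-second window, the Krumhansl-Kessler key
# profiles, and the mode-to-emotion mapping are all choices made for this sketch.

import math
from collections import defaultdict

import mido

NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F', 'F#', 'G', 'G#', 'A', 'A#', 'B']

# Krumhansl-Kessler tonal-hierarchy profiles, index 0 = tonic.
MAJOR_PROFILE = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR_PROFILE = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]

def correlation(x, y):
    # Pearson correlation between two 12-element vectors.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0

def estimate_key(pc_weights):
    # Rotate the pitch-class histogram so each candidate tonic sits at index 0,
    # then pick the (tonic, mode) whose profile correlates best with it.
    best = (-2.0, None, None)
    for tonic in range(12):
        rotated = pc_weights[tonic:] + pc_weights[:tonic]
        for mode, profile in (('major', MAJOR_PROFILE), ('minor', MINOR_PROFILE)):
            r = correlation(rotated, profile)
            if r > best[0]:
                best = (r, NOTE_NAMES[tonic], mode)
    return best[1], best[2]

def key_timeline(path, window=5.0):
    # Iterating a MidiFile yields messages in playback order with .time as the
    # delta in seconds, so accumulating it gives an absolute position.
    histograms = defaultdict(lambda: [0.0] * 12)
    now = 0.0
    for msg in mido.MidiFile(path):
        now += msg.time
        if msg.type == 'note_on' and msg.velocity > 0:
            histograms[int(now // window)][msg.note % 12] += 1.0
    timeline = []
    for idx in sorted(histograms):
        tonic, mode = estimate_key(histograms[idx])
        # Crude placeholder mapping; the paper's emotion model is genre-dependent.
        label = 'bright / happy' if mode == 'major' else 'dark / sad'
        timeline.append((idx * window, tonic + ' ' + mode, label))
    return timeline

if __name__ == '__main__':
    for start, key, label in key_timeline('example.mid'):  # hypothetical input file
        print('%6.1fs  %-9s %s' % (start, key, label))

A system along the lines described in the abstract would align analysis segments with the detected repetitions and bridges rather than with fixed-length windows, and would weight the emotive labels by genre-dependent preference rather than a simple major/minor rule.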

Original language: English
Title of host publication: Proceedings of the IASTED International Conference on Technology for Education, TE 2011
Pages: 37-45
Number of pages: 9
DOI: 10.2316/P.2011.754-041
Publication status: Published - 2011
Event: IASTED International Conference on Technology for Education, TE 2011 - Dallas, TX, United States
Duration: 14 December 2011 - 16 December 2011

ASJC Scopus subject areas

  • Education

Cite this

Ichinose, A., Kurabayashi, S., & Kiyoki, Y. (2011). A context-based emotion-analyzer for teaching tonality in music courses. In Proceedings of the IASTED International Conference on Technology for Education, TE 2011 (pp. 37-45). https://doi.org/10.2316/P.2011.754-041

@inproceedings{9c1dea065e05438bae0fdfbffb4d2366,
title = "A context-based emotion-analyzer for teaching tonality in music courses",
abstract = "This paper presents a context-based emotion-analyzer for supporting the learning of tonality in music courses. The emotion-analyzer provides a new music retrieval environment for finding and visualizing music items while taking genre-dependent perceptual preferences into account. The system generates emotive annotations for music by analyzing tonality along a timeline, helping users identify the tonality of a piece from the viewpoint of the emotions it evokes. Tonality is a musical system constructed from sound elements such as harmonies and melodies, and changes in tonality cause changes in impression. The system includes an automatic time-duration selector that detects repetitions and bridges by analyzing physical and structural music features such as pitch and tonality, and it lets users submit emotive keywords as queries to retrieve music according to the changes of impression within each piece. The paper describes a prototype system that analyzes and searches MIDI music files automatically, and presents several experimental results that clarify the feasibility of the system.",
keywords = "E-learning, Multimedia information systems, Music course-ware, Technology for education, Visualization",
author = "Aya Ichinose and Shuichi Kurabayashi and Yasushi Kiyoki",
year = "2011",
doi = "10.2316/P.2011.754-041",
language = "English",
isbn = "9780889868908",
pages = "37--45",
booktitle = "Proceedings of the IASTED International Conference on Technology for Education, TE 2011",

}
