A context-based emotion-analyzer for teaching tonality in music courses

Aya Ichinose, Shuichi Kurabayashi, Yasushi Kiyoki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper presents a context-based emotion analyzer dedicated to supporting the learning of tonality in music courses. The emotion analyzer realizes a new music-retrieval environment for finding and visualizing music items while considering genre-dependent perceptual preferences. The system generates emotive annotations for music by analyzing tonality along a timeline, which helps users identify musical tonality from the viewpoint of emotions. Tonality is a musical system constructed from sound elements such as harmonies and melodies, and a change of tonality causes a change of impression. The system realizes an automatic time-duration selector that detects repetitions and bridges by analyzing physical and structural music features, such as pitch and tonality, and it enables users to submit emotive keywords as queries for retrieving music according to the impression changes in each musical piece. This paper describes a prototype system that searches MIDI music files by analyzing them automatically, and presents several experimental results that clarify the feasibility of the system.
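The record does not detail how tonality is extracted from MIDI. As an illustrative sketch only (not the authors' algorithm), the following shows one standard approach to key/tonality estimation, the Krumhansl-Schmuckler profile-matching method: build a duration-weighted pitch-class histogram from the note events and correlate it against rotated major and minor key profiles. The `estimate_key` function and its `(midi_pitch, duration)` input format are assumptions introduced here for illustration.

```python
# Illustrative sketch: Krumhansl-Schmuckler key estimation from note events.
# This is NOT the paper's implementation; it only demonstrates one common way
# to derive a tonality label from pitch content.

# Krumhansl-Kessler probe-tone profiles for major and minor keys (tonic = index 0).
MAJOR = [6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88]
MINOR = [6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17]
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def estimate_key(notes):
    """Estimate the key of `notes`, a list of (midi_pitch, duration) pairs."""
    # Duration-weighted pitch-class histogram (12 bins, C = 0).
    hist = [0.0] * 12
    for pitch, dur in notes:
        hist[pitch % 12] += dur
    best = None
    for tonic in range(12):
        for profile, mode in ((MAJOR, "major"), (MINOR, "minor")):
            # Rotate the profile so its tonic lines up with pitch class `tonic`.
            rotated = profile[-tonic:] + profile[:-tonic]
            r = pearson(hist, rotated)
            if best is None or r > best[0]:
                best = (r, f"{NAMES[tonic]} {mode}")
    return best[1]

# Example: an ascending C-major scale of equal-duration notes.
scale = [(p, 1.0) for p in [60, 62, 64, 65, 67, 69, 71, 72]]
print(estimate_key(scale))  # "C major"
```

Running this per time window along the piece, rather than once over the whole file, would yield the kind of timeline of tonality changes the abstract describes.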

Original language: English
Title of host publication: Proceedings of the IASTED International Conference on Technology for Education, TE 2011
Pages: 37-45
Number of pages: 9
DOIs: 10.2316/P.2011.754-041
Publication status: Published - 2011
Event: IASTED International Conference on Technology for Education, TE 2011 - Dallas, TX, United States
Duration: 2011 Dec 14 - 2011 Dec 16

Other

Other: IASTED International Conference on Technology for Education, TE 2011
Country: United States
City: Dallas, TX
Period: 11/12/14 - 11/12/16

Keywords

  • E-learning
  • Multimedia information systems
  • Music course-ware
  • Technology for education
  • Visualization

ASJC Scopus subject areas

  • Education

Cite this

Ichinose, A., Kurabayashi, S., & Kiyoki, Y. (2011). A context-based emotion-analyzer for teaching tonality in music courses. In Proceedings of the IASTED International Conference on Technology for Education, TE 2011 (pp. 37-45) https://doi.org/10.2316/P.2011.754-041

@inproceedings{9c1dea065e05438bae0fdfbffb4d2366,
title = "A context-based emotion-analyzer for teaching tonality in music courses",
abstract = "This paper presents a context-based emotion analyzer dedicated to supporting the learning of tonality in music courses. The emotion analyzer realizes a new music-retrieval environment for finding and visualizing music items while considering genre-dependent perceptual preferences. The system generates emotive annotations for music by analyzing tonality along a timeline, which helps users identify musical tonality from the viewpoint of emotions. Tonality is a musical system constructed from sound elements such as harmonies and melodies, and a change of tonality causes a change of impression. The system realizes an automatic time-duration selector that detects repetitions and bridges by analyzing physical and structural music features, such as pitch and tonality, and it enables users to submit emotive keywords as queries for retrieving music according to the impression changes in each musical piece. This paper describes a prototype system that searches MIDI music files by analyzing them automatically, and presents several experimental results that clarify the feasibility of the system.",
keywords = "E-learning, Multimedia information systems, Music course-ware, Technology for education, Visualization",
author = "Aya Ichinose and Shuichi Kurabayashi and Yasushi Kiyoki",
year = "2011",
doi = "10.2316/P.2011.754-041",
language = "English",
isbn = "9780889868908",
pages = "37--45",
booktitle = "Proceedings of the IASTED International Conference on Technology for Education, TE 2011",

}

TY - GEN

T1 - A context-based emotion-analyzer for teaching tonality in music courses

AU - Ichinose, Aya

AU - Kurabayashi, Shuichi

AU - Kiyoki, Yasushi

PY - 2011

Y1 - 2011

N2 - This paper presents a context-based emotion analyzer dedicated to supporting the learning of tonality in music courses. The emotion analyzer realizes a new music-retrieval environment for finding and visualizing music items while considering genre-dependent perceptual preferences. The system generates emotive annotations for music by analyzing tonality along a timeline, which helps users identify musical tonality from the viewpoint of emotions. Tonality is a musical system constructed from sound elements such as harmonies and melodies, and a change of tonality causes a change of impression. The system realizes an automatic time-duration selector that detects repetitions and bridges by analyzing physical and structural music features, such as pitch and tonality, and it enables users to submit emotive keywords as queries for retrieving music according to the impression changes in each musical piece. This paper describes a prototype system that searches MIDI music files by analyzing them automatically, and presents several experimental results that clarify the feasibility of the system.

AB - This paper presents a context-based emotion analyzer dedicated to supporting the learning of tonality in music courses. The emotion analyzer realizes a new music-retrieval environment for finding and visualizing music items while considering genre-dependent perceptual preferences. The system generates emotive annotations for music by analyzing tonality along a timeline, which helps users identify musical tonality from the viewpoint of emotions. Tonality is a musical system constructed from sound elements such as harmonies and melodies, and a change of tonality causes a change of impression. The system realizes an automatic time-duration selector that detects repetitions and bridges by analyzing physical and structural music features, such as pitch and tonality, and it enables users to submit emotive keywords as queries for retrieving music according to the impression changes in each musical piece. This paper describes a prototype system that searches MIDI music files by analyzing them automatically, and presents several experimental results that clarify the feasibility of the system.

KW - E-learning

KW - Multimedia information systems

KW - Music course-ware

KW - Technology for education

KW - Visualization

UR - http://www.scopus.com/inward/record.url?scp=84862288716&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84862288716&partnerID=8YFLogxK

U2 - 10.2316/P.2011.754-041

DO - 10.2316/P.2011.754-041

M3 - Conference contribution

SN - 9780889868908

SP - 37

EP - 45

BT - Proceedings of the IASTED International Conference on Technology for Education, TE 2011

ER -