An automatic metadata generation method for music retrieval-by-impression dealing with impression-transition

Asako Ijichi, Yasushi Kiyoki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In the design of multimedia database systems, one of the most important issues is how to deal with "Kansei" and "impression" in human beings. The concepts of "Kansei" and "impression" include several aspects of sensitive recognition, such as human senses, feelings, sensitivity, and psychological reactions. In this paper, we propose an automatic metadata-generation method for extracting impressions of music, such as "agitated," "joyous," "lyrical," "melancholy," and "sentimental," for semantically retrieving music data according to human impressions. We also present an impression-metadata-generation mechanism for reflecting the impression transition that occurs as time passes, that is, the temporal transition of a story in music (music-story). This mechanism is used to compute the impression-strength reflecting the impression transition, that is, the "impression-stream" as a temporal transition of a music-story. Our automatic metadata generation for a music-story consists of the following processes: (1) division of a music-story into sections; (2) impression-metadata extraction for each section; (3) computation of the impression-strength of the impression-metadata; (4) weighting of the impression-metadata according to impression-strength; and (5) combination of the impression-metadata to adjust it to a query structure. Music data with a story consists of several sections, and each section gives an individual impression. The combination of sections gives a global impression of the music data. Our metadata-generation method computes correlations between music data and impression words by reflecting the degree of change of impressions among continuous sections. This paper shows several experimental results of metadata generation to clarify the feasibility and effectiveness of our method.
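
The five processes above can be sketched as a small pipeline. This is a minimal illustrative sketch only: the section-splitting rule, the per-section scoring, and the transition-based weighting are all assumptions for demonstration, not the authors' actual method.

```python
# Hypothetical sketch of the five-step metadata-generation pipeline.
# Function names, the feature representation (a flat list of frame values),
# and the weighting scheme are illustrative assumptions.

IMPRESSION_WORDS = ["agitated", "joyous", "lyrical", "melancholy", "sentimental"]

def divide_into_sections(music_story, n_sections):
    """Step 1: split a music-story (here, a list of feature frames)
    into roughly equal-sized sections."""
    size = max(1, len(music_story) // n_sections)
    return [music_story[i:i + size] for i in range(0, len(music_story), size)]

def extract_impression_metadata(section):
    """Step 2: map a section to scores over the impression vocabulary.
    Stand-in scoring: scale the section's average feature value per word."""
    avg = sum(section) / len(section)
    return {word: avg * (k + 1) / len(IMPRESSION_WORDS)
            for k, word in enumerate(IMPRESSION_WORDS)}

def impression_strength(prev_meta, meta):
    """Step 3: impression-strength reflecting the transition from the
    previous section -- here, a larger change yields a larger strength."""
    if prev_meta is None:
        return 1.0
    change = sum(abs(meta[w] - prev_meta[w]) for w in IMPRESSION_WORDS)
    return 1.0 + change

def generate_metadata(music_story, n_sections=4):
    """Steps 4-5: weight each section's metadata by its impression-strength
    and combine into one vector correlating the piece with each word."""
    combined = {w: 0.0 for w in IMPRESSION_WORDS}
    prev, total_weight = None, 0.0
    for section in divide_into_sections(music_story, n_sections):
        meta = extract_impression_metadata(section)
        weight = impression_strength(prev, meta)   # step 3
        for w in IMPRESSION_WORDS:
            combined[w] += weight * meta[w]        # step 4
        total_weight += weight
        prev = meta
    # Step 5: normalize so the combined metadata can match a query structure.
    return {w: combined[w] / total_weight for w in IMPRESSION_WORDS}
```

A weighting that grows with the amount of change between continuous sections is one simple way to make the combined metadata reflect impression transition, as the abstract describes.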

Original language: English
Title of host publication: Proceedings of the Seventh IASTED International Conference on Internet and Multimedia Systems and Applications
Editors: M.H. Hamza
Pages: 281-288
Number of pages: 8
Publication status: Published - 2003 Dec 1
Event: Proceedings of the Seventh IASTED International Conference on Internet and Multimedia Systems and Applications - Honolulu, HI, United States
Duration: 2003 Aug 13 - 2003 Aug 15

Publication series

Name: Proceedings of the IASTED International Conference on Internet and Multimedia Systems and Applications
Volume: 7

Other

Other: Proceedings of the Seventh IASTED International Conference on Internet and Multimedia Systems and Applications
Country: United States
City: Honolulu, HI
Period: 03/8/13 - 03/8/15

Keywords

  • Impression
  • Metadata
  • Music database
  • Story

ASJC Scopus subject areas

  • Engineering(all)

Cite this

Ijichi, A., & Kiyoki, Y. (2003). An automatic metadata generation method for music retrieval-by-impression dealing with impression-transition. In M. H. Hamza (Ed.), Proceedings of the Seventh IASTED International Conference on Internet and Multimedia Systems and Applications (pp. 281-288). (Proceedings of the IASTED International Conference on Internet and Multimedia Systems and Applications; Vol. 7).