An automatic metadata generation method for music retrieval-by-impression dealing with impression-transition

Asako Ijichi, Yasushi Kiyoki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In the design of multimedia database systems, one of the most important issues is how to deal with "Kansei" and "impression" in human beings. The concepts of "Kansei" and "impression" encompass several aspects of sensitive recognition, such as human senses, feelings, sensitivity, and psychological reactions. In this paper, we propose an automatic metadata-generation method for extracting impressions of music, such as "agitated," "joyous," "lyrical," "melancholy," and "sentimental," for semantically retrieving music data according to human impressions. We also present an impression-metadata-generation mechanism for reflecting the impression transitions that occur as time passes, that is, the temporal transition of a story in music (a music-story). This mechanism computes the impression-strength reflecting the impression transition, that is, the "impression-stream" as a temporal transition of a music-story. Our automatic metadata generation for a music-story consists of the following processes: (1) division of a music-story into sections; (2) impression-metadata extraction for each section; (3) computation of the impression-strength of the impression-metadata; (4) weighting of the impression-metadata according to impression-strength; and (5) combination of the impression-metadata to adjust them to a query structure. Music data with a story consists of several sections, and each section gives an individual impression. The combination of sections gives the global impression of the music data. Our metadata-generation method computes correlations between music data and impression words by reflecting the degree of change of impressions among consecutive sections. This paper shows several experimental results of metadata generation to clarify the feasibility and effectiveness of our method.
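The five-step pipeline described above can be sketched in a few lines of code. Everything here is an illustrative assumption rather than the authors' actual computation: the per-section impression vectors over the five example impression words, the strength formula (a base level plus the degree of change from the previous section), and the function names are all hypothetical.

```python
# Hypothetical sketch of the abstract's pipeline, steps (3)-(5):
# compute a transition-based impression-strength per section, weight each
# section's impression vector by it, and combine into one music-story vector.
# Vectors, formula, and names are illustrative assumptions, not the paper's.
from typing import List

IMPRESSION_WORDS = ["agitated", "joyous", "lyrical", "melancholy", "sentimental"]

def impression_strength(prev: List[float], curr: List[float]) -> float:
    # Assumed measure: strength grows with the degree of change from the
    # previous section (the impression transition), plus a base level of 1.
    change = sum(abs(c - p) for p, c in zip(prev, curr))
    return 1.0 + change

def combine_sections(sections: List[List[float]]) -> List[float]:
    # Weight each section vector by its impression-strength, then normalize,
    # so sections that mark a strong transition contribute more to the
    # global impression of the music-story.
    combined = [0.0] * len(IMPRESSION_WORDS)
    total = 0.0
    prev = sections[0]
    for curr in sections:
        w = impression_strength(prev, curr)
        for i, v in enumerate(curr):
            combined[i] += w * v
        total += w
        prev = curr
    return [v / total for v in combined]

# Usage: a piece whose first section reads "agitated" and whose second
# shifts to "joyous"; the shift gives the second section a higher weight.
story = [[1, 0, 0, 0, 0], [0, 1, 0, 0, 0]]
global_impression = combine_sections(story)
```

Under this sketch, two identical sections reproduce their shared vector unchanged, while a section that departs sharply from its predecessor is up-weighted, which is the intuition behind reflecting the "impression-stream" in the final metadata.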

Original language: English
Title of host publication: Proceedings of the IASTED International Conference on Internet and Multimedia Systems and Applications
Editors: M.H. Hamza
Pages: 281-288
Number of pages: 8
Volume: 7
ISBN (Print): 0889863806
Publication status: Published - 2003
Event: Seventh IASTED International Conference on Internet and Multimedia Systems and Applications - Honolulu, HI, United States
Duration: 2003 Aug 13 - 2003 Aug 15



Keywords

  • Impression
  • Metadata
  • Music database
  • Story

ASJC Scopus subject areas

  • Development
  • Computer Networks and Communications

Cite this

Ijichi, A., & Kiyoki, Y. (2003). An automatic metadata generation method for music retrieval-by-impression dealing with impression-transition. In M. H. Hamza (Ed.), Proceedings of the IASTED International Conference on Internet and Multimedia Systems and Applications (Vol. 7, pp. 281-288).

@inproceedings{9f7cb1c05bdb402bbe886a552484481a,
title = "An automatic metadata generation method for music retrieval-by-impression dealing with impression-transition",
keywords = "Impression, Metadata, Music database, Story",
author = "Asako Ijichi and Yasushi Kiyoki",
year = "2003",
language = "English",
isbn = "0889863806",
volume = "7",
pages = "281--288",
editor = "M.H. Hamza",
booktitle = "Proceedings of the IASTED International Conference on Internet and Multimedia Systems and Applications",

}

Scopus record: http://www.scopus.com/inward/record.url?scp=1542433063&partnerID=8YFLogxK (AN - SCOPUS:1542433063)