Kohonen feature maps as a supervised learning machine

Hiroyuki Ichiki, Masafumi Hagiwara, Masao Nakagawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

31 Citations (Scopus)

Abstract

Kohonen Feature Maps used as a supervised learning machine are proposed and discussed. The proposed models perform supervised learning without modifying the basic Kohonen learning algorithm. They therefore behave as a supervised learning machine that can learn input-output functions, in addition to retaining the characteristics of the conventional approach, namely building a pattern-recognition system after preprocessing by the Kohonen Feature Map. Moreover, the proposed models do not distinguish the input vectors from the desired vectors; both are treated as the same kind of vector, which enables bidirectional association. Several examples were simulated to compare the proposed models with conventional supervised learning machines in terms of properties such as robustness to noise and storage capacity. The results confirm that the proposed models have better characteristics than the conventional models (a backpropagation network for pattern recognition and a BAM for associative memory).
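Below is a minimal Python sketch of one common reading of the description above: the input vector and the desired vector are concatenated into a single joint vector, a standard Kohonen map is trained on it with the learning rule unchanged, and recall presents only the known part of the vector to select the winning unit, whose weights supply the missing part. The grid size, learning-rate schedule, toy data, and helper names are illustrative assumptions, not the paper's settings.

# Hedged sketch, not the authors' exact formulation: a plain Kohonen map
# trained on joint vectors [x | d], recalled by masked matching.
import numpy as np

rng = np.random.default_rng(0)

# Toy association task: 4-bit input patterns x and 2-bit desired vectors d.
X = rng.integers(0, 2, size=(40, 4)).astype(float)
D = np.stack([X[:, 0], 1.0 - X[:, 0]], axis=1)          # desired vectors
V = np.concatenate([X, D], axis=1)                       # joint vectors [x | d]

grid_h, grid_w, dim = 6, 6, V.shape[1]
W = rng.random((grid_h, grid_w, dim))                    # map weights
coords = np.stack(np.meshgrid(np.arange(grid_h), np.arange(grid_w),
                              indexing="ij"), axis=-1).astype(float)

def bmu(v, mask):
    """Best-matching unit, computed only over the components selected by mask."""
    diff = (W - v) * mask
    dist = np.sum(diff ** 2, axis=-1)
    return np.unravel_index(np.argmin(dist), dist.shape)

# Training: ordinary Kohonen updates on the full joint vector.
for t in range(2000):
    v = V[rng.integers(len(V))]
    lr = 0.5 * (1.0 - t / 2000)                          # decaying learning rate
    sigma = 3.0 * (1.0 - t / 2000) + 0.5                 # shrinking neighbourhood
    win = bmu(v, np.ones(dim))
    h = np.exp(-np.sum((coords - np.array(win)) ** 2, axis=-1) / (2 * sigma ** 2))
    W += lr * h[..., None] * (v - W)

# Forward recall: present x only, read the d-part of the winner's weights.
mask_x = np.concatenate([np.ones(4), np.zeros(2)])
probe = np.concatenate([X[0], np.zeros(2)])
print("recalled d:", np.round(W[bmu(probe, mask_x)][4:], 2), "target:", D[0])

# Backward recall: present d only, read the x-part (bidirectional association).
mask_d = 1.0 - mask_x
probe = np.concatenate([np.zeros(4), D[0]])
print("recalled x:", np.round(W[bmu(probe, mask_d)][:4], 2))

Because the winner is chosen with a mask, the same trained map answers both forward (x to d) and backward (d to x) queries, which is one way to obtain the bidirectional association the abstract mentions.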

Original language: English
Title of host publication: 1993 IEEE International Conference on Neural Networks
Publisher: Publ by IEEE
Pages: 1944-1948
Number of pages: 5
ISBN (Print): 0780312007
Publication status: Published - 1993
Event: 1993 IEEE International Conference on Neural Networks - San Francisco, California, USA
Duration: 1993 Mar 28 - 1993 Apr 1

Other

Other: 1993 IEEE International Conference on Neural Networks
City: San Francisco, California, USA
Period: 93/3/28 - 93/4/1

Fingerprint

Supervised learning
Pattern recognition
Backpropagation
Learning algorithms
Data storage equipment

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Ichiki, H., Hagiwara, M., & Nakagawa, M. (1993). Kohonen feature maps as a supervised learning machine. In 1993 IEEE International Conference on Neural Networks (pp. 1944-1948). Publ by IEEE.

Kohonen feature maps as a supervised learning machine. / Ichiki, Hiroyuki; Hagiwara, Masafumi; Nakagawa, Masao.

1993 IEEE International Conference on Neural Networks. Publ by IEEE, 1993. p. 1944-1948.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Ichiki, H, Hagiwara, M & Nakagawa, M 1993, Kohonen feature maps as a supervised learning machine. in 1993 IEEE International Conference on Neural Networks. Publ by IEEE, pp. 1944-1948, 1993 IEEE International Conference on Neural Networks, San Francisco, California, USA, 93/3/28.
Ichiki H, Hagiwara M, Nakagawa M. Kohonen feature maps as a supervised learning machine. In 1993 IEEE International Conference on Neural Networks. Publ by IEEE. 1993. p. 1944-1948
Ichiki, Hiroyuki ; Hagiwara, Masafumi ; Nakagawa, Masao. / Kohonen feature maps as a supervised learning machine. 1993 IEEE International Conference on Neural Networks. Publ by IEEE, 1993. pp. 1944-1948
@inproceedings{f553bad15acf41179e2979b299ac927f,
title = "Kohonen feature maps as a supervised learning machine",
abstract = "Kohonen Feature Maps used as a supervised learning machine are proposed and discussed. The proposed models perform supervised learning without modifying the basic Kohonen learning algorithm. They therefore behave as a supervised learning machine that can learn input-output functions, in addition to retaining the characteristics of the conventional approach, namely building a pattern-recognition system after preprocessing by the Kohonen Feature Map. Moreover, the proposed models do not distinguish the input vectors from the desired vectors; both are treated as the same kind of vector, which enables bidirectional association. Several examples were simulated to compare the proposed models with conventional supervised learning machines in terms of properties such as robustness to noise and storage capacity. The results confirm that the proposed models have better characteristics than the conventional models (a backpropagation network for pattern recognition and a BAM for associative memory).",
author = "Hiroyuki Ichiki and Masafumi Hagiwara and Masao Nakagawa",
year = "1993",
language = "English",
isbn = "0780312007",
pages = "1944--1948",
booktitle = "1993 IEEE International Conference on Neural Networks",
publisher = "Publ by IEEE",

}

TY - GEN

T1 - Kohonen feature maps as a supervised learning machine

AU - Ichiki, Hiroyuki

AU - Hagiwara, Masafumi

AU - Nakagawa, Masao

PY - 1993

Y1 - 1993

N2 - Kohonen Feature Maps used as a supervised learning machine are proposed and discussed. The proposed models perform supervised learning without modifying the basic Kohonen learning algorithm. They therefore behave as a supervised learning machine that can learn input-output functions, in addition to retaining the characteristics of the conventional approach, namely building a pattern-recognition system after preprocessing by the Kohonen Feature Map. Moreover, the proposed models do not distinguish the input vectors from the desired vectors; both are treated as the same kind of vector, which enables bidirectional association. Several examples were simulated to compare the proposed models with conventional supervised learning machines in terms of properties such as robustness to noise and storage capacity. The results confirm that the proposed models have better characteristics than the conventional models (a backpropagation network for pattern recognition and a BAM for associative memory).

AB - Kohonen Feature Maps used as a supervised learning machine are proposed and discussed. The proposed models perform supervised learning without modifying the basic Kohonen learning algorithm. They therefore behave as a supervised learning machine that can learn input-output functions, in addition to retaining the characteristics of the conventional approach, namely building a pattern-recognition system after preprocessing by the Kohonen Feature Map. Moreover, the proposed models do not distinguish the input vectors from the desired vectors; both are treated as the same kind of vector, which enables bidirectional association. Several examples were simulated to compare the proposed models with conventional supervised learning machines in terms of properties such as robustness to noise and storage capacity. The results confirm that the proposed models have better characteristics than the conventional models (a backpropagation network for pattern recognition and a BAM for associative memory).

UR - http://www.scopus.com/inward/record.url?scp=0027308562&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0027308562&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:0027308562

SN - 0780312007

SP - 1944

EP - 1948

BT - 1993 IEEE International Conference on Neural Networks

PB - Publ by IEEE

ER -