TY - GEN
T1 - Kohonen feature maps as a supervised learning machine
AU - Ichiki, Hiroyuki
AU - Hagiwara, Masafumi
AU - Nakagawa, Masao
PY - 1993/1/1
Y1 - 1993/1/1
AB - Kohonen Feature Maps as a Supervised Learning Machine are proposed and discussed. The proposed models adopt supervised learning without modifying the basic learning algorithm. They therefore behave as a supervised learning machine that can learn input-output functions while retaining the characteristics of the conventional approach, namely building a pattern recognizer after preprocessing by the Kohonen Feature Map. In addition, the proposed models do not distinguish input vectors from desired vectors, treating both as the same kind of vector; this enables bidirectional association. Several examples were simulated to compare the proposed models with conventional supervised learning machines with respect to, for example, noise tolerance and storage capacity. The results indicate the effectiveness of the proposed models. We confirmed that the proposed models have better characteristics than the conventional models (the backpropagation network for pattern recognition and the BAM for associative memory).
UR - http://www.scopus.com/inward/record.url?scp=0027308562&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0027308562&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:0027308562
SN - 0780312007
T3 - 1993 IEEE International Conference on Neural Networks
SP - 1944
EP - 1948
BT - 1993 IEEE International Conference on Neural Networks
PB - IEEE
T2 - 1993 IEEE International Conference on Neural Networks
Y2 - 28 March 1993 through 1 April 1993
ER -