On flexible neural networks: Some system-theoretic properties and a new class

Yazdan Bavafa-Toosi, Hiromitsu Ohmori

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Although flexible neural networks (FNNs) have been used more successfully than classical neural networks (CNNs), nothing is rigorously known about their properties. In fact, they are not even well known to the systems and control community. In this paper, theoretical evidence is given for their superiority over CNNs. Following an overview of flexible bipolar sigmoid functions (FBSFs), several fundamental properties of feedforward and recurrent FNNs are established. For the feedforward case, it is proven that similar to CNNs, FNNs with as few as a single hidden layer (SHL) are universal approximators. It is also proven that unlike irreducible SHL CBSNNs, irreducible SHL FBSNNs are nonuniquely determined by their input-output (I-O) maps, up to a finite group of symmetries. Then, recurrent FNNs are introduced. It is observed that they can be interpreted as a generalization of the conventional state-space framework. For the recurrent case, it is substantiated that similar to CBSNNs, FBSNNs are universal approximators. Necessary and sufficient conditions for the controllability and observability of a generic class of them are established. For a subclass of this class, it is proven that unlike CBSNNs, FBSNNs are nonuniquely determined by their I-O maps, up to a finite group of symmetries, and that every system inside this subclass is minimal. Finally, a new class of FNNs, namely, flexible bipolar radial basis neural networks (FBRBNNs) is introduced. It is proven that as in the case of classical radial basis neural networks (CRBNNs), feedforward SHL FBRBNNs are universal approximators.
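The record does not reproduce the flexible bipolar sigmoid function (FBSF) itself. As a rough illustration only, the sketch below implements one common parameterization from the FNN literature, f(x, a) = tanh(a x)/a with a trainable shape parameter a; the exact form and notation used in this paper may differ.

    import numpy as np

    def flexible_bipolar_sigmoid(x, a):
        # One common flexible bipolar sigmoid: tanh(a*x)/a, equivalently
        # (1 - exp(-2*a*x)) / (a * (1 + exp(-2*a*x))). The shape parameter `a`
        # is trained together with the weights, which is what makes the
        # activation (and hence the network) "flexible". Assumed form, not
        # necessarily the one used in the paper.
        return np.tanh(a * x) / a

    # Small a makes the unit behave almost linearly; larger a gives a
    # sharper, more saturating response bounded by +/- 1/a.
    x = np.linspace(-4.0, 4.0, 9)
    for a in (0.5, 1.0, 2.0):
        print(a, np.round(flexible_bipolar_sigmoid(x, a), 3))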

Original language: English
Title of host publication: Proceedings of the 44th IEEE Conference on Decision and Control, and the European Control Conference, CDC-ECC '05
Pages: 2554-2561
Number of pages: 8
Volume: 2005
ISBN (print): 0780395689, 9780780395688
DOI: 10.1109/CDC.2005.1582547
Publication status: Published - 2005
Event: 44th IEEE Conference on Decision and Control, and the European Control Conference, CDC-ECC '05, Seville, Spain
Duration: 2005 Dec 12 - 2005 Dec 15


Fingerprint

  • Neural networks
  • Feedforward neural networks
  • Observability
  • Controllability

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Bavafa-Toosi, Y., & Ohmori, H. (2005). On flexible neural networks: Some system-theoretic properties and a new class. In Proceedings of the 44th IEEE Conference on Decision and Control, and the European Control Conference, CDC-ECC '05 (Vol. 2005, pp. 2554-2561). [1582547] https://doi.org/10.1109/CDC.2005.1582547

