A new method for inverting feedforward neural networks

Yoshio Araki, Toshifumi Ohki, Daniel Citterio, Masafumi Hagiwara, Koji Suzuki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this paper, we propose a new method for inverting feedforward neural networks. Inverting a neural network means finding the inputs that produce a given output. In general, this is an ill-posed problem whose solution is not unique. Inversion using iterative optimization methods (e.g., gradient descent or quasi-Newton methods) is well suited to this problem and is called "iterative inversion". We propose a new iterative inversion method using a Bottleneck Neural Network with Hidden layer's input units (BNNH), which we design on the basis of the Bottleneck Neural Network (BNN). By compressing the input space with the BNNH, we reduce the dimension of the search space, i.e., the input space explored by iterative inversion. This reduction is expected to improve both computation time and accuracy. In experiments, the proposed method is applied to several examples, and the results demonstrate its effectiveness.
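The "iterative inversion" described in the abstract can be illustrated with a minimal, self-contained sketch: gradient descent is run on the network input (not on the weights) until the network output matches a given target. The network below, its dimensions, and the learning rate are illustrative stand-ins, not the paper's actual setup; the BNNH-specific step of searching a compressed code space is only noted in the closing comment.

```python
# Minimal sketch of iterative inversion by gradient descent on the input of a
# (stand-in) trained one-hidden-layer feedforward network: tanh hidden layer,
# linear output layer. Weights are random placeholders for a trained model.

import numpy as np

rng = np.random.default_rng(0)

# Stand-in "trained" forward network: x (4-dim) -> y (2-dim)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(2, 8)), rng.normal(size=2)

def forward(x):
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

def input_gradient(x, y_target):
    """Gradient of 0.5 * ||forward(x) - y_target||^2 with respect to x."""
    h = np.tanh(W1 @ x + b1)
    y = W2 @ h + b2
    err = y - y_target                     # dL/dy
    dh = (W2.T @ err) * (1.0 - h ** 2)     # backprop through the tanh layer
    return W1.T @ dh                       # dL/dx

def iterative_inversion(y_target, x0, lr=0.05, steps=2000):
    """Plain iterative inversion: search directly in the input space."""
    x = x0.copy()
    for _ in range(steps):
        x -= lr * input_gradient(x, y_target)
    return x

# Pick a reachable target so the inversion problem has at least one solution.
x_true = rng.normal(size=4)
y_target = forward(x_true)

x_found = iterative_inversion(y_target, x0=np.zeros(4))
print("target y  :", y_target)
print("f(x_found):", forward(x_found))

# The paper's BNNH-based variant would run the same kind of search over a
# lower-dimensional code z and map it back to the input space through the
# bottleneck network, shrinking the dimension of the space being searched.
```

This illustrates only the generic iterative-inversion loop that the paper builds on; the BNNH architecture itself is not reproduced here, since its details are not given in this record.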

Original language: English
Title of host publication: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Pages: 1612-1617
Number of pages: 6
Volume: 2
Publication status: Published - 2003
Event: System Security and Assurance - Washington, DC, United States
Duration: 2003 Oct 5 – 2003 Oct 8

Other

Other: System Security and Assurance
Country: United States
City: Washington, DC
Period: 03/10/5 – 03/10/8


Keywords

  • Bottleneck neural networks
  • Ill-posed problem
  • Inverse problem
  • Iterative optimization method

ASJC Scopus subject areas

  • Hardware and Architecture
  • Control and Systems Engineering

Cite this

Araki, Y., Ohki, T., Citterio, D., Hagiwara, M., & Suzuki, K. (2003). A new method for inverting feedforward neural networks. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (Vol. 2, pp. 1612-1617).
