Damageless image hashing using neural network

Kensuke Naoe, Yoshiyasu Takefuji

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In this paper, we present a new key generation model for image hashing using a neural network, which does not embed any data into the content but is able to extract meaningful data from the target image. The model trains an artificial neural network to output a predefined code, and uses the trained network weights together with the coordinates of selected feature sub-blocks of the target image as keys to extract that code. The observed output signal from the trained neural network serves as the image hash value, which distinguishes the target image from other images. The proposed method provides secure image hashing for content identification without damaging or losing any detailed data of the visual image. It enables applications in image authentication, image similarity comparison, verification of image integrity, and copyright protection of multimedia content.
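The abstract's pipeline can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sub-block coordinates, block size, network architecture, code length, and training details are all assumptions chosen for brevity. It trains a single-layer sigmoid network so that features drawn from selected sub-blocks of an image map to a predefined bit code; the trained weights and the block coordinates then act as the keys, and a forward pass extracts the hash.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 32x32 grayscale "image" standing in for a real target image.
image = rng.random((32, 32))

# First key: coordinates of selected 4x4 feature sub-blocks (selection rule assumed).
block_coords = [(0, 0), (8, 8), (16, 4), (24, 24)]

def block_features(img, coords, size=4):
    """Flatten each selected sub-block into one concatenated feature vector."""
    return np.concatenate([img[r:r + size, c:c + size].ravel() for r, c in coords])

x = block_features(image, block_coords)    # input signal to the network
code = np.array([1, 0, 1, 1, 0, 0, 1, 0])  # predefined code = target hash value

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Train the network (second key: the learned weights) by gradient descent
# so that this image's features produce the predefined code.
W = rng.normal(scale=0.1, size=(code.size, x.size))
b = np.zeros(code.size)
for _ in range(2000):
    y = sigmoid(W @ x + b)
    grad = y - code                 # cross-entropy gradient for sigmoid outputs
    W -= 0.1 * np.outer(grad, x)
    b -= 0.1 * grad

# Hash extraction: forward pass with both keys, thresholded to bits.
hash_bits = (sigmoid(W @ x + b) > 0.5).astype(int)
print(hash_bits.tolist())  # recovers the predefined code for this image
```

Because nothing is embedded in the image itself, the pixel data stays untouched ("damageless"); only a party holding both keys, the coordinates and the trained weights, can reproduce the code.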

Original language: English
Title of host publication: Proceedings of the 2010 International Conference of Soft Computing and Pattern Recognition, SoCPaR 2010
Pages: 442-447
Number of pages: 6
DOIs
Publication status: Published - 2010 Dec 1
Event: 2010 International Conference of Soft Computing and Pattern Recognition, SoCPaR 2010 - Cergy-Pontoise, France
Duration: 2010 Dec 7 - 2010 Dec 10

Publication series

Name: Proceedings of the 2010 International Conference of Soft Computing and Pattern Recognition, SoCPaR 2010

Other

Other: 2010 International Conference of Soft Computing and Pattern Recognition, SoCPaR 2010
Country: France
City: Cergy-Pontoise
Period: 10/12/7 - 10/12/10

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Computer Vision and Pattern Recognition

