Clickable augmented documents

Sandy Martedi, Hideaki Uchiyama, Hideo Saito

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

This paper presents an Augmented Reality (AR) system for physical text documents that enables users to click on a document. The system tracks the relative pose between a camera and a document in order to continuously overlay virtual content on the document. In addition, it computes the trajectory of a fingertip, detected by skin color, to recognize clicking interactions. By merging document tracking with this interaction technique, we have developed a novel tangible document system. As an application, we developed an AR dictionary that overlays the meaning and explanation of a word when the user clicks it on the document. In the experiments, we evaluate the accuracy of the clicking interaction and the robustness of our document tracking method against occlusion.
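The abstract does not give the exact detection rule, but the pipeline it describes (skin-color segmentation of the fingertip, then trajectory analysis to recognize a click) can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it uses a classic RGB skin-color heuristic and a hypothetical dwell-based click detector (a "click" fires when the fingertip stays nearly still for a number of frames). The function names, thresholds, and the dwell rule are all assumptions for illustration.

```python
import numpy as np

def skin_mask(rgb):
    """Classify pixels as skin using a simple RGB heuristic.
    rgb: HxWx3 uint8 image. Returns an HxW boolean mask."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mx = rgb.max(axis=-1).astype(int)
    mn = rgb.min(axis=-1).astype(int)
    # Reddish, sufficiently saturated pixels are treated as skin.
    return ((r > 95) & (g > 40) & (b > 20)
            & (mx - mn > 15) & (np.abs(r - g) > 15)
            & (r > g) & (r > b))

def detect_click(trajectory, dwell_frames=10, radius=5.0):
    """Hypothetical dwell-based click: report a click when the fingertip
    stays within `radius` pixels of its mean position for `dwell_frames`
    consecutive frames. trajectory: sequence of (x, y) fingertip positions.
    Returns the click position, or None if no dwell is found."""
    pts = np.asarray(trajectory, dtype=float)
    for i in range(len(pts) - dwell_frames + 1):
        window = pts[i:i + dwell_frames]
        center = window.mean(axis=0)
        if np.all(np.linalg.norm(window - center, axis=1) <= radius):
            return tuple(center)
    return None
```

In a full system, the skin mask would feed a fingertip localizer (e.g., the topmost skin pixel or a contour extremum), and the resulting per-frame positions would form the trajectory passed to the click detector.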

Original language: English
Title of host publication: 2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010
Pages: 162-166
Number of pages: 5
DOI: 10.1109/MMSP.2010.5662012
Publication status: Published - 2010
Event: 2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010 - Saint Malo, France
Duration: 4 Oct 2010 - 6 Oct 2010



ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Human-Computer Interaction
  • Signal Processing

Cite this

Martedi, S., Uchiyama, H., & Saito, H. (2010). Clickable augmented documents. In 2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010 (pp. 162-166). [5662012] https://doi.org/10.1109/MMSP.2010.5662012

@inproceedings{66f834d0f0e54fc9b378def3dbac1416,
title = "Clickable augmented documents",
author = "Sandy Martedi and Hideaki Uchiyama and Hideo Saito",
year = "2010",
doi = "10.1109/MMSP.2010.5662012",
language = "English",
isbn = "9781424481125",
pages = "162--166",
booktitle = "2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010",

}
