Help me! Valuing and visualizing participatory sensing tasks with physical sensors

Mina Sakamura, Takuro Yonezawa, Jin Nakazawa, Kazunori Takashio, Hideyuki Tokuda

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Recent progress in mobile devices such as smartphones enables humans to contribute their perceptual abilities to a sensing framework. This framework, so-called participatory sensing, distributes various sensing tasks (e.g., weather reports, waiting time in a queue, traffic conditions) to potential participants, who can then select and carry out the tasks. However, as participatory sensing grows and the number of sensing tasks increases, it will become very hard for users to choose appropriate sensing tasks around them. To solve this problem, we propose a system called Help Me!, which values and visualizes the importance of sensing tasks by quantifying them in cooperation with physical sensors. Because the Help Me! system provides an objective index for sensing tasks, it increases users' opportunities to participate in them. We designed and implemented the Help Me! system as an integrated architecture of physical sensors and participatory sensors. Through an initial in-lab experiment, we confirmed that the Help Me! system can increase users' opportunities to participate in sensing tasks.
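The record gives no implementation details of how Help Me! computes its objective index. Purely as an illustration of the idea the abstract describes (valuing a sensing task by combining physical-sensor evidence with the state of the task), here is a minimal Python sketch; the class names, fields, weights, and scoring rule are all assumptions for illustration, not the paper's actual method:

```python
# Hypothetical sketch of valuing participatory-sensing tasks with
# physical-sensor readings, in the spirit of the Help Me! abstract.
# All names and the scoring rule below are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class SensingTask:
    name: str
    sensor_reading: float   # current value from a co-located physical sensor
    threshold: float        # value above which human confirmation is wanted
    staleness_s: float      # seconds since the last participant report


def task_value(task: SensingTask) -> float:
    """Combine sensor urgency and report staleness into one objective index."""
    # Urgency: how far the physical sensor exceeds its threshold (0 if below).
    urgency = max(0.0, task.sensor_reading - task.threshold) / max(task.threshold, 1e-9)
    # Staleness: fraction of an hour since the last human report, capped at 1.
    staleness = min(task.staleness_s / 3600.0, 1.0)
    return round(0.7 * urgency + 0.3 * staleness, 3)


tasks = [
    SensingTask("queue length at cafeteria", sensor_reading=42.0,
                threshold=30.0, staleness_s=1800),
    SensingTask("noise level in lab", sensor_reading=55.0,
                threshold=60.0, staleness_s=300),
]

# Tasks sorted by value could then be visualized to users, highest first.
for t in sorted(tasks, key=task_value, reverse=True):
    print(f"{t.name}: value={task_value(t)}")
```

A real deployment would replace the hand-set thresholds and weights with values tuned per task type, and push the ranked list to participants' phones (the paper's keywords suggest XMPP as the transport).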

Original language: English
Title of host publication: ACM International Conference Proceeding Series
Publisher: Association for Computing Machinery
ISBN (Print): 9781450327473
DOIs: 10.1145/2637064.2637095
Publication status: Published - 2014
Event: 2014 International Workshop on Web Intelligence and Smart Sensing, IWWISS 2014 - Saint Etienne, France
Duration: 2014 Sep 1 - 2014 Sep 2



Keywords

  • integrated sensing architecture
  • mobile sensing
  • participatory sensing
  • sensor networks
  • valuing information
  • visualization of information
  • XMPP

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Networks and Communications
  • Computer Vision and Pattern Recognition
  • Software

Cite this

Sakamura, M., Yonezawa, T., Nakazawa, J., Takashio, K., & Tokuda, H. (2014). Help me! Valuing and visualizing participatory sensing tasks with physical sensors. In ACM International Conference Proceeding Series. Association for Computing Machinery. https://doi.org/10.1145/2637064.2637095
