I see you

How to improve wearable activity recognition by leveraging information from environmental cameras

Gernot Bahle, Paul Lukowicz, Kai Steven Kunze, Koichi Kise

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Citations (Scopus)

Abstract

In this paper we investigate how vision-based devices (cameras or the Kinect controller) that happen to be in the users' environment can be used to improve and fine-tune on-body sensor systems for activity recognition. Thus we imagine a user with his on-body activity recognition system passing through a space with a video camera (or a Kinect), picking up some information, and using it to improve his system. The general idea is to correlate an anonymous 'stick figure'-like description of the motion of a user's body parts, provided by the vision system, with the sensor signals as a means of analyzing the sensors' properties. In the paper we demonstrate, for example, how such a correlation can be used to determine, without the need to train any classifiers, on which body part a motion sensor is worn.
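The correlation idea in the abstract can be sketched in a few lines: compare the sensor signal against the vision system's per-body-part motion traces and assign the sensor to the body part with the strongest correlation. This is a minimal illustration with synthetic signals, not the authors' implementation; the body-part names, signal shapes, and noise levels are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)

# Hypothetical per-body-part motion magnitudes extracted by the vision
# system's anonymous "stick figure" (e.g. joint speeds).
stick_figure = {
    "wrist": np.sin(2.0 * t) + 0.1 * rng.standard_normal(t.size),
    "hip":   np.sin(0.5 * t) + 0.1 * rng.standard_normal(t.size),
    "ankle": np.sin(1.2 * t) + 0.1 * rng.standard_normal(t.size),
}

# Simulated on-body motion sensor actually worn on the wrist: same
# underlying motion, but with its own gain, offset, and noise.
sensor = 3.0 * stick_figure["wrist"] + 0.5 + 0.2 * rng.standard_normal(t.size)

def best_matching_part(sensor_signal, parts):
    """Assign the sensor to the body part whose vision-derived motion
    correlates most strongly with the sensor signal -- no trained
    classifier involved, just Pearson correlation."""
    scores = {name: abs(np.corrcoef(sensor_signal, sig)[0, 1])
              for name, sig in parts.items()}
    return max(scores, key=scores.get), scores

part, scores = best_matching_part(sensor, stick_figure)
print(part)  # -> "wrist"
```

In practice the two signal streams would first need to be time-aligned and resampled to a common rate; the sketch assumes that has already happened.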

Original language: English
Title of host publication: 2013 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2013
Pages: 409-412
Number of pages: 4
DOIs: 10.1109/PerComW.2013.6529528
Publication status: Published - 2013
Externally published: Yes
Event: 2013 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2013 - San Diego, CA, United States
Duration: 2013 Mar 18 - 2013 Mar 22

Other

Other: 2013 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2013
Country: United States
City: San Diego, CA
Period: 13/3/18 - 13/3/22

Fingerprint

  • Cameras
  • Sensors
  • Video cameras
  • Classifiers
  • Controllers

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Software

Cite this

Bahle, G., Lukowicz, P., Kunze, K. S., & Kise, K. (2013). I see you: How to improve wearable activity recognition by leveraging information from environmental cameras. In 2013 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2013 (pp. 409-412). [6529528] https://doi.org/10.1109/PerComW.2013.6529528

@inproceedings{6eb294ffb267435c8ebbce1970dad75c,
title = "I see you: How to improve wearable activity recognition by leveraging information from environmental cameras",
abstract = "In this paper we investigate how vision based devices (cameras or the Kinect controller) that happen to be in the users' environment can be used to improve and fine tune on body sensor systems for activity recognition. Thus we imagine a user with his on body activity recognition system passing through a space with a video camera (or a Kinect), picking up some information, and using it to improve his system. The general idea is to correlate an anonymous 'stick figure' like description of the motion of a user's body parts provided by the vision system with the sensor signals as a means of analyzing the sensors' properties. In the paper we for example demonstrate how such a correlation can be used to determine, without the need to train any classifiers, on which body part a motion sensor is worn.",
author = "Gernot Bahle and Paul Lukowicz and Kunze, {Kai Steven} and Koichi Kise",
year = "2013",
doi = "10.1109/PerComW.2013.6529528",
language = "English",
isbn = "9781467350778",
pages = "409--412",
booktitle = "2013 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2013",
}

TY - GEN

T1 - I see you

T2 - How to improve wearable activity recognition by leveraging information from environmental cameras

AU - Bahle, Gernot

AU - Lukowicz, Paul

AU - Kunze, Kai Steven

AU - Kise, Koichi

PY - 2013

Y1 - 2013

N2 - In this paper we investigate how vision based devices (cameras or the Kinect controller) that happen to be in the users' environment can be used to improve and fine tune on body sensor systems for activity recognition. Thus we imagine a user with his on body activity recognition system passing through a space with a video camera (or a Kinect), picking up some information, and using it to improve his system. The general idea is to correlate an anonymous 'stick figure' like description of the motion of a user's body parts provided by the vision system with the sensor signals as a means of analyzing the sensors' properties. In the paper we for example demonstrate how such a correlation can be used to determine, without the need to train any classifiers, on which body part a motion sensor is worn.

AB - In this paper we investigate how vision based devices (cameras or the Kinect controller) that happen to be in the users' environment can be used to improve and fine tune on body sensor systems for activity recognition. Thus we imagine a user with his on body activity recognition system passing through a space with a video camera (or a Kinect), picking up some information, and using it to improve his system. The general idea is to correlate an anonymous 'stick figure' like description of the motion of a user's body parts provided by the vision system with the sensor signals as a means of analyzing the sensors' properties. In the paper we for example demonstrate how such a correlation can be used to determine, without the need to train any classifiers, on which body part a motion sensor is worn.

UR - http://www.scopus.com/inward/record.url?scp=84881499990&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84881499990&partnerID=8YFLogxK

U2 - 10.1109/PerComW.2013.6529528

DO - 10.1109/PerComW.2013.6529528

M3 - Conference contribution

SN - 9781467350778

SP - 409

EP - 412

BT - 2013 IEEE International Conference on Pervasive Computing and Communications Workshops, PerCom Workshops 2013

ER -