Efficient sublinear-regret algorithms for online sparse linear regression with limited observation

Shinji Ito, Daisuke Hatano, Hanna Sumita, Akihiro Yabe, Takuro Fukunaga, Naonori Kakimura, Ken Ichi Kawarabayashi

Research output: Contribution to journal › Conference article

1 Citation (Scopus)

Abstract

Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, subject to a resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that there is no polynomial-time sublinear-regret algorithm unless NP ⊆ BPP, and only an exponential-time sublinear-regret algorithm has been found. In this paper, we introduce mild assumptions to solve the problem. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.
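The setting described in the abstract can be made concrete with a small simulation: at each round the learner may observe only k of the d features of the incoming example, makes a prediction, and suffers squared loss against the label of a sparse ground-truth model. The explore-then-exploit heuristic below is a hypothetical sketch for intuition only, not the authors' algorithm; all names and constants (w_hat, scores, the 0.05 step size, the exploration fraction) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, T = 10, 3, 500            # features, observation budget, rounds
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.5, 1.0]   # the true model is k-sparse

scores = np.zeros(d)   # running estimate of each feature's relevance
counts = np.ones(d)    # how often each feature has been observed
w_hat = np.zeros(d)    # learner's weight vector
losses = []

for t in range(T):
    x = rng.standard_normal(d)
    y = w_true @ x + 0.1 * rng.standard_normal()

    if t < T // 5:
        # Exploration: observe a uniformly random subset of k features.
        S = rng.choice(d, size=k, replace=False)
    else:
        # Exploitation: observe the k features with the highest average scores.
        S = np.argsort(-scores / counts)[:k]

    x_obs = np.zeros(d)
    x_obs[S] = x[S]     # only the k chosen entries of x are revealed

    y_pred = w_hat @ x_obs
    losses.append((y_pred - y) ** 2)

    # Update statistics using only the observed coordinates.
    scores[S] += np.abs(x[S] * y)
    counts[S] += 1
    w_hat[S] += 0.05 * (y - y_pred) * x[S]  # SGD step on observed features

# Squared loss should shrink once informative features are identified.
print(np.mean(losses[:100]), np.mean(losses[-100:]))
```

The hardness result quoted in the abstract says that no such polynomial-time scheme can guarantee sublinear regret in general (unless NP ⊆ BPP); the paper's contribution is identifying mild assumptions under which polynomial-time sublinear-regret algorithms do exist.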

Original language: English
Pages (from-to): 4100-4109
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
Publication status: Published - 2017 Jan 1
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: 2017 Dec 4 - 2017 Dec 9


ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Ito, S., Hatano, D., Sumita, H., Yabe, A., Fukunaga, T., Kakimura, N., & Kawarabayashi, K. I. (2017). Efficient sublinear-regret algorithms for online sparse linear regression with limited observation. Advances in Neural Information Processing Systems, 2017-December, 4100-4109.

Efficient sublinear-regret algorithms for online sparse linear regression with limited observation. / Ito, Shinji; Hatano, Daisuke; Sumita, Hanna; Yabe, Akihiro; Fukunaga, Takuro; Kakimura, Naonori; Kawarabayashi, Ken Ichi.

In: Advances in Neural Information Processing Systems, Vol. 2017-December, 01.01.2017, p. 4100-4109.

Research output: Contribution to journal › Conference article

Ito, S, Hatano, D, Sumita, H, Yabe, A, Fukunaga, T, Kakimura, N & Kawarabayashi, KI 2017, 'Efficient sublinear-regret algorithms for online sparse linear regression with limited observation', Advances in Neural Information Processing Systems, vol. 2017-December, pp. 4100-4109.
Ito, Shinji ; Hatano, Daisuke ; Sumita, Hanna ; Yabe, Akihiro ; Fukunaga, Takuro ; Kakimura, Naonori ; Kawarabayashi, Ken Ichi. / Efficient sublinear-regret algorithms for online sparse linear regression with limited observation. In: Advances in Neural Information Processing Systems. 2017 ; Vol. 2017-December. pp. 4100-4109.
@article{6e9245abfd28458dbbef7d5bd9b002da,
title = "Efficient sublinear-regret algorithms for online sparse linear regression with limited observation",
abstract = "Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, subject to a resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that there is no polynomial-time sublinear-regret algorithm unless NP ⊆ BPP, and only an exponential-time sublinear-regret algorithm has been found. In this paper, we introduce mild assumptions to solve the problem. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.",
author = "Shinji Ito and Daisuke Hatano and Hanna Sumita and Akihiro Yabe and Takuro Fukunaga and Naonori Kakimura and Kawarabayashi, {Ken Ichi}",
year = "2017",
month = "1",
day = "1",
language = "English",
volume = "2017-December",
pages = "4100--4109",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",

}

TY - JOUR

T1 - Efficient sublinear-regret algorithms for online sparse linear regression with limited observation

AU - Ito, Shinji

AU - Hatano, Daisuke

AU - Sumita, Hanna

AU - Yabe, Akihiro

AU - Fukunaga, Takuro

AU - Kakimura, Naonori

AU - Kawarabayashi, Ken Ichi

PY - 2017/1/1

Y1 - 2017/1/1

N2 - Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, subject to a resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that there is no polynomial-time sublinear-regret algorithm unless NP ⊆ BPP, and only an exponential-time sublinear-regret algorithm has been found. In this paper, we introduce mild assumptions to solve the problem. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.

AB - Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, subject to a resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that there is no polynomial-time sublinear-regret algorithm unless NP ⊆ BPP, and only an exponential-time sublinear-regret algorithm has been found. In this paper, we introduce mild assumptions to solve the problem. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.

UR - http://www.scopus.com/inward/record.url?scp=85047014500&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85047014500&partnerID=8YFLogxK

M3 - Conference article

VL - 2017-December

SP - 4100

EP - 4109

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -