Efficient sublinear-regret algorithms for online sparse linear regression with limited observation

Shinji Ito, Daisuke Hatano, Hanna Sumita, Akihiro Yabe, Takuro Fukunaga, Naonori Kakimura, Ken-ichi Kawarabayashi

Research output: Contribution to journal › Conference article

1 Citation (Scopus)

Abstract

Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, subject to a resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that there is no polynomial-time sublinear-regret algorithm unless NP ⊆ BPP, and only an exponential-time sublinear-regret algorithm has been found. In this paper, we introduce mild assumptions to solve the problem. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.
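To make the problem setting concrete, the following is a minimal sketch of the online protocol described in the abstract: at each round an example arrives, the learner may observe only k of its d features, predicts a label from the observed features, and then suffers the squared loss. The feature-selection rule below (epsilon-greedy over the current estimate's largest coordinates) and all parameter values are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, T = 10, 3, 200           # features, observation budget, rounds
w_true = np.zeros(d)
w_true[:k] = [1.0, -2.0, 0.5]  # ground-truth weights are k-sparse

w = np.zeros(d)                # learner's current estimate
eta = 0.1                      # step size (illustrative choice)
total_loss = 0.0

for t in range(T):
    x = rng.standard_normal(d)
    y = w_true @ x + 0.01 * rng.standard_normal()

    # Choose which k coordinates to observe: with probability 0.2
    # explore uniformly at random, otherwise pick the k coordinates
    # with the largest current weight magnitudes (a naive heuristic).
    if rng.random() < 0.2:
        S = rng.choice(d, size=k, replace=False)
    else:
        S = np.argsort(-np.abs(w))[:k]

    x_obs = np.zeros(d)
    x_obs[S] = x[S]            # only the chosen coordinates are revealed

    y_hat = w @ x_obs          # predict from observed features only
    total_loss += (y - y_hat) ** 2

    # Gradient step on the squared loss, restricted to observed coords.
    w[S] -= eta * 2.0 * (y_hat - y) * x_obs[S]
```

Note that the learner's update touches only the observed coordinates, which is exactly the information constraint that makes sublinear regret hard in general and motivates the assumptions introduced in the paper.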

Original language: English
Pages (from-to): 4100-4109
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
Publication status: Published - 2017 Jan 1
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: 2017 Dec 4 - 2017 Dec 9

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing


Cite this

Ito, S., Hatano, D., Sumita, H., Yabe, A., Fukunaga, T., Kakimura, N., & Kawarabayashi, K.-I. (2017). Efficient sublinear-regret algorithms for online sparse linear regression with limited observation. Advances in Neural Information Processing Systems, 2017-December, 4100-4109.