Efficient sublinear-regret algorithms for online sparse linear regression with limited observation

Shinji Ito, Daisuke Hatano, Hanna Sumita, Akihiro Yabe, Takuro Fukunaga, Naonori Kakimura, Ken Ichi Kawarabayashi

Research output: Conference article

1 Citation (Scopus)

Abstract

Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, subject to the resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that there is no polynomial-time sublinear-regret algorithm unless NP ⊆ BPP, and only an exponential-time sublinear-regret algorithm has been found. In this paper, we introduce mild assumptions to solve the problem. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.
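To make the problem setting concrete, the sketch below simulates the online protocol with limited observation: at each round the learner may reveal at most k of the d features, predicts from the revealed entries only, and then incurs squared loss. This is a minimal illustrative baseline, not the paper's algorithm; the random-subset observation rule, the per-coordinate decaying-step update, and all names are assumptions made for the example.

```python
import numpy as np


def online_sparse_regression_demo(T=1000, d=20, k=3, seed=0):
    """Toy illustration of online sparse linear regression with limited
    observation (k out of d features revealed per round).

    NOTE: illustrative baseline only, not the algorithm from the paper.
    """
    rng = np.random.default_rng(seed)
    w_true = np.zeros(d)
    w_true[:k] = rng.normal(size=k)            # ground truth is k-sparse
    w_hat = np.zeros(d)                        # learner's estimate
    counts = np.ones(d)                        # per-coordinate update counts
    cumulative_loss = 0.0

    for _ in range(T):
        x = rng.normal(size=d)
        y = w_true @ x + 0.1 * rng.normal()    # noisy label

        observed = rng.choice(d, size=k, replace=False)  # limited observation
        x_obs = np.zeros(d)
        x_obs[observed] = x[observed]          # only these entries are revealed

        y_hat = w_hat @ x_obs                  # predict from observed features
        cumulative_loss += (y_hat - y) ** 2

        # simple decaying-step-size gradient update on observed coordinates
        for i in observed:
            counts[i] += 1
            w_hat[i] -= (1.0 / counts[i]) * 2.0 * (y_hat - y) * x[i]

    return cumulative_loss / T


if __name__ == "__main__":
    print("average squared loss:", online_sparse_regression_demo())
```

The point of the sketch is the interaction loop itself: the loss is always measured against the full example, while the learner's prediction and updates can use only the k observed coordinates, which is exactly what makes the problem hard without further assumptions.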

Original language: English
Pages (from-to): 4100-4109
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
Publication status: Published - 1 Jan 2017
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: 4 Dec 2017 - 9 Dec 2017

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing


Cite this

    Ito, S., Hatano, D., Sumita, H., Yabe, A., Fukunaga, T., Kakimura, N., & Kawarabayashi, K. I. (2017). Efficient sublinear-regret algorithms for online sparse linear regression with limited observation. Advances in Neural Information Processing Systems, 2017-December, 4100-4109.