Efficient sublinear-regret algorithms for online sparse linear regression with limited observation

Shinji Ito, Daisuke Hatano, Hanna Sumita, Akihiro Yabe, Takuro Fukunaga, Naonori Kakimura, Ken-ichi Kawarabayashi

Research output: Conference article, peer-reviewed

1 citation (Scopus)

Abstract

Online sparse linear regression is the task of applying linear regression analysis to examples arriving sequentially, subject to the resource constraint that only a limited number of features of each example can be observed. Despite its importance in many practical applications, it has recently been shown that no polynomial-time sublinear-regret algorithm exists unless NP ⊆ BPP, and only an exponential-time sublinear-regret algorithm has been found. In this paper, we introduce mild assumptions under which the problem becomes tractable. Under these assumptions, we present polynomial-time sublinear-regret algorithms for online sparse linear regression. In addition, thorough experiments with publicly available data demonstrate that our algorithms outperform other known algorithms.
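To make the problem setting concrete, the following is a minimal illustrative sketch of the online sparse linear regression protocol with limited observation, written in Python with synthetic data and a naive greedy baseline learner. This is not the paper's algorithm; all names, the observation heuristic, and the comparator used to estimate regret are assumptions made for illustration only.

```python
# Illustrative sketch (NOT the paper's algorithm) of online sparse linear
# regression with limited observation: each round, the learner may observe
# only k of the d feature values before predicting.
import numpy as np

rng = np.random.default_rng(0)
d, k, T = 20, 3, 1000              # total features, observation budget, rounds
w_true = np.zeros(d)
w_true[:k] = rng.normal(size=k)    # hypothetical ground-truth k-sparse weights

w = np.zeros(d)                    # learner's weight estimate
eta = 0.1                          # step size for the toy update
cum_loss, cum_best = 0.0, 0.0

for t in range(T):
    x = rng.normal(size=d)                  # example arrives (features hidden)
    y = w_true @ x + 0.1 * rng.normal()     # noisy target

    # Learner observes only k feature values per round; here it naively picks
    # the coordinates with the largest current weights.
    obs = np.argsort(-np.abs(w))[:k]
    x_obs = np.zeros(d)
    x_obs[obs] = x[obs]

    y_hat = w @ x_obs                       # predict from observed features
    cum_loss += (y_hat - y) ** 2
    cum_best += (w_true @ x - y) ** 2       # comparator: best sparse predictor

    # Toy gradient step restricted to the observed coordinates.
    grad = 2 * (y_hat - y) * x_obs
    w[obs] -= eta * grad[obs]

print("regret estimate:", cum_loss - cum_best)
```

The hardness result cited in the abstract concerns exactly this setting: achieving regret sublinear in T with only k observed features per round, in polynomial time per round.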

Original language: English
Pages (from-to): 4100-4109
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
Publication status: Published - 2017
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: December 4, 2017 - December 9, 2017

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing
