Sparse and robust linear regression: An optimization algorithm and its statistical properties

Shota Katayama, Hironori Fujisawa

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

This paper studies sparse linear regression analysis with outliers in the responses. A parameter vector for modeling the outliers is added to the standard linear regression model, and the sparse estimation problem for both the coefficients and the outliers is then considered. The ℓ1 penalty is imposed on the coefficients, while various penalties, including redescending-type penalties, are imposed on the outliers. To solve the sparse estimation problem, we introduce an optimization algorithm. Under some conditions, we show algorithmic and statistical convergence properties for the coefficients obtained by the algorithm. Moreover, it is shown that the algorithm can recover the true support of the coefficients with probability going to one.
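
For concreteness, the following is a minimal sketch of the kind of penalized objective the abstract describes, assuming the common mean-shift formulation in which each response receives its own outlier parameter; the notation (λ_β, λ_γ, and the penalty p) is illustrative and need not match the paper's.

  % Illustrative notation only; the paper's exact objective may differ.
  \[
    (\hat{\beta}, \hat{\gamma}) \in \arg\min_{\beta,\,\gamma}
    \ \frac{1}{2n} \| y - X\beta - \gamma \|_2^2
    \;+\; \lambda_\beta \| \beta \|_1
    \;+\; \sum_{i=1}^{n} p_{\lambda_\gamma}\!\left( |\gamma_i| \right)
  \]

Here y = Xβ + γ + ε, a nonzero γ_i flags the i-th response as an outlier, the ℓ1 term induces sparsity in the coefficient vector β, and p_{λ_γ} stands for one of the outlier penalties (possibly redescending, hence nonconvex) referred to in the abstract.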

Original language: English
Pages (from-to): 1243-1264
Number of pages: 22
Journal: Statistica Sinica
Volume: 27
Issue number: 3
DOI: 10.5705/ss.202015.0179
Publication status: Published - 2017 Jul 1
Externally published: Yes

Keywords

  • Algorithmic and statistical convergence
  • Robust estimation
  • Sparse linear regression
  • Support recovery

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

Sparse and robust linear regression: An optimization algorithm and its statistical properties. / Katayama, Shota; Fujisawa, Hironori.

In: Statistica Sinica, Vol. 27, No. 3, 01.07.2017, p. 1243-1264.

Research output: Contribution to journal › Article

@article{af86f453b62a42fc82966300d1ef2a5f,
title = "Sparse and robust linear regression: An optimization algorithm and its statistical properties",
abstract = "This paper studies sparse linear regression analysis with outliers in the responses. A parameter vector for modeling the outliers is added to the standard linear regression model, and the sparse estimation problem for both the coefficients and the outliers is then considered. The $\ell_1$ penalty is imposed on the coefficients, while various penalties, including redescending-type penalties, are imposed on the outliers. To solve the sparse estimation problem, we introduce an optimization algorithm. Under some conditions, we show algorithmic and statistical convergence properties for the coefficients obtained by the algorithm. Moreover, it is shown that the algorithm can recover the true support of the coefficients with probability going to one.",
keywords = "Algorithmic and statistical convergence, Robust estimation, Sparse linear regression, Support recovery",
author = "Shota Katayama and Hironori Fujisawa",
year = "2017",
month = "7",
day = "1",
doi = "10.5705/ss.202015.0179",
language = "English",
volume = "27",
pages = "1243--1264",
journal = "Statistica Sinica",
issn = "1017-0405",
publisher = "Institute of Statistical Science",
number = "3",
}

TY - JOUR

T1 - Sparse and robust linear regression

T2 - An optimization algorithm and its statistical properties

AU - Katayama, Shota

AU - Fujisawa, Hironori

PY - 2017/7/1

Y1 - 2017/7/1

N2 - This paper studies sparse linear regression analysis with outliers in the responses. A parameter vector for modeling the outliers is added to the standard linear regression model, and the sparse estimation problem for both the coefficients and the outliers is then considered. The ℓ1 penalty is imposed on the coefficients, while various penalties, including redescending-type penalties, are imposed on the outliers. To solve the sparse estimation problem, we introduce an optimization algorithm. Under some conditions, we show algorithmic and statistical convergence properties for the coefficients obtained by the algorithm. Moreover, it is shown that the algorithm can recover the true support of the coefficients with probability going to one.

AB - This paper studies sparse linear regression analysis with outliers in the responses. A parameter vector for modeling the outliers is added to the standard linear regression model, and the sparse estimation problem for both the coefficients and the outliers is then considered. The ℓ1 penalty is imposed on the coefficients, while various penalties, including redescending-type penalties, are imposed on the outliers. To solve the sparse estimation problem, we introduce an optimization algorithm. Under some conditions, we show algorithmic and statistical convergence properties for the coefficients obtained by the algorithm. Moreover, it is shown that the algorithm can recover the true support of the coefficients with probability going to one.

KW - Algorithmic and statistical convergence

KW - Robust estimation

KW - Sparse linear regression

KW - Support recovery

UR - http://www.scopus.com/inward/record.url?scp=85020540416&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85020540416&partnerID=8YFLogxK

U2 - 10.5705/ss.202015.0179

DO - 10.5705/ss.202015.0179

M3 - Article

AN - SCOPUS:85020540416

VL - 27

SP - 1243

EP - 1264

JO - Statistica Sinica

JF - Statistica Sinica

SN - 1017-0405

IS - 3

ER -