TY - JOUR

T1 - Positive-definite modification of a covariance matrix by minimizing the matrix ℓ∞ norm with applications to portfolio optimization

AU - Cho, Seonghun

AU - Katayama, Shota

AU - Lim, Johan

AU - Choi, Young Geun

N1 - Funding Information:
We are grateful to the associate editor and two reviewers for many valuable comments. S. Katayama's research is supported by JSPS KAKENHI Grant No. 18K18009, and J. Lim and Y-G. Choi's research is supported by the National Research Foundation of Korea (NRF-2017R1A2B2012264 and NRF-2020R1G1A1A01006229).
Publisher Copyright:
© 2021, Springer-Verlag GmbH Germany, part of Springer Nature.

PY - 2021

Y1 - 2021

N2 - The covariance matrix, which must be estimated from the data, plays an important role in many multivariate procedures, and its positive definiteness (PDness) is essential for the validity of those procedures. Recently, many regularized estimators have been proposed and shown to be consistent in estimating the true matrix and its support under various structural assumptions on the true covariance matrix. However, they are often not PD. In this paper, we propose a simple modification that makes a regularized covariance matrix PD while preserving its support and convergence rate. We focus on the matrix ℓ∞ norm error in covariance matrix estimation because it allows us to bound the error in the downstream multivariate procedures that rely on it. Our proposal extends the fixed support positive-definite (FSPD) modification of Choi et al. (2019) from the spectral and Frobenius norms to the matrix ℓ∞ norm. As in the original FSPD, we consider a convex combination of the initial estimator (the regularized covariance matrix without PDness) and a given form of diagonal matrix, minimize the ℓ∞ distance between the initial estimator and the convex combination, and derive a closed-form expression for the modification. We apply the procedure to the minimum variance portfolio (MVP) optimization problem and show that the vector ℓ∞ error in the estimation of the optimal portfolio weight is bounded by the matrix ℓ∞ error of the plug-in covariance matrix estimator. We illustrate the MVP results with S&P 500 daily returns data from January 1978 to December 2014.

AB - The covariance matrix, which must be estimated from the data, plays an important role in many multivariate procedures, and its positive definiteness (PDness) is essential for the validity of those procedures. Recently, many regularized estimators have been proposed and shown to be consistent in estimating the true matrix and its support under various structural assumptions on the true covariance matrix. However, they are often not PD. In this paper, we propose a simple modification that makes a regularized covariance matrix PD while preserving its support and convergence rate. We focus on the matrix ℓ∞ norm error in covariance matrix estimation because it allows us to bound the error in the downstream multivariate procedures that rely on it. Our proposal extends the fixed support positive-definite (FSPD) modification of Choi et al. (2019) from the spectral and Frobenius norms to the matrix ℓ∞ norm. As in the original FSPD, we consider a convex combination of the initial estimator (the regularized covariance matrix without PDness) and a given form of diagonal matrix, minimize the ℓ∞ distance between the initial estimator and the convex combination, and derive a closed-form expression for the modification. We apply the procedure to the minimum variance portfolio (MVP) optimization problem and show that the vector ℓ∞ error in the estimation of the optimal portfolio weight is bounded by the matrix ℓ∞ error of the plug-in covariance matrix estimator. We illustrate the MVP results with S&P 500 daily returns data from January 1978 to December 2014.

KW - High-dimensional covariance matrix

KW - Linear shrinkage

KW - Matrix ℓ∞ norm

KW - Minimum variance portfolio

KW - Positive definiteness

KW - Regularized covariance matrix estimator

UR - http://www.scopus.com/inward/record.url?scp=85102393009&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85102393009&partnerID=8YFLogxK

U2 - 10.1007/s10182-021-00396-7

DO - 10.1007/s10182-021-00396-7

M3 - Article

AN - SCOPUS:85102393009

VL - 105

SP - 601

EP - 627

JO - AStA Advances in Statistical Analysis

JF - AStA Advances in Statistical Analysis

SN - 1863-8171

IS - 4

ER -