Abstract
In this paper, we propose a Laplacian minimax probability machine, a semi-supervised version of the minimax probability machine built on the manifold regularization framework. We also show that the proposed method can be kernelized for non-linear cases on the basis of a theorem analogous to the representer theorem. Experiments on publicly available datasets from the UCI machine learning repository confirm that the proposed method achieves competitive results compared with existing graph-based learning methods such as the Laplacian support vector machine and Laplacian regularized least squares.
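The abstract refers to the manifold regularization framework shared with the Laplacian SVM and Laplacian RLS baselines. The sketch below illustrates only that common ingredient: a graph Laplacian built over labeled and unlabeled points, used as a smoothness penalty on a linear decision function (here in a LapRLS-style closed form). It is not the paper's Laplacian MPM formulation or its kernelized variant, and the names `knn_graph_laplacian`, `lap_rls_fit`, `k`, `sigma`, `gamma_A`, and `gamma_I` are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import cdist

def knn_graph_laplacian(X, k=5, sigma=1.0):
    """Unnormalized graph Laplacian L = D - W from a symmetrized k-NN graph."""
    d2 = cdist(X, X, "sqeuclidean")
    W = np.zeros_like(d2)
    for i in range(X.shape[0]):
        nbrs = np.argsort(d2[i])[1:k + 1]            # skip the point itself
        W[i, nbrs] = np.exp(-d2[i, nbrs] / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                           # symmetrize the k-NN graph
    return np.diag(W.sum(axis=1)) - W

def lap_rls_fit(X_lab, y_lab, X_unlab, gamma_A=1e-2, gamma_I=1e-1, k=5):
    """Manifold-regularized least squares on a linear model f(x) = w^T x + b.

    Closed-form solution of
        min_w ||Phi_l w - y||^2 + gamma_A ||w||^2 + gamma_I (Phi w)^T L (Phi w),
    where L is the graph Laplacian over labeled + unlabeled points.
    """
    X_all = np.vstack([X_lab, X_unlab])
    Phi = np.hstack([X_all, np.ones((X_all.shape[0], 1))])   # append bias column
    Phi_l = Phi[: X_lab.shape[0]]
    L = knn_graph_laplacian(X_all, k=k)
    A = (Phi_l.T @ Phi_l
         + gamma_A * np.eye(Phi.shape[1])
         + gamma_I * Phi.T @ L @ Phi)
    return np.linalg.solve(A, Phi_l.T @ y_lab)

# Toy usage with synthetic (hypothetical) data.
rng = np.random.default_rng(0)
X_lab = rng.normal(size=(20, 2)); y_lab = np.sign(X_lab[:, 0])
X_unlab = rng.normal(size=(200, 2))
w = lap_rls_fit(X_lab, y_lab, X_unlab)
pred = np.sign(np.hstack([X_unlab, np.ones((200, 1))]) @ w)
```

The design choice shared by these graph-based methods is the penalty f^T L f, which is small when the decision function changes slowly between neighboring points, so unlabeled data shape the classifier even though they carry no labels.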
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | 192-200 |
| Number of pages | 9 |
| Journal | Pattern Recognition Letters |
| Volume | 37 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 2014 Feb 1 |
Keywords
- Laplacian RLS
- Laplacian SVM
- Manifold regularization
- Minimax probability machine
- Semi-supervised learning
ASJC Scopus subject areas
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Artificial Intelligence