Quantile Regression Based on Laplacian Manifold Regularizer with the Data Sparsity in l1 Spaces

PP. 786-802  DOI: 10.4236/ojs.2017.75056

ABSTRACT

In this paper, we consider regularized learning schemes based on the l1 regularizer and the pinball loss in a data-dependent hypothesis space. Our target is the error analysis of quantile regression learning. No regularity condition is imposed on the kernel function beyond continuity and boundedness. The graph-based semi-supervised algorithm introduces an extra error term, called the manifold error. New error bounds and convergence rates are explicitly derived using techniques based on the l1-empirical covering number and bound decomposition.
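To make the ingredients of the scheme concrete, the following is a minimal sketch (not the authors' implementation) of the pinball loss and of an objective combining the empirical pinball risk, an l1 sparsity penalty on the coefficients, and a graph-Laplacian manifold penalty. All function and variable names (`pinball_loss`, `objective`, `K`, `L`, `lam`, `gamma`) are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau=0.5):
    """Pinball (quantile) loss at level tau: residuals are penalized
    asymmetrically, so minimizing it targets the tau-quantile."""
    r = y_true - y_pred
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

def objective(alpha, K, y, tau, lam, gamma, L):
    """Illustrative regularized objective:
    empirical pinball risk + l1 penalty + Laplacian manifold penalty.

    alpha : coefficient vector in the data-dependent hypothesis space
    K     : kernel matrix evaluated on the sample, so f = K @ alpha
    L     : graph Laplacian built from labeled and unlabeled points
    lam   : weight of the l1 (sparsity) regularizer
    gamma : weight of the manifold regularizer f^T L f
    """
    f = K @ alpha
    return (pinball_loss(y, f, tau)
            + lam * np.sum(np.abs(alpha))
            + gamma * f @ L @ f)
```

The manifold term `f @ L @ f` penalizes functions that vary sharply across edges of the similarity graph, which is what makes unlabeled data useful in the semi-supervised setting.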

Share and Cite:

Feng, R. , Chen, S. and Rong, L. (2017) Quantile Regression Based on Laplacian Manifold Regularizer with the Data Sparsity in l1 Spaces. Open Journal of Statistics, 7, 786-802. doi: 10.4236/ojs.2017.75056.

Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.