
Penalized Flexible Bayesian Quantile Regression

Applied Mathematics, pp. 2155-2168. DOI: 10.4236/am.2012.312A296


The selection of predictors plays a crucial role in building a multiple regression model: a suitable subset of predictors can improve both prediction accuracy and interpretability. In this paper, we propose flexible Bayesian Lasso and adaptive Lasso quantile regression, introducing a hierarchical model framework that enables exact inference and shrinks unimportant coefficients to zero. The error distribution is assumed to be an infinite mixture of Gaussian densities. We theoretically investigate and numerically compare the proposed methods with flexible Bayesian quantile regression (FBQR), Lasso quantile regression (LQR) and quantile regression (QR). Simulation and real-data studies under different settings assess the performance of the proposed methods, which compare well with the alternatives in terms of median mean squared error and the mean and variance of the absolute correlation criteria. We believe the proposed methods are practically useful.
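To illustrate the kind of penalized objective underlying these methods, the following is a minimal sketch of classical (non-Bayesian) Lasso quantile regression, not the authors' hierarchical Bayesian estimator: the check (pinball) loss plus an L1 penalty, solved exactly as a linear program. The function name, the use of `scipy.optimize.linprog`, and the choice to penalize all coefficients without an intercept are assumptions for the sketch.

```python
import numpy as np
from scipy.optimize import linprog

def lasso_quantile_regression(X, y, tau=0.5, lam=1.0):
    """L1-penalized quantile regression solved as a linear program (sketch).

    Decision vector: [beta (p, free) | u (n) | v (n) | w (p)], where
    u, v split the residuals (y - X beta = u - v, u, v >= 0) and
    w bounds |beta| (-w <= beta <= w).  Minimizes
        sum_i [tau * u_i + (1 - tau) * v_i] + lam * sum_j w_j,
    i.e. the check loss at quantile tau plus a Lasso penalty.
    """
    n, p = X.shape
    # Objective coefficients for [beta | u | v | w].
    c = np.concatenate([np.zeros(p), tau * np.ones(n),
                        (1.0 - tau) * np.ones(n), lam * np.ones(p)])
    # Equality constraints: X beta + u - v = y.
    A_eq = np.hstack([X, np.eye(n), -np.eye(n), np.zeros((n, p))])
    # Inequalities encoding |beta_j| <= w_j:  beta - w <= 0, -beta - w <= 0.
    I_p, Z = np.eye(p), np.zeros((p, n))
    A_ub = np.vstack([np.hstack([I_p, Z, Z, -I_p]),
                      np.hstack([-I_p, Z, Z, -I_p])])
    b_ub = np.zeros(2 * p)
    # beta is free; u, v, w are nonnegative.
    bounds = [(None, None)] * p + [(0, None)] * (2 * n + p)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=bounds, method="highs")
    return res.x[:p]

# Toy usage: a sparse model where the second coefficient is truly zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, 0.0, 1.5]) + 0.3 * rng.normal(size=200)
beta_hat = lasso_quantile_regression(X, y, tau=0.5, lam=2.0)
```

The Bayesian formulations in the paper place this penalty inside a hierarchical prior (with the adaptive variant allowing a separate penalty weight per coefficient), whereas the sketch above fixes a single `lam` and returns a point estimate.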

Cite this paper

A. Alkenani, R. Alhamzawi and K. Yu, "Penalized Flexible Bayesian Quantile Regression," Applied Mathematics, Vol. 3, No. 12A, 2012, pp. 2155-2168. doi: 10.4236/am.2012.312A296.

Copyright © 2019 by authors and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.