Smoothing Approximations for Least Squares Minimization with L1-Norm Regularization Functional

Henrietta Nkansah, Francis Benyah, Henry Amankwah

Abstract

The paper considers the problem of least squares minimization with an L1-norm regularization functional. It investigates several smoothing approximations of the L1-norm functional, namely quadratic, sigmoid, and cubic Hermite functionals. Tikhonov regularization is then applied to each of the resulting smooth least squares minimization problems. Results of numerical simulations for each smoothing approximation are presented. The results indicate that our regularization method performs comparably to the non-smoothing methods implemented in existing solvers.
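To make the idea concrete, the sketch below shows two standard smooth surrogates for |x| (a Huber-style quadratic and a sigmoid-based one) plugged into the L1-regularized least squares objective. These particular functionals and parameter names (`mu` for the smoothing parameter, `lam` for the regularization weight) are illustrative assumptions, not the exact constructions used in the paper.

```python
import numpy as np

def quadratic_smooth(x, mu=0.1):
    # Huber-style quadratic smoothing of |x|: quadratic on [-mu, mu],
    # equal to |x| outside. The +mu/2 offset makes the pieces join
    # continuously (and with matching slope) at |x| = mu.
    return np.where(np.abs(x) <= mu, x**2 / (2.0 * mu) + mu / 2.0, np.abs(x))

def sigmoid_smooth(x, mu=0.1):
    # Sigmoid-based smoothing: |x| ~ x * (2*sigmoid(x/mu) - 1),
    # which tends to |x| as mu -> 0.
    return x * (2.0 / (1.0 + np.exp(-x / mu)) - 1.0)

def smoothed_l1_objective(A, b, x, lam=1.0, mu=0.1, smooth=quadratic_smooth):
    # Smooth surrogate of the L1-regularized least squares objective
    #   0.5 * ||Ax - b||^2 + lam * ||x||_1,
    # with ||x||_1 replaced by the chosen smoothing functional.
    r = A @ x - b
    return 0.5 * (r @ r) + lam * np.sum(smooth(x, mu))
```

Because the surrogate is differentiable, any standard smooth optimizer (e.g. gradient descent or a quasi-Newton method) can minimize it, which is the point of the smoothing step before Tikhonov regularization is applied.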
