In high-dimensional regression with low signal-to-noise ratios, we assess the predictive performance of several machine learning algorithms. Theoretical results show that Ridge regression can exploit weak signals and surpass a zero benchmark, whereas Lasso fails to exceed this baseline, indicating a fundamental limit on what it can learn in this regime. Simulations reveal that Random Forest generally outperforms Gradient Boosted Regression Trees when signals are weak. Moreover, Neural Networks with ℓ2-regularization excel in capturing nonlinear functions of weak signals. Our empirical analysis across six economic datasets suggests that the weakness of signals, not necessarily the absence of sparsity, may be Lasso's major limitation in economic predictions.
