In high-dimensional regression settings with low signal-to-noise ratios, we assess the predictive performance of several machine learning algorithms. Theory shows that Ridge regression can exploit weak signals and outperform the zero benchmark, whereas Lasso fails to beat this baseline, revealing a limit on what it can learn. Simulations show that Random Forest generally outperforms Gradient Boosted Regression Trees when signals are weak. Moreover, Neural Networks with ℓ2-regularization excel at capturing nonlinear functions of weak signals. Our empirical analysis across six economic datasets suggests that the weakness of signals, rather than the absence of sparsity, may be Lasso's major limitation in economic prediction.
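The Ridge-versus-Lasso contrast can be illustrated with a minimal simulation sketch. This is not the paper's actual design; the sample sizes, coefficient scale, and penalty levels below are illustrative assumptions chosen so that the true coefficients are dense but individually weak, i.e. a low signal-to-noise ratio with no sparsity for Lasso to exploit.

```python
# Hypothetical sketch: dense, weak signals in high dimensions (p > n).
# Ridge shrinks all coefficients and can aggregate many weak signals;
# Lasso's soft-thresholding discards coefficients this small.
import numpy as np
from sklearn.linear_model import Ridge, Lasso
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n, p = 200, 500                      # fewer observations than predictors
beta = rng.normal(0.0, 0.03, p)      # dense, individually weak coefficients

def simulate(n_obs):
    X = rng.normal(size=(n_obs, p))
    y = X @ beta + rng.normal(0.0, 1.0, n_obs)   # noise variance ~ 2x signal variance
    return X, y

X_train, y_train = simulate(n)
X_test, y_test = simulate(2000)

# Heavy ridge shrinkage (alpha near the noise-to-signal variance ratio);
# the Lasso penalty is set high enough that weak signals fall below its threshold.
ridge = Ridge(alpha=1000.0).fit(X_train, y_train)
lasso = Lasso(alpha=0.25).fit(X_train, y_train)

r2_ridge = r2_score(y_test, ridge.predict(X_test))
r2_lasso = r2_score(y_test, lasso.predict(X_test))
n_selected = np.count_nonzero(lasso.coef_)

# The zero benchmark (predicting the unconditional mean) has out-of-sample R^2 of ~0;
# Ridge should edge above it while Lasso, selecting few or no variables, should not.
print(f"Ridge R^2: {r2_ridge:.3f}  Lasso R^2: {r2_lasso:.3f}  Lasso nonzeros: {n_selected}")
```

With these settings Lasso keeps few (often no) coefficients, so its forecast collapses toward the training mean, while Ridge's aggregate shrinkage extracts a small but positive out-of-sample R².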
