The multicategory SVM (MSVM) of Lee et al. (2004) is a natural generalization of the classical binary support vector machine (SVM). However, its use has been limited by computational difficulties. The simplex-cone SVM (SCSVM) of Mroueh et al. (2012) is a computationally efficient multicategory classifier, but its use has been limited by a seemingly opaque interpretation. We show that MSVM and SCSVM are in fact exactly equivalent and provide a bijection between their tuning parameters. MSVM may then be entertained as both a natural and a computationally efficient multicategory extension of the SVM. We further provide a Donsker theorem for finite-dimensional kernel MSVM and partially answer the open question of why One-vs-Rest methods perform so competitively against MSVM. Furthermore, we use the derived asymptotic covariance formula to develop an inverse-variance weighted classification rule that improves on the One-vs-Rest approach.
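
As a rough illustration of the inverse-variance weighting idea only (not the authors' exact rule), the Python sketch below reweights One-vs-Rest decision scores by the reciprocal of their estimated variances before taking the argmax; the scores and variances arrays are hypothetical placeholders for the quantities that the paper's asymptotic covariance formula would supply.

    import numpy as np

    def inverse_variance_weighted_predict(scores, variances):
        """Combine per-class One-vs-Rest decision scores by inverse-variance weighting.

        scores:    (n_samples, n_classes) array of decision values
        variances: (n_classes,) estimated variances of the class scores
                   (hypothetical inputs standing in for asymptotic variance estimates)
        Returns the predicted class index for each sample.
        """
        weights = 1.0 / np.asarray(variances)    # inverse-variance weights
        weighted = np.asarray(scores) * weights  # down-weight noisier class scores
        return np.argmax(weighted, axis=1)       # pick the highest weighted score

    # Toy usage: three classes, two samples
    scores = np.array([[0.2, 0.5, 0.1],
                       [0.7, 0.1, 0.3]])
    variances = np.array([0.4, 1.0, 0.25])
    print(inverse_variance_weighted_predict(scores, variances))

The design intuition is that a class whose score is estimated with higher variance contributes less to the final decision, which is the sense in which such a rule can improve on unweighted One-vs-Rest.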

Read paper in Proceedings of Machine Learning Research
