The multicategory SVM (MSVM) of Lee et al. (2004) is a natural generalization of the classical binary support vector machine (SVM), but its use has been limited by computational difficulties. The simplex-cone SVM (SCSVM) of Mroueh et al. (2012) is a computationally efficient multicategory classifier, but its adoption has been hindered by a seemingly opaque interpretation. We show that MSVM and SCSVM are in fact exactly equivalent and provide an explicit bijection between their tuning parameters. MSVM can therefore be regarded as both a natural and a computationally efficient multicategory extension of the SVM. We further establish a Donsker theorem for finite-dimensional kernel MSVM and partially answer the open question of why One-vs-Rest methods perform so competitively against MSVM. Finally, we use the derived asymptotic covariance formula to develop an inverse-variance weighted classification rule that improves on the One-vs-Rest approach.
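As a point of reference, the display below recalls the standard MSVM formulation of Lee et al. (2004); it is reconstructed from the cited literature rather than taken from this paper. With K classes and a sum-to-zero decision vector f = (f_1, ..., f_K) whose components lie in a reproducing kernel Hilbert space with norm \lVert\cdot\rVert_{\mathcal{H}}, the MSVM solves

\[
\min_{f:\ \sum_{j=1}^{K} f_j = 0}\ \ \frac{1}{n}\sum_{i=1}^{n}\sum_{j \neq y_i}\Big(f_j(x_i) + \tfrac{1}{K-1}\Big)_{+} \;+\; \lambda \sum_{j=1}^{K}\lVert f_j\rVert_{\mathcal{H}}^{2},
\]

where (u)_+ = max(u, 0), and classifies a point x by \arg\max_j f_j(x). The equivalence claimed in the abstract is between this program and the SCSVM of Mroueh et al. (2012), with the stated bijection linking \lambda to the SCSVM tuning parameter.

The final sentence can likewise be made concrete with a small sketch. The snippet below is a hypothetical illustration of inverse-variance weighting applied to One-vs-Rest scores; the function name, inputs, and weighting scheme are assumptions for exposition, and the paper's actual rule is derived from its asymptotic covariance formula rather than from user-supplied variances.

import numpy as np

def ivw_ovr_predict(scores, variances):
    # Hypothetical sketch, not the paper's rule.
    # scores:    (n_samples, K) per-class One-vs-Rest decision values
    # variances: (n_samples, K) estimated variances of those values
    #            (stand-ins for what an asymptotic covariance formula
    #            would provide; here they are simply taken as given)
    weighted = scores / variances       # down-weight the noisier class scores
    return np.argmax(weighted, axis=1)  # predict the precision-weighted winner

Compared with plain One-vs-Rest (the argmax of the raw scores), the weighted rule discounts classes whose scores are estimated least precisely, which is the general sense in which an inverse-variance weighted rule can improve on the unweighted approach.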

Read the paper in Proceedings of Machine Learning Research.
