Statistics and Its Interface

Volume 17 (2024)

Number 2

Special issue on statistical learning of tensor data

Density-convoluted tensor support vector machines

Pages: 231–247

DOI: https://dx.doi.org/10.4310/23-SII796

Authors

Boxiang Wang (Department of Statistics and Actuarial Science, University of Iowa, Iowa City, Iowa, U.S.A.)

Le Zhou (Department of Mathematics, Hong Kong Baptist University, Kowloon Tong, Hong Kong)

Jian Yang (Yahoo! Research, Sunnyvale, California, U.S.A.)

Qing Mai (Department of Statistics, Florida State University, Tallahassee, Florida, U.S.A.)

Abstract

With the emergence of tensor data (also known as multidimensional arrays) in many modern applications such as image processing and digital marketing, tensor classification is gaining increasing attention. Although there is a rich toolbox of classification methods for vector-based data, these traditional methods may not be adequate for tensor data classification. In this paper, we propose a new classifier called the density-convoluted tensor support vector machine (DCT‑SVM). The method is motivated by applying kernel density convolution to the SVM loss, which induces a new family of classification loss functions. To establish the theoretical foundation of DCT‑SVM, we systematically study the probabilistic order of magnitude of its excess risk. For efficient computation of DCT‑SVM, we develop a fast monotone accelerated proximal gradient algorithm and establish its convergence. Through simulation studies, we demonstrate the superior performance of DCT‑SVM over many popular classification methods. We further demonstrate the practical potential of DCT‑SVM in a modern online advertising application.
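
As a minimal illustration of the density-convolution idea mentioned in the abstract (a sketch under our own notation, not the paper's exact construction), the SVM hinge loss $(1-u)_+$ can be smoothed by convolving it with a kernel density $K$ at a bandwidth $\sigma > 0$, where $K$ and $\sigma$ are symbols introduced here for illustration:
$$
L_\sigma(u) \;=\; \int_{\mathbb{R}} (1 - u + \sigma t)_+ \, K(t)\, \mathrm{d}t .
$$
This convoluted loss recovers the hinge loss as $\sigma \to 0$ and is smooth for $\sigma > 0$ whenever $K$ is smooth; in the tensor setting, the margin $u$ would be formed from an inner product between a coefficient tensor and the tensor predictor, which is what makes a proximal-gradient-type algorithm natural for the resulting optimization problem.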

Keywords

kernel density estimation, large-margin classification, non-convex optimization, support vector machines, tensor data classification

2010 Mathematics Subject Classification

Primary 62H30. Secondary 62-07.


Received 1 October 2022

Accepted 4 April 2023

Published 1 February 2024