Table of Links
Convex Relaxation Techniques for Hyperbolic SVMs
B. Solution Extraction in Relaxed Formulation
C. On Moment Sum-of-Squares Relaxation Hierarchy
E. Detailed Experimental Results
F. Robust Hyperbolic Support Vector Machine
F Robust Hyperbolic Support Vector Machine
In this section, we propose a robust version of the hyperbolic support vector machine, without providing an implementation. This differs from the adversarial-training practice in the machine learning community, which searches for adversarial samples on the fly, as in Weber et al. [7]. Instead, we predefine an uncertainty structure on the data features and write down the corresponding optimization formulation, called the robust counterpart, as described in [42, 43].
Then, incorporating the uncertainty set into the constraints, we have
where the last step rewrites the problem as the robust counterpart (RC). We present the 𝑙∞-norm-bounded robust HSVM as follows,
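The displayed formulation is not reproduced here, but the robust-counterpart step can be illustrated on a generic linear constraint 𝑦𝑖 𝑤ᵀ𝑥𝑖 ≥ 1 under feature uncertainty 𝑥𝑖 + 𝛿𝑖 with ‖𝛿𝑖‖∞ ≤ 𝜌 (a hedged sketch: the hyperbolic formulation replaces the Euclidean inner product with the Minkowski inner product):

```latex
% Worst case over the l_infty ball: the minimizing perturbation aligns
% coordinatewise against y_i w, i.e. delta_i^* = -rho * y_i * sign(w).
\min_{\|\delta_i\|_\infty \le \rho} y_i\, w^\top (x_i + \delta_i)
  = y_i\, w^\top x_i - \rho\, \|y_i w\|_1
  = y_i\, w^\top x_i - \rho\, \|w\|_1 ,
```

so the RC constraint reads 𝑦𝑖 𝑤ᵀ𝑥𝑖 − 𝜌‖𝑤‖₁ ≥ 1; the 𝑙∞-bounded uncertainty is what introduces the 𝑙1 norm term into the constraints.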
Note that since 𝑦𝑖 ∈ {−1, 1}, we may drop the 𝑦𝑖 term inside the norm. We can then write down the SDP relaxation of this non-convex QCQP problem and solve it efficiently with
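The SDP relaxation of a non-convex QCQP follows the standard Shor lifting; a hedged sketch with generic quadratic data (𝐴𝑖, 𝑏𝑖, 𝑐𝑖 are illustrative, not the paper's exact matrices):

```latex
% Shor relaxation: a nonconvex quadratic constraint in w,
%   w^T A_i w + b_i^T w + c_i >= 0,
% is lifted by introducing W = w w^T, relaxing rank(W) = 1 to a
% positive-semidefiniteness condition on the bordered matrix:
\langle A_i, W \rangle + b_i^\top w + c_i \ge 0,
\qquad
\begin{pmatrix} W & w \\ w^\top & 1 \end{pmatrix} \succeq 0 .
```

Dropping the rank-one condition makes every constraint linear in (𝑤, 𝑊), which is what allows an off-the-shelf SDP solver to handle the relaxed problem.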
For the implementation in MOSEK, we linearize the 𝑙1 norm term by introducing auxiliary variables, which we do not show here. The moment relaxation can be implemented likewise: the uncertainty is constraint-wise and preserves the sparsity pattern, so the same sparse moment relaxation applies.
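The 𝑙1 linearization mentioned above is the standard epigraph trick: ‖𝑤‖₁ is replaced by 1ᵀ𝑡 with elementwise constraints −𝑡 ≤ 𝑤 ≤ 𝑡. A minimal sketch using SciPy's LP solver rather than MOSEK, on a small basis-pursuit instance with made-up data (all names here are illustrative, not the paper's code):

```python
import numpy as np
from scipy.optimize import linprog

# Solve min ||w||_1 s.t. A w = b via the auxiliary-variable linearization:
# stack z = [w, t], minimize 1^T t, with w - t <= 0 and -w - t <= 0.
rng = np.random.default_rng(1)
n, m = 6, 3
A = rng.normal(size=(m, n))
w_true = np.zeros(n)
w_true[[0, 3]] = [1.5, -2.0]          # sparse ground truth (illustrative)
b = A @ w_true

c = np.concatenate([np.zeros(n), np.ones(n)])   # objective: sum of t
I = np.eye(n)
A_ub = np.block([[I, -I], [-I, -I]])            # w - t <= 0, -w - t <= 0
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])         # A w = b (t unconstrained)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * (2 * n))

w_hat, t_hat = res.x[:n], res.x[n:]
assert res.success
assert np.allclose(np.abs(w_hat), t_hat, atol=1e-6)   # t is tight on |w|
assert t_hat.sum() <= np.abs(w_true).sum() + 1e-6     # w_true is feasible
```

At the optimum the auxiliary variables satisfy 𝑡 = |𝑤| exactly, so the LP value equals the 𝑙1 norm; the same device carries over unchanged when the 𝑙1 term sits inside an SDP, since the added constraints are linear.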
Authors:
(1) Sheng Yang, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA ([email protected]);
(2) Peihan Liu, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA ([email protected]);
(3) Cengiz Pehlevan, John A. Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, MA, Center for Brain Science, Harvard University, Cambridge, MA, and Kempner Institute for the Study of Natural and Artificial Intelligence, Harvard University, Cambridge, MA ([email protected]).