SVM light is an implementation of Vapnik's Support Vector Machine [Vapnik, 1995] for the problems of pattern recognition, regression, and learning a ranking function.

For greater accuracy on low- through medium-dimensional data sets, train a support vector machine (SVM) regression model using fitrsvm. For reduced computation time on high-dimensional data sets, efficiently train a linear regression model, such as a linear SVM model, using fitrlinear.

Support vector machine (SVM) analysis is a popular machine learning tool for classification and regression, first identified by Vladimir Vapnik and his colleagues in 1992. For binary or multiclass classification with greater accuracy and a choice of kernel functions on low- through medium-dimensional data sets, train a binary SVM model, or a multiclass error-correcting output codes (ECOC) model containing SVM binary learners, using the Classification Learner app.
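The fitrsvm/fitrlinear functions above are MATLAB APIs; as a rough, hedged analogue, scikit-learn offers the same accuracy-versus-speed trade-off with SVR (kernel SVM regression) and LinearSVR (linear-only, faster on high-dimensional data). The data below is synthetic, made up purely for illustration:

```python
import numpy as np
from sklearn.svm import SVR, LinearSVR

# Synthetic regression data (illustrative only): a near-linear target
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.05 * rng.normal(size=100)

# Kernel SVR: typically more accurate on low-/medium-dimensional data
# (analogous role to fitrsvm)
kernel_model = SVR(kernel="rbf", C=10.0).fit(X, y)

# Linear SVR: typically faster on high-dimensional data
# (analogous role to fitrlinear)
linear_model = LinearSVR(C=10.0, max_iter=10000).fit(X, y)

print(linear_model.score(X, y))  # R^2 on the training data
```

The choice between the two mirrors the text: kernels buy flexibility at a computational cost that grows with the number of samples, while the linear solver scales to many features.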

A good way to better understand your data, and the way an SVM works, is to begin with a linear SVM. This type of SVM is interpretable, which means that each of your 41 features has a weight (or 'importance') associated with it after training. You can then use plot3() with your data on the 3 'best' features from the linear SVM.
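A minimal sketch of this idea in Python (the original suggests MATLAB's plot3(); here scikit-learn's LinearSVC stands in, and the 3-feature toy data is invented so that only feature 0 carries signal):

```python
import numpy as np
from sklearn.svm import LinearSVC

# Toy data (illustrative assumption): feature 0 determines the class,
# features 1 and 2 are pure noise
rng = np.random.default_rng(0)
n = 200
signal = rng.choice([-1.0, 1.0], size=n)
X = np.column_stack([
    signal + 0.1 * rng.normal(size=n),  # informative feature
    rng.normal(size=n),                 # noise
    rng.normal(size=n),                 # noise
])
y = (signal > 0).astype(int)

# Train a linear SVM; each feature gets one weight in clf.coef_
clf = LinearSVC(C=1.0, dual=False).fit(X, y)

# Rank features by the magnitude of their learned weights
weights = np.abs(clf.coef_.ravel())
ranking = np.argsort(weights)[::-1]
print(ranking[0])  # the informative feature should rank first
```

With the top 3 features identified this way, a 3-D scatter plot of the data restricted to those features (plot3() in MATLAB, or mpl_toolkits.mplot3d in Python) often reveals the class structure directly.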

Support Vector Machine, a more convenient formulation. The previous problem is equivalent to

$$\min_{w,b} \; \tfrac{1}{2}\|w\|_2^2 \quad \text{subject to} \quad y_i(w \cdot x_i + b) \ge 1 \;\; \text{for all } 1 \le i \le n.$$

Remarks: this is an optimization problem with linear inequality constraints. This is an implementation of the SVM algorithm: the dual L1-regularized and kernelized optimization problem is solved as a classic QP using CVX and, in the future, via the SMO algorithm.
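The text solves the dual QP with CVX (a MATLAB toolbox). As a hedged, self-contained sketch of the same idea in Python, the dual — maximize Σα_i − ½ ΣΣ α_i α_j y_i y_j K(x_i, x_j) subject to 0 ≤ α_i ≤ C and Σ α_i y_i = 0 — can be handed to a general-purpose solver (scipy's SLSQP here, standing in for CVX; the toy data is made up):

```python
import numpy as np
from scipy.optimize import minimize

# Tiny linearly separable toy set (illustrative assumption)
X = np.array([[2.0, 2.0], [2.5, 3.0], [3.0, 2.5],
              [-2.0, -2.0], [-2.5, -3.0], [-3.0, -2.5]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])

C = 10.0               # box constraint (soft margin)
K = X @ X.T            # linear kernel Gram matrix
Q = (y[:, None] * y[None, :]) * K

# Dual objective (negated, since we minimize): 1/2 a^T Q a - sum(a)
def neg_dual(a):
    return 0.5 * a @ Q @ a - a.sum()

cons = ({'type': 'eq', 'fun': lambda a: a @ y},)  # sum_i a_i y_i = 0
bounds = [(0.0, C)] * len(y)                      # 0 <= a_i <= C
res = minimize(neg_dual, np.zeros(len(y)),
               bounds=bounds, constraints=cons)
alpha = res.x

# Recover the primal solution from the KKT conditions
w = (alpha * y) @ X                # w = sum_i a_i y_i x_i
sv = alpha > 1e-5                  # support vectors: nonzero multipliers
b = np.mean(y[sv] - X[sv] @ w)

print(np.sign(X @ w + b))  # should reproduce the labels y
```

The recovered w and b satisfy the primal constraints y_i(w · x_i + b) ≥ 1 above; only the points with nonzero α_i (the support vectors) determine the separating hyperplane, which is what SMO exploits by optimizing two multipliers at a time.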