# Support Vector Machine

## Description

SVM treats every example as a point in a high-dimensional space and creates an imaginary hyperplane that separates the examples with positive labels from those with negative ones.

It requires positive examples to be labeled +1 and negative ones -1.

### Equation

The equation of the hyperplane is given by two parameters: a real-valued vector **w** with the same dimensionality as our input feature vector, and a real number **b**:

`wx - b = 0`

where wx means w1x1 + w2x2 + … + wDxD (where **D** is the dimensionality of the input feature vector), i.e.:

$$ \begin{equation} wx = \sum_{j=1}^{D}w_{j}x_{j} \end{equation} $$
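As a small illustration, wx is just the dot product of the two vectors. The values below are made up for a hypothetical 3-dimensional example (D = 3):

```python
# Hypothetical parameters and input (D = 3); wx is the dot product.
w = [2.0, -1.0, 0.5]
x = [1.0, 3.0, 4.0]

wx = sum(w_j * x_j for w_j, x_j in zip(w, x))
print(wx)  # 2*1 + (-1)*3 + 0.5*4 = 1.0
```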

The predicted label for some input feature vector x would be:

$$ \begin{equation} y = sign(wx - b) \end{equation} $$

`sign` in this case is a function that returns -1 if its argument is negative and +1 if it is positive.
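The prediction rule above can be sketched in a few lines. The parameters and inputs here are made-up examples, and we follow the convention of mapping the edge case wx - b = 0 to +1 (the text only defines `sign` for strictly positive and negative values):

```python
def predict(w, b, x):
    """Predict a label (+1 or -1) for feature vector x, given parameters w and b."""
    wx = sum(w_j * x_j for w_j, x_j in zip(w, x))
    # sign: +1 if wx - b is positive, -1 if negative (0 mapped to +1 by convention)
    return 1 if wx - b >= 0 else -1

# Hypothetical parameters and inputs:
w, b = [2.0, -1.0], 0.5
print(predict(w, b, [1.0, 0.0]))  # wx - b = 2.0 - 0.5 = 1.5  -> 1
print(predict(w, b, [0.0, 1.0]))  # wx - b = -1.0 - 0.5 = -1.5 -> -1
```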

The goal of SVM is to find the optimal values **w\*** and **b\*** for the parameters **w** and **b**. Once those optimal values are found, the **model** is defined as:

$$ \begin{equation} f(x) = sign(w^{*}x - b^{*}) \end{equation} $$

### Optimization

The learning algorithm finds the optimal values **w\*** and **b\*** through optimization: it minimizes the norm of **w** subject to the constraint that every training example is classified correctly with a margin of at least 1:

$$ \begin{equation} y_{i}(wx_{i} - b) \geq 1 \end{equation} $$
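As a minimal sketch of how such an optimization can be carried out (this is one common approach, sub-gradient descent on the regularized hinge loss, not necessarily the author's intended method; the function name, toy data, learning rate, and regularization strength are all illustrative):

```python
def train_svm(xs, ys, lr=0.01, lam=0.01, epochs=200):
    """Find w* and b* by sub-gradient descent on the regularized hinge loss."""
    d = len(xs[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            # margin = y * (wx - b); a margin below 1 contributes hinge loss
            margin = y * (sum(wj * xj for wj, xj in zip(w, x)) - b)
            if margin < 1:
                # step against the gradient of the hinge loss plus regularizer
                w = [wj - lr * (lam * wj - y * xj) for wj, xj in zip(w, x)]
                b -= lr * y  # d(loss)/db = +y because of the minus sign in wx - b
            else:
                w = [wj - lr * lam * wj for wj in w]  # only the regularizer applies

    return w, b

# Toy linearly separable data (labels must be +1 / -1, as the text requires):
xs = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]]
ys = [1, 1, -1, -1]
w, b = train_svm(xs, ys)
```

After training, `sign(wx - b)` should agree with the label for every point in this toy set.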