I have some confusion regarding SVMs, as I don't have much of a mathematical background.
Let the equation of a hyperplane (in any dimension) be $w'x + b = 0$. I know that the weight vector $w$ is orthogonal to this hyperplane.
Is the equation $w'x + b = 0$ just the general equation of a hyperplane, having nothing to do with an SVM? That is, if $w$ and $x$ are arbitrary vectors, will any hyperplane of the form $w'x + b = 0$ have the vector $w$ orthogonal to it?
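For reference, here is the short argument I have pieced together for the orthogonality claim (a sketch of my own reasoning, where $x_1$ and $x_2$ are two arbitrary points I introduce on the hyperplane; please correct me if it is wrong). Taking $x_1, x_2$ on the hyperplane and subtracting the two equations to eliminate $b$:

$$w'x_1 + b = 0, \qquad w'x_2 + b = 0 \;\Rightarrow\; w'(x_1 - x_2) = 0,$$

so $w$ has zero inner product with $x_1 - x_2$, which is an arbitrary direction lying inside the hyperplane.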
Consider the scenario below:
Now, while minimizing the objective function $\frac{1}{2}\|w\|^2$, we take the constraints to be $w'x + b \ge 1$ for examples in class 2 and $w'x + b \le -1$ for examples in class 1. So if I change these constraints to $w'x + b \ge 2$ and $w'x + b \le -2$, will I get a classifier with an even larger margin? If so, then why don't we use it? If not, then why not?