
I have some confusion regarding SVMs, as I don't have much of a mathematical background.

Let the equation of a hyperplane (in any dimension) be w'x + b = 0. I know that the weight vector w is orthogonal to this hyperplane.

Is w'x + b = 0 just the general equation of a hyperplane, having nothing to do with SVMs in particular? That is, if w and x are arbitrary vectors, will any hyperplane of the form w'x + b = 0 have the vector w orthogonal to it?
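(For instance, here is a quick numeric check of what I mean; the plane w, b and the points below are arbitrary, chosen just for illustration:)

```python
import numpy as np

# An arbitrary hyperplane w'x + b = 0 in three dimensions.
w = np.array([2.0, -1.0, 3.0])
b = 4.0

# Construct points on the hyperplane by solving w'x + b = 0 for the last coordinate.
def point_on_plane(x1, x2):
    x3 = -(b + w[0] * x1 + w[1] * x2) / w[2]
    return np.array([x1, x2, x3])

p, q = point_on_plane(0.0, 0.0), point_on_plane(1.0, 5.0)

# p - q is a direction lying inside the hyperplane; w is orthogonal to it.
print(np.dot(w, p - q))  # ~0 up to floating-point error
```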

Consider the scenario below:

[figure: two linearly separable classes with the SVM hyperplane w'x + b = 0 and the margin boundaries w'x + b = ±1]

Now, while minimizing the objective function 0.5*||w||^2, we take the constraints to be w'x + b >= 1 for examples in class 2 and w'x + b <= -1 for examples in class 1. If I change these constraints to w'x + b >= 2 and w'x + b <= -2, will I get a classifier with an even larger margin? If so, why don't we use it? If not, why not?

This is probably more appropriate for the Mathematics site. – Prune

1 Answer


Yes, any hyperplane can be written in that form, and the vector w will be orthogonal to it; this is general linear algebra and has nothing to do with SVMs in particular.

No, you won't get a larger margin: the SVM optimization already finds the largest one. Replacing ±1 with ±2 in the constraints simply rescales the optimum from (w, b) to (2w, 2b), which describes exactly the same hyperplane, since (2w)'x + 2b = 0 if and only if w'x + b = 0. With constraints ±k the geometric margin is 2k/||w||, so doubling k also doubles ||w|| and the margin is unchanged.
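As a quick sanity check, here is a small numeric sketch (the four training points are made up, and scipy's general-purpose SLSQP minimizer stands in for a proper QP solver). It solves the hard-margin primal once with constraints ±1 and once with ±2:

```python
import numpy as np
from scipy.optimize import minimize

# Tiny made-up linearly separable dataset: two points per class.
X = np.array([[2.0, 0.0], [3.0, 1.0], [0.0, 0.0], [-1.0, -1.0]])
y = np.array([1, 1, -1, -1])

def solve_svm(k):
    """Minimize 0.5*||w||^2 subject to y_i*(w'x_i + b) >= k."""
    def objective(p):                  # p = [w1, w2, b]
        return 0.5 * np.dot(p[:2], p[:2])
    cons = [{"type": "ineq",           # scipy "ineq" means fun(p) >= 0
             "fun": lambda p, i=i: y[i] * (np.dot(p[:2], X[i]) + p[2]) - k}
            for i in range(len(y))]
    res = minimize(objective, x0=np.zeros(3), constraints=cons)
    w, b = res.x[:2], res.x[2]
    return w, b, 2 * k / np.linalg.norm(w)  # geometric margin = 2k/||w||

w1, b1, m1 = solve_svm(1.0)
w2, b2, m2 = solve_svm(2.0)
print(w1, b1, m1)  # ~ [1, 0], -1, margin 2
print(w2, b2, m2)  # ~ [2, 0], -2: w and b doubled, margin still 2
```

Both runs put the decision boundary in exactly the same place; only the scale of (w, b) differs.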