There is no such thing as a "decision boundary equation" in the case of an object as complex as an SVM model with an RBF kernel. At least not directly.
First, SVM constructs a hyperplane `w`, which is then used to separate the data by computing the inner product `<w, x>` and checking the sign of `<w, x> + b` (where `b` is a trained threshold). In the linear case we can simply reconstruct `w` as `SUM_i y_i alpha_i x_i`, where the `x_i` are the support vectors, `y_i` their classes and `alpha_i` the dual coefficients found during optimization. Things are much more complex when we deal with the infinite-dimensional space induced by the RBF kernel. The so-called kernel trick shows that we can compute `<w, x> + b` easily through the kernel, so we can classify without ever computing the actual `w`. So what is `w` exactly? It is a linear combination of Gaussians centered at the support vectors (some of which have negative coefficients). You can again write it as `SUM_i y_i alpha_i f(x_i)`, where `f` is the feature projection (here, a function returning a Gaussian centered at the given point, with variance equal to `1/(2 gamma)`). The decision boundary is then the set of points `x` where the inner product of this combination with the Gaussian centered at `x` equals `-b`.
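As a sketch of this expansion (assuming scikit-learn's `SVC`; the data and the `gamma`/`C` values are arbitrary illustrations), you can verify that summing RBF kernels over the support vectors reproduces the library's decision function — note that `dual_coef_` already stores the products `y_i alpha_i`:

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Toy data and an RBF-kernel SVM; gamma is set explicitly so we can reuse it below
X, y = make_moons(n_samples=100, noise=0.1, random_state=0)
clf = SVC(kernel="rbf", gamma=1.0, C=1.0).fit(X, y)

def manual_decision(clf, X_new):
    """SUM_i (y_i * alpha_i) * exp(-gamma * ||x_i - x||^2) + b over support vectors."""
    sv = clf.support_vectors_        # the x_i
    coef = clf.dual_coef_.ravel()    # y_i * alpha_i, as stored by scikit-learn
    sq_dists = ((X_new[:, None, :] - sv[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-clf.gamma * sq_dists) @ coef + clf.intercept_[0]

# Matches the library's own decision function up to floating-point noise
print(np.allclose(manual_decision(clf, X), clf.decision_function(X)))  # True
```

The sign of `manual_decision` gives the predicted class, and the boundary is exactly the set of points where it equals zero — i.e. where the kernel expansion equals `-b`.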
If your question concerns just plotting the decision boundary, you can do it by creating a mesh grid, computing the SVM decision function on it, and drawing a contour plot.
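A minimal sketch of that recipe, again assuming scikit-learn and matplotlib (dataset and hyperparameters are arbitrary):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs without a display
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=0)
clf = SVC(kernel="rbf", gamma=2.0).fit(X, y)

# Mesh grid covering the data range
xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300),
                     np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300))
Z = clf.decision_function(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contourf(xx, yy, Z, levels=20, alpha=0.4)    # shaded decision values
plt.contour(xx, yy, Z, levels=[0], colors="k")   # the boundary: decision_function == 0
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors="k")
plt.savefig("svm_boundary.png")
```

The `levels=[0]` contour is the decision boundary itself; the filled contours show how the decision function varies around it.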
Your question asks about a decision boundary, but your code actually runs regression, not classification. In that case you are most likely looking for the regression line, not a decision boundary, but the problem is fully analogous to the classification case: it is still highly non-trivial to "extract" an equation, as the model is really just a hyperplane in an infinite-dimensional space. You can still plot it (for regression in an even simpler way than with SVC), but there is no nice "closed form" equation for your regression. It is still defined by the support vectors and the inner product induced by the kernel.
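To make that concrete, here is a sketch with scikit-learn's `SVR` (toy data, arbitrary hyperparameters): the "regression line" is nothing but the same kernel expansion over support vectors, which you can evaluate pointwise but not write down as a closed-form function of `x`:

```python
import numpy as np
from sklearn.svm import SVR

# 1-D toy regression problem
X = np.linspace(0, 5, 60)[:, None]
y = np.sin(X).ravel()
reg = SVR(kernel="rbf", gamma=1.0, C=10.0).fit(X, y)

# f(x) = SUM_i dual_coef_i * exp(-gamma * ||x_i - x||^2) + b  -- no closed form in x
grid = np.linspace(0, 5, 200)[:, None]
sq_dists = ((grid[:, None, :] - reg.support_vectors_[None, :, :]) ** 2).sum(axis=-1)
manual = np.exp(-reg.gamma * sq_dists) @ reg.dual_coef_.ravel() + reg.intercept_[0]

print(np.allclose(manual, reg.predict(grid)))  # True
```

Plotting the curve is then just `plt.plot(grid, reg.predict(grid))` over a dense grid of inputs, with no mesh grid or contouring needed.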