I have tried logistic regression with polynomial features, and fortunately it works fine for me; I am also able to plot the decision boundary. I used the map_feature function for the polynomial features (I referred to Prof. Andrew Ng's notes on logistic regression with regularization): http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=MachineLearning&doc=exercises/ex5/ex5.html
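For context, the polynomial mapping I am using follows the map_feature idea from that exercise: every monomial x1^i * x2^j with i + j up to some degree, plus a bias column. A small Python/NumPy sketch of that mapping (my actual code is in Octave):

```python
import numpy as np

def map_feature(x1, x2, degree=6):
    """Polynomial feature mapping in the style of map_feature from ex5:
    all terms x1^(t-j) * x2^j for t = 0..degree, j = 0..t.
    The t = 0 term is the bias column of ones."""
    cols = []
    for t in range(degree + 1):
        for j in range(t + 1):
            cols.append((x1 ** (t - j)) * (x2 ** j))
    return np.column_stack(cols)

# One sample point (2, 1); degree 6 gives 28 columns.
X = map_feature(np.array([2.0]), np.array([1.0]))
```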
Now I am trying to achieve the same using a Gaussian kernel instead of polynomial features. Fortunately my cost function (j_theta) works fine and decreases after every iteration, and I get my final theta value. The problem I face now is: HOW DO I PLOT THE DECISION BOUNDARY here?
I am using Octave to develop the algorithms and plot the graphs.
Below are the details of my data set sizes.
Original data set:
Data set (x): [20*3], where the first column is the intercept (bias) column:
1.00 2.0000 1.0000
1.00 3.0000 1.0000
1.00 4.0000 1.0000
1.00 5.0000 2.0000
1.00 5.0000 3.0000
.
.
.
Data set with new features after applying the Gaussian kernel:
Data set (f): [20*21], where the first column is the intercept column with all values equal to 1:
1.0000e+000 1.0000e+000 6.0653e-001 1.3534e-001 6.7379e-003 . . . . . . . .
1.0000e+000 6.0653e-001 1.0000e+000 6.0653e-001 8.2085e-002 . . . . . . . .
1.0000e+000 1.3534e-001 6.0653e-001 1.0000e+000 3.6788e-001
1.0000e+000 6.7379e-003 8.2085e-002 3.6788e-001 1.0000e+000
. .
. .
. .
. .
. .
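For reference, the values shown above are consistent with a Gaussian kernel of bandwidth sigma = 1 evaluated against the training points themselves as landmarks (e.g. exp(-0.5) ≈ 6.0653e-001, exp(-2) ≈ 1.3534e-001). A Python/NumPy sketch of that construction (my actual code is in Octave; sigma = 1 is my assumption from the numbers shown):

```python
import numpy as np

def gaussian_kernel_features(X, landmarks, sigma=1.0):
    """Map each row of X to its Gaussian similarity with every landmark
    (here the training points themselves), then prepend a bias column of
    ones. With m training points this yields an [m x (m+1)] matrix."""
    sq_dist = ((X[:, None, :] - landmarks[None, :, :]) ** 2).sum(axis=2)
    F = np.exp(-sq_dist / (2.0 * sigma ** 2))
    return np.hstack([np.ones((X.shape[0], 1)), F])

# First few points from the original data set (without the bias column).
X = np.array([[2.0, 1.0], [3.0, 1.0], [4.0, 1.0], [5.0, 2.0]])
F = gaussian_kernel_features(X, X)  # reproduces the values shown above
```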
The cost function graph that I get after applying gradient descent on the new feature set (f) decreases monotonically.
Hence I get my new theta value:
theta: [21*1]
3.8874e+000
1.1747e-001
3.5931e-002
-8.5937e-005
-1.2666e-001
-1.0584e-001
.
.
.
The problem I face now is how to construct the decision boundary on my original data set, given the new feature set (f) and the learned theta value. I have no clue how to proceed.
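The approach I am considering (a sketch, not a confirmed solution) is to evaluate the hypothesis over a dense grid covering the original (x1, x2) plane: map every grid point through the same Gaussian kernel against the training points, compute sigmoid(f * theta), and then contour the resulting surface at the 0.5 level. In Python/NumPy this would look like the following (in Octave the same idea would end with contour(u, v, z, [0.5 0.5]); sigma = 1 and the function names are my assumptions):

```python
import numpy as np

def gaussian_kernel_features(X, landmarks, sigma=1.0):
    """Gaussian similarity of each row of X to every landmark,
    with a prepended bias column of ones."""
    sq_dist = ((X[:, None, :] - landmarks[None, :, :]) ** 2).sum(axis=2)
    return np.hstack([np.ones((X.shape[0], 1)),
                      np.exp(-sq_dist / (2.0 * sigma ** 2))])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def decision_surface(X_train, theta, sigma=1.0, grid_n=100, pad=1.0):
    """Evaluate h(x) = sigmoid(f(x) . theta) on a grid covering the data.
    The decision boundary is the 0.5 level set of the returned surface."""
    x1 = np.linspace(X_train[:, 0].min() - pad, X_train[:, 0].max() + pad, grid_n)
    x2 = np.linspace(X_train[:, 1].min() - pad, X_train[:, 1].max() + pad, grid_n)
    G1, G2 = np.meshgrid(x1, x2)
    grid = np.column_stack([G1.ravel(), G2.ravel()])
    # Kernel features are computed against the TRAINING points, so the
    # grid features have the same 21 columns that theta was trained on.
    F = gaussian_kernel_features(grid, X_train, sigma)
    Z = sigmoid(F @ theta).reshape(G1.shape)
    return G1, G2, Z

# Plotting (requires matplotlib; in Octave: contour(u, v, z, [0.5 0.5])):
# import matplotlib.pyplot as plt
# G1, G2, Z = decision_surface(X, theta)
# plt.contour(G1, G2, Z, levels=[0.5])
```

The key point of the sketch is that each grid point must be pushed through exactly the same feature mapping (kernel against the training landmarks, plus bias) that produced (f), so that theta applies to it.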
I would be glad to get any hint, tutorial, or link that could help me solve this.
Appreciate your help. Thanks!