I am trying to implement a logistic regression solver in MATLAB, finding the weights by stochastic gradient descent. I am running into a problem where my data seems to produce an infinite cost, and no matter what I change it never goes down. Both of these seem perfectly fine; I can't imagine why my cost function would always return infinity.
Here is my training data, where the first column is the class (either 1 or 0) and the next seven columns are the features I am trying to regress on.
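An infinite cost in logistic regression almost always means the sigmoid output has saturated to exactly 0 or 1, so the cross-entropy takes log(0) = -Inf. A minimal sketch of a cost function that guards against this by clipping the hypothesis (the function and variable names here are my own, not taken from the question):

```matlab
% Numerically guarded logistic cost (illustrative sketch, hypothetical names).
% X: m-by-n feature matrix, y: m-by-1 labels in {0,1}, w: n-by-1 weight vector.
function J = logisticCost(w, X, y)
    h = 1 ./ (1 + exp(-X * w));        % sigmoid hypothesis, element-wise
    tiny = 1e-10;
    h = min(max(h, tiny), 1 - tiny);   % clip so log() never sees exactly 0 or 1
    m = length(y);
    J = -(1 / m) * sum(y .* log(h) + (1 - y) .* log(1 - h));
end
```

Scaling the features also helps: large raw feature values make X*w huge in magnitude, which drives the sigmoid to exactly 0 or 1 in floating point.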
weightVector = weightVector - learningRate * gradient
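One epoch of stochastic gradient descent under that update rule, using one training example per step, might look like this (a sketch with my own names, assuming labels in {0,1} and the standard per-example cross-entropy gradient):

```matlab
% One epoch of SGD for logistic regression (illustrative sketch).
% X: m-by-n features, y: m-by-1 labels in {0,1}, w: n-by-1 weights.
function w = sgdEpoch(w, X, y, learningRate)
    m = size(X, 1);
    for i = randperm(m)                   % visit examples in shuffled order
        xi = X(i, :)';                    % one training example as a column
        h = 1 / (1 + exp(-w' * xi));      % predicted probability for class 1
        gradient = (h - y(i)) * xi;       % gradient of the per-example cost
        w = w - learningRate * gradient;  % step against the gradient
    end
end
```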
I see more clearly now that we move in the opposite direction of the gradient, towards a minimum of the cost function. – tashuhka