0
votes

I’m currently doing Andrew Ng’s course on Coursera, and I tried to apply what I learned about logistic regression to a dataset. But I can’t get the cost function to decrease.

I tried different learning rates (0.001, 0.003, 0.0001…) and numbers of iterations. It might be that I wrote the function incorrectly, but I can’t find the error.

import numpy as np
import scipy as sc
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris

iris = load_iris()
X = iris.data[:,:2]
Y = (iris.target != 0)*1
m = Y.size
th = np.random.rand(1,3)#theta
xo = np.ones((m,1))
Xi = np.concatenate((xo,X),axis=1)#X intercept
sigma = lambda z: 1/(1+(np.e**-z))
cost = lambda h,y: (np.sum(-y.T*np.log(h)-(1-y).T*np.log(1-h)))/m
grad = lambda h,y,x : np.sum(x.T@(h-y))/m
ite = 100000
lr = 0.0015
for i in range(ite):
    z = Xi@th.T
    th = th- lr*grad(sigma(z),Y,Xi)
    print(cost(sigma(z),Y))
Can't reproduce: NameError: name 'Y' is not defined - Dov Rine
Welcome to SO; please see how to create a minimal reproducible example - desertnaut
@desertnaut Sorry, I am new to Stack Overflow. I edited it, so now it is reproducible - Gian Peri

1 Answer

0
votes

Fixed. I don’t know why I wrapped the gradient in np.sum, but it works now:

grad = lambda h,y,x : (x.T@(h-y))/m
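For completeness, here is a minimal self-contained sketch of the corrected loop. One detail beyond the fix above: in the original code Y has shape (m,) while sigma(z) has shape (m, 1), so h - y silently broadcasts to an (m, m) matrix; reshaping Y to a column vector (my addition, not in the original post) keeps every shape consistent:

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
X = iris.data[:, :2]
# Column vector (m, 1) so that h - y is element-wise, not an (m, m) broadcast
Y = (iris.target != 0).astype(float).reshape(-1, 1)
m = Y.size
Xi = np.concatenate((np.ones((m, 1)), X), axis=1)  # prepend intercept column
th = np.zeros((Xi.shape[1], 1))                    # theta as a (3, 1) column

sigma = lambda z: 1 / (1 + np.exp(-z))
cost = lambda h, y: -np.sum(y * np.log(h) + (1 - y) * np.log(1 - h)) / m
grad = lambda h, y, x: (x.T @ (h - y)) / m         # (3, 1) gradient, no np.sum

lr = 0.0015
costs = []
for i in range(10000):
    h = sigma(Xi @ th)          # (m, 1) predictions
    th = th - lr * grad(h, Y, Xi)
    costs.append(cost(h, Y))
```

With the scalar np.sum removed, the gradient keeps one entry per parameter, so each component of theta is updated independently and the cost decreases steadily.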