I'm trying to compute the kernel density estimate (KDE) of a list of values:
x=[-0.04124324405924407, 0, 0.005249724476788287, 0.03599351958245578, -0.00252785423151014, 0.01007584102031178, -0.002510349639322063, -0.01264302961474806, -0.01797169063489579]
Following this tutorial: http://mark-kay.net/2013/12/24/kernel-density-estimation/ I want to find the best value for the bandwidth, so I wrote this piece of code:
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.grid_search import GridSearchCV

grid = GridSearchCV(KernelDensity(), {'bandwidth': np.linspace(-1.0, 1.0, 30)}, cv=20)  # 20-fold cross-validation
grid.fit(x[:, None])
grid.best_params_
but when I run this line:
grid.fit(x[:, None])
I get this error:
TypeError: list indices must be integers, not tuple
Does anyone know how to fix it? Thanks
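Edit: a likely fix, sketched below under a few assumptions. The `TypeError` comes from `x[:, None]`: that slicing syntax only works on a NumPy array, not on a plain Python list, so `x` needs to be converted first. Two more issues would surface afterwards: `KernelDensity` requires a strictly positive bandwidth (so the grid should not start at -1.0), and with only 9 samples `cv` cannot exceed 9. The import path `sklearn.model_selection` assumes scikit-learn >= 0.18; older versions used `sklearn.grid_search` as in the original snippet.

```python
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.model_selection import GridSearchCV  # sklearn.grid_search in older versions

# Convert the list to a NumPy array so that x[:, None] (reshape to a
# column vector of shape (9, 1)) is valid.
x = np.array([-0.04124324405924407, 0, 0.005249724476788287,
              0.03599351958245578, -0.00252785423151014,
              0.01007584102031178, -0.002510349639322063,
              -0.01264302961474806, -0.01797169063489579])

# Bandwidths must be positive, and cv must be <= the number of samples (9),
# so cv=9 here is leave-one-out cross-validation.
grid = GridSearchCV(KernelDensity(),
                    {'bandwidth': np.linspace(0.001, 1.0, 30)},
                    cv=9)
grid.fit(x[:, None])
print(grid.best_params_)
```

The selected bandwidth is whichever grid value maximizes the held-out log-likelihood that `KernelDensity.score` reports during cross-validation.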
