I need to execute a Theano function a number of times via scan in order to sum up a cost function and use the result in a gradient computation. I'm familiar with the deep-learning tutorials that do this, but my data slicing and some other complications mean I need to do it a little differently. Below is a much simplified version of what I'm trying to do:
import numpy
import theano
import theano.tensor as T

tn = testnet()          # my network class (definition omitted for brevity)
cost = tn.single_cost()

x = theano.shared(numpy.asarray([7.1, 2.2, 3.4], dtype='float32'))
index = T.lscalar('index')

# compiled function that evaluates the cost on a single slice of the data
test_fn = theano.function(inputs=[index], outputs=cost,
                          givens={tn.x: x[index:index + 1]})

def step(curr):
    return T.constant(test_fn(curr))

outs, _ = theano.scan(step, T.arange(2))
out_fn = theano.function(inputs=[], outputs=outs)
print(out_fn())
Inside the scan step, the call to test_fn(curr) gives this error: Expected an array-like object, but found a Variable: maybe you are trying to call a function on a (possibly shared) variable instead of a numeric array?
Even if I pass in an array of values instead of T.arange(2), I still get the same error. Is there a reason you can't call a compiled function from inside scan?
In general, I'm wondering whether there is a way to call a function like this with a series of indexes so that the summed output can feed into a T.grad() computation (not shown).
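Conceptually, what I'm trying to end up with looks something like the toy sketch below. This is only an illustration of the goal, not my real code: the dummy parameter w stands in for my network's parameters, and the per-index cost is just a made-up expression, since testnet itself is too involved to post.

import numpy
import theano
import theano.tensor as T

x = theano.shared(numpy.asarray([7.1, 2.2, 3.4], dtype='float32'))
w = theano.shared(numpy.float32(0.5))    # dummy stand-in for my network's parameters

def step(idx):
    # made-up per-index cost; my real cost comes from testnet
    return T.sqr(w * x[idx])

costs, _ = theano.scan(step, sequences=T.arange(2))
total_cost = costs.sum()
grad_w = T.grad(total_cost, w)           # this is the gradient I ultimately need

out_fn = theano.function(inputs=[], outputs=[total_cost, grad_w])
print(out_fn())

I believe this is roughly what the tutorials do, but my slicing and other complications don't fit that pattern cleanly, which is why I was hoping to call a compiled function from inside scan instead.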