Following up on my previous attempt, I managed to train a neural network to approximate the sine function, using the ai4r Ruby gem:
require 'ai4r'

srand 1

# 1 input neuron, 60 hidden neurons, 1 output neuron
net = Ai4r::NeuralNetwork::Backpropagation.new([1, 60, 1])
net.learning_rate = 0.01
# net.propagation_function = lambda { |x| 1.0 / (1.0 + Math.exp(-x)) }

# Linearly map x from [xmin, xmax] to [ymin, ymax].
def normalise(x, xmin, xmax, ymin, ymax)
  xrange = xmax - xmin
  yrange = ymax - ymin
  ymin + (x - xmin) * (yrange.to_f / xrange)
end

training_data = []
test = []
i2 = 0.0

# Sample sin(x) at 320 points: x = 0.1, 0.2, ..., 32.0.
# Inputs are scaled from [0, 32] to [0, 1], targets from [-1, 1] to [0, 1].
320.times do
  i2 += 0.1
  input  = i2
  output = Math.sin(i2)
  training_data.push(
    input:           [normalise(input, 0.0, 32.0, 0.0, 1.0)],
    expected_result: [normalise(output, -1.0, 1.0, 0.0, 1.0)]
  )
  test.push([normalise(output, -1.0, 1.0, 0.0, 1.0)])
end

puts "#{test}"
puts "#{training_data}"

time = Time.now

# Train until the summed per-epoch error drops below 0.26.
999999.times do |i|
  error = 0.0
  training_data.each do |d|
    error += net.train(d[:input], d[:expected_result])
  end
  break if error < 0.26
  print "Times: #{i}, error: #{error} \r"
end
time2 = Time.now
puts "#{time2} - #{time} = #{time2 - time} seconds taken."
# Save the trained network to disk (binary mode, since Marshal output is binary).
serialized = Marshal.dump(net)
File.open("net.saved", "wb") { |file| file.write(serialized) }
Everything worked out fine. The network was trained in 4703.664857 seconds.
The network trains much faster when I normalise the inputs and outputs to values between 0 and 1. ai4r
uses a sigmoid activation function, so it is clear why the network cannot output negative values. But why do I have to normalise the input values as well? Does this kind of neural network only accept input values smaller than 1?
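To make the question concrete, here is the round trip I mean as a plain-Ruby sketch (no ai4r needed); the helper is the same normalise function used in the script above:

```ruby
# Linearly map x from [xmin, xmax] to [ymin, ymax] (same helper as above).
def normalise(x, xmin, xmax, ymin, ymax)
  ymin + (x - xmin) * ((ymax - ymin).to_f / (xmax - xmin))
end

# Input side: squash the trained domain [0, 32] into [0, 1].
scaled = normalise(16.0, 0.0, 32.0, 0.0, 1.0)     # 16.0 -> 0.5

# Output side: the sigmoid emits values in (0, 1), so targets are trained
# in [0, 1] and a prediction has to be mapped back to [-1, 1] afterwards.
restored = normalise(0.75, 0.0, 1.0, -1.0, 1.0)   # 0.75 -> 0.5

puts scaled     # 0.5
puts restored   # 0.5
```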
In the sine example, is it possible to feed in any number, as in:
Input: -10.0 -> Output: 0.5440211108893699
Input: 87654.322 -> Output: -0.6782453567239783
Input: -9878.923 -> Output: -0.9829544956991526
or do I have to define the input range in advance?
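In other words, if the range must stay fixed, I imagine I would have to fold arbitrary inputs back into one period of the sine before normalising. A sketch of that idea (plain Ruby, not something ai4r does for me):

```ruby
# Sine repeats every 2*pi, so any real x can be folded into [0, 2*pi),
# which lies inside the trained domain [0, 32]. Ruby's float modulo
# returns a result with the sign of the divisor, i.e. >= 0 here.
def fold(x)
  x % (2 * Math::PI)
end

# Same helper as in the training script.
def normalise(x, xmin, xmax, ymin, ymax)
  ymin + (x - xmin) * ((ymax - ymin).to_f / (xmax - xmin))
end

x      = -10.0
folded = fold(x)                                 # ~2.566, inside [0, 2*pi)
input  = normalise(folded, 0.0, 32.0, 0.0, 1.0)  # what the net would see

# Folding does not change the target, so the trained net still applies:
puts Math.sin(folded)   # same value as Math.sin(-10.0), ~0.5440211
```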