class Ai4cr::NeuralNetwork::Backpropagation
Overview
= Introduction
This is an implementation of a multilayer perceptron network, using the backpropagation algorithm for learning.
Backpropagation is a supervised learning technique, described by Paul Werbos in 1974 and further developed by David E. Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams in 1986.
= Features
- Support for any network architecture (number of layers and neurons)
- Configurable propagation function
- Optional usage of bias
- Configurable momentum
- Configurable learning rate
- Configurable initial weight function
- 100% Crystal code, no external dependencies
= Parameters
Use the class method get_parameters_info to obtain details on the algorithm parameters. Use set_parameters to set values for these parameters, or use the individual setters (see the sketch after this list).
- :bias_disabled => If true, the algorithm will not use bias nodes. False by default.
- :initial_weight_function => f(n, i, j) must return the initial weight for the connection between node i in layer n and node j in layer n+1. By default, a random number in the [-1, 1) range.
- :propagation_function => By default the sigmoid function: lambda { |x| 1/(1+Math.exp(-1*(x))) }
- :derivative_propagation_function => Derivative of the propagation function, expressed in terms of the propagation function's output. By default: lambda { |y| y*(1-y) }, where y = propagation_function(x)
- :learning_rate => By default 0.25
- :momentum => By default 0.1. Set this parameter to 0 to disable momentum
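The setters listed in the instance method summary below offer a direct alternative to set_parameters; a minimal sketch (the values are illustrative):

net = Ai4cr::NeuralNetwork::Backpropagation.new([4, 3, 2])
net.learning_rate = 0.5   # larger steps than the 0.25 default
net.momentum = 0.0        # 0 disables momentum
net.bias_disabled = true  # skip bias nodes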
= How to use it
# Create the network with 4 inputs, 1 hidden layer with 3 neurons,
# and 2 outputs
net = Ai4cr::NeuralNetwork::Backpropagation.new([4, 3, 2])
# Train the network
example.size.times do |i|
  net.train(example[i], result[i])
end
# Use it: Evaluate data with the trained network
net.eval([12, 48, 12, 25])
# => [0.86, 0.01]
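Putting it together, here is a self-contained sketch that trains on XOR data; the require path and the training data are illustrative assumptions, not part of the original example:

require "ai4cr"

# XOR truth table as training data (illustrative)
examples = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
results = [[0.0], [1.0], [1.0], [0.0]]

net = Ai4cr::NeuralNetwork::Backpropagation.new([2, 3, 1])
2000.times do
  examples.each_with_index do |input, i|
    net.train(input, results[i])
  end
end

net.eval([1.0, 0.0]) # => a value close to [1.0] after training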
More about multilayer perceptron neural networks and backpropagation:
- http://en.wikipedia.org/wiki/Backpropagation
- http://en.wikipedia.org/wiki/Multilayer_perceptron
= About the project
Ported by:: Daniel Huffman
Url:: https://github.com/drhuffman12/ai4cr
Based on:: Ai4r
Author:: Sergio Fierens
License:: MPL 1.1
Url:: http://ai4r.org
Defined in:
ai4cr/neural_network/backpropagation.cr
Constructors
- .new(structure)
  Creates a network with the given layer structure, e.g. Backpropagation.new([4, 3, 2]) for 4 inputs, one hidden layer of 3 neurons, and 2 outputs.
Instance Method Summary
- #activation_nodes : Array(Array(Float64))
- #activation_nodes=(activation_nodes)
- #backpropagate(expected_output_values)
  Propagate the error backwards.
- #calculate_error(expected_output)
  Calculate the quadratic error for an expected output value:
  Error = 0.5 * sum( (expected_value[i] - output_value[i])**2 )
- #calculate_internal_deltas
  Calculate deltas for the hidden layers.
- #calculate_output_deltas(expected_values)
  Calculate deltas for the output layer.
- #check_input_dimension(inputs)
- #check_output_dimension(outputs)
- #deltas
- #derivative_propagation_function
- #bias_disabled : Bool
- #bias_disabled=(bias_disabled)
- #eval(input_values)
  Evaluates the input.
- #eval_result(input_values)
  Evaluates the input and returns the index of the most active output node.
- #feedforward(input_values)
  Propagate values forward.
- #height
- #height=(height)
- #hidden_qty
- #hidden_qty=(hidden_qty)
- #init_activation_nodes
  Initialize the neurons structure.
- #init_last_changes
  Momentum usage needs to know how much each weight changed in the previous training iteration.
- #init_network
  Initialize (or reset) activation nodes and weights, with the provided net structure and parameters.
- #init_weights
  Initialize the weight arrays using the function specified by the initial_weight_function parameter.
- #initial_weight_function
- #last_changes : Array(Array(Array(Float64)))
- #last_changes=(last_changes)
- #learning_rate : Float64
- #learning_rate=(learning_rate)
- #marshal_dump
  Custom serialization.
- #marshal_load(tup)
- #momentum : Float64
- #momentum=(momentum)
- #propagation_function
- #structure : Array(Int32)
- #structure=(structure)
- #train(inputs, outputs)
  This method trains the network using the backpropagation algorithm.
- #update_weights
  Update weights after @deltas have been calculated.
- #weights : Array(Array(Array(Float64)))
- #weights=(weights)
- #width
- #width=(width)
Instance Method Detail
#calculate_error(expected_output)
Calculate the quadratic error for an expected output value:
Error = 0.5 * sum( (expected_value[i] - output_value[i])**2 )
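Read as code, the error is half the sum of squared differences between expected and actual outputs; a standalone sketch (quadratic_error is a hypothetical helper, whereas the real method reads its outputs from @activation_nodes):

# Hypothetical helper illustrating the formula above
def quadratic_error(expected : Array(Float64), actual : Array(Float64)) : Float64
  0.5 * expected.zip(actual).sum { |e, a| (e - a) ** 2 }
end

quadratic_error([1.0, 0.0], [0.83, 0.03]) # => 0.0149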
#eval(input_values)
Evaluates the input. E.g.
net = Backpropagation.new([4, 3, 2])
net.eval([25, 32.3, 12.8, 1.5])
# => [0.83, 0.03]
#eval_result(input_values)
Evaluates the input and returns the index of the most active output node. E.g.
net = Backpropagation.new([4, 3, 2])
net.eval_result([25, 32.3, 12.8, 1.5])
# eval gives [0.83, 0.03]
# => 0
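Conceptually, eval_result picks the index of the largest value in eval's output; a sketch of that equivalence (assuming net is already trained):

outputs = net.eval([25, 32.3, 12.8, 1.5]) # e.g. [0.83, 0.03]
outputs.index(outputs.max) # => 0, matching net.eval_result([25, 32.3, 12.8, 1.5])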
#init_last_changes
Momentum usage needs to know how much each weight changed in the previous training iteration. This method initializes the @last_changes structure with 0 values.
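As a sketch of the idea, @last_changes mirrors the shape of the weights, filled with zeros (the variable names here are illustrative, not the actual implementation):

# Build a zero-filled structure with the same shape as the weights
last_changes = weights.map do |layer|
  layer.map { |node_weights| node_weights.map { 0.0 } }
end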
#init_network
Initialize (or reset) activation nodes and weights, with the provided net structure and parameters.
#init_weights
Initialize the weight arrays using the function specified by the initial_weight_function parameter.
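As a sketch, the default behaviour described under Parameters (a random value in the [-1, 1) range) could be written as:

# Hypothetical stand-in for the default initial weight function
initial_weight = ->(n : Int32, i : Int32, j : Int32) { rand * 2.0 - 1.0 }
initial_weight.call(0, 1, 2) # => some Float64 in [-1.0, 1.0)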
#marshal_dump
Custom serialization. It used to fail when serializing because the class uses lambda functions internally, and lambdas cannot be serialized. Now it does not fail, but if you customize the values of
- initial_weight_function
- propagation_function
- derivative_propagation_function
you must restore their values manually after loading the instance.
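For example, if you had swapped in tanh, you would need to recreate the same pair after loading; a hedged sketch of such a customized pair (how exactly it is reattached depends on your build and is an assumption here):

# Customized propagation pair (tanh) that deserialization cannot restore;
# recreate and reattach it manually after loading the instance
tanh_prop = ->(x : Float64) { Math.tanh(x) }
tanh_deriv = ->(y : Float64) { 1.0 - y ** 2 } # derivative in terms of y = tanh(x)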
#train(inputs, outputs)
This method trains the network using the backpropagation algorithm.
inputs: The network's input values.
outputs: The expected output for the given input.
Returns the network error:
0.5 * sum( (expected_value[i] - output_value[i])**2 )
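Because train returns the error, it can drive a simple stopping condition; a minimal sketch (the network shape, data, and threshold are illustrative assumptions):

net = Ai4cr::NeuralNetwork::Backpropagation.new([2, 2, 1])
1000.times do
  err = net.train([1.0, 0.0], [1.0])
  break if err < 0.01 # stop once the quadratic error is small enough
end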