class Ai4cr::NeuralNetwork::Backpropagation

Overview

= Introduction

This is an implementation of a multilayer perceptron network, using the backpropagation algorithm for learning.

Backpropagation is a supervised learning technique (described by Paul Werbos in 1974, and further developed by David E. Rumelhart, Geoffrey E. Hinton and Ronald J. Williams in 1986)

= Features

= Parameters

Use the class method get_parameters_info to obtain details on the algorithm parameters. Use set_parameters to set values for these parameters.

= How to use it

  # Create the network with 4 inputs, 1 hidden layer with 3 neurons,
  # and 2 outputs
  net = Ai4cr::NeuralNetwork::Backpropagation.new([4, 3, 2])

  # Train the network
  100.times do |i|
    net.train(example[i], result[i])
  end

  # Use it: evaluate data with the trained network
  net.eval([12, 48, 12, 25])
    # =>  [0.86, 0.01]

More about multilayer perceptron neural networks and backpropagation:

= About the project

Ported By:: Daniel Huffman
Url:: https://github.com/drhuffman12/ai4cr

Based on:: Ai4r
Author:: Sergio Fierens
License:: MPL 1.1
Url:: http://ai4r.org

Defined in:

ai4cr/neural_network/backpropagation.cr

Constructors

Instance Method Summary

Constructor Detail

def self.new(structure : Array(Int32)) #

[View source]

Instance Method Detail

def activation_nodes : Array(Array(Float64)) #

[View source]
def activation_nodes=(activation_nodes) #

[View source]
def backpropagate(expected_output_values) #

Propagate error backwards


[View source]
def calculate_error(expected_output) #

Calculate the quadratic error for an expected output value: Error = 0.5 * sum( (expected_value[i] - output_value[i])**2 )
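As a sketch, the formula above can be written as a standalone Ruby method (the shipped Crystal code is nearly identical in syntax; the method name here is illustrative):

```ruby
# Half the sum of squared differences, matching the quadratic error formula.
def quadratic_error(expected, actual)
  0.5 * expected.zip(actual).sum { |e, a| (e - a)**2 }
end

quadratic_error([1.0, 0.0], [0.8, 0.1])  # ≈ 0.025
```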


[View source]
def calculate_internal_deltas #

Calculate deltas for hidden layers
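Assuming the default sigmoid activation, a hidden node's delta is its activation derivative times the weighted sum of the deltas propagated back from the next layer. A hedged standalone Ruby sketch (function name invented for illustration):

```ruby
# Delta for one hidden node with sigmoid activation y:
# derivative y * (1 - y), times the error fed back from the next layer.
def hidden_delta(y, outgoing_weights, next_deltas)
  back_error = outgoing_weights.zip(next_deltas).sum { |w, d| w * d }
  y * (1.0 - y) * back_error
end
```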


[View source]
def calculate_output_deltas(expected_values) #

Calculate deltas for output layer
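With the default sigmoid activation, the output delta is the derivative y * (1 - y) scaled by the output error. A hedged Ruby sketch (standalone, not the library's actual method):

```ruby
# Deltas for the output layer, assuming sigmoid activations.
def sketch_output_deltas(expected, outputs)
  outputs.each_with_index.map do |y, i|
    y * (1.0 - y) * (expected[i] - y)
  end
end

sketch_output_deltas([1.0], [0.8])  # single output: 0.8 * 0.2 * 0.2
```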


[View source]
def check_input_dimension(inputs) #

[View source]
def check_output_dimension(outputs) #

[View source]
def deltas #

[View source]
def derivative_propagation_function #

[View source]
def bias_disabled : Bool #

[View source]
def bias_disabled=(bias_disabled) #

[View source]
def eval(input_values) #

Evaluates the input. E.g.

net = Backpropagation.new([4, 3, 2])
net.eval([25, 32.3, 12.8, 1.5])
    # =>  [0.83, 0.03]

[View source]
def eval_result(input_values) #

Evaluates the input and returns the index of the most active output node. E.g.

net = Backpropagation.new([4, 3, 2])
net.eval_result([25, 32.3, 12.8, 1.5])
    # eval gives [0.83, 0.03]
    # =>  0
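The "most active node" is simply the index of the largest output value; a one-line Ruby sketch of that selection (helper name invented for illustration):

```ruby
# Index of the strongest output activation.
def most_active(outputs)
  outputs.each_with_index.max_by { |value, _index| value }.last
end

most_active([0.83, 0.03])  # => 0
```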

[View source]
def feedforward(input_values) #

Propagate values forward
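For each layer, feedforward takes the weighted sum of the previous layer's activations per target node and applies the activation function (sigmoid by default). A hedged Ruby sketch of a single layer step, ignoring the bias node the library adds unless bias_disabled is set:

```ruby
def sigmoid(x)
  1.0 / (1.0 + Math.exp(-x))
end

# Activations for the next layer: weighted sum per target node, then sigmoid.
# weights[i][j] connects source node i to target node j.
def feed_layer(activations, weights)
  target_count = weights.first.size
  (0...target_count).map do |j|
    sum = activations.each_with_index.sum { |a, i| a * weights[i][j] }
    sigmoid(sum)
  end
end

feed_layer([1.0, -1.0], [[0.5, 0.5], [0.5, 0.5]])  # => [0.5, 0.5]
```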


[View source]
def height #

[View source]
def height=(height) #

[View source]
def hidden_qty #

[View source]
def hidden_qty=(hidden_qty) #

[View source]
def init_activation_nodes #

Initialize neurons structure.


[View source]
def init_last_changes #

Momentum requires knowing how much each weight changed in the previous training step. This method initializes the @last_changes structure with 0 values.


[View source]
def init_network #

Initialize (or reset) activation nodes and weights, with the provided net structure and parameters.


[View source]
def init_weights #

Initialize the weight arrays using the function specified by the initial_weight_function parameter.
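In the original Ai4r, the default initial_weight_function draws each weight uniformly from [-1, 1); assuming the port keeps that default, a Ruby sketch:

```ruby
# Hypothetical default: uniform random weight in [-1, 1).
# The lambda takes the layer index n and node indices i, j (unused here).
initial_weight = ->(n, i, j) { rand * 2.0 - 1.0 }

w = initial_weight.call(0, 0, 0)
# w falls somewhere in [-1.0, 1.0)
```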


[View source]
def initial_weight_function #

[View source]
def last_changes : Array(Array(Array(Float64))) #

[View source]
def last_changes=(last_changes) #

[View source]
def learning_rate : Float64 #

[View source]
def learning_rate=(learning_rate) #

[View source]
def marshal_dump #

Custom serialization. Serialization used to fail because the network uses lambda functions internally, and lambdas cannot be serialized. It no longer fails, but if you customize the values of

  • initial_weight_function
  • propagation_function
  • derivative_propagation_function

you must restore their values manually after loading the instance.

[View source]
def marshal_load(tup) #

[View source]
def momentum : Float64 #

[View source]
def momentum=(momentum) #

[View source]
def propagation_function #

[View source]
def structure : Array(Int32) #

[View source]
def structure=(structure) #

[View source]
def train(inputs, outputs) #

This method trains the network using the backpropagation algorithm.

inputs: The network's input values.

outputs: The expected output for the given input.

This method returns the network error: => 0.5 * sum( (expected_value[i] - output_value[i])**2 )


[View source]
def update_weights #

Update weights after @deltas have been calculated.
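Each weight is typically adjusted by the learning rate times the delta times the incoming activation, plus the momentum term times that weight's previous change (see #init_last_changes). A hedged Ruby sketch of one weight update (constants are illustrative, not necessarily the library defaults):

```ruby
LEARNING_RATE = 0.25  # illustrative values; check the algorithm
MOMENTUM      = 0.1   # parameters for the actual defaults

# Returns the updated weight and the change to store for the next step.
def update_weight(weight, delta, input_activation, last_change)
  change = LEARNING_RATE * delta * input_activation + MOMENTUM * last_change
  [weight + change, change]
end

update_weight(0.5, 0.032, 1.0, 0.0)  # change ≈ 0.008, new weight ≈ 0.508
```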


[View source]
def weights : Array(Array(Array(Float64))) #

[View source]
def weights=(weights) #

[View source]
def width #

[View source]
def width=(width) #

[View source]