Jeroen Soeters
NEUROEVOLUTION
A hitchhiker’s guide to neuroevolution in
Erlang.
MEET GARY
2
WHAT IS MACHINE LEARNING?
• Artificial Neural Networks
• Genetic Algorithms
3
ARTIFICIAL NEURAL NETWORKS
4
BIOLOGICAL NEURAL NETWORKS
5
[Diagram of a biological neuron: dendrites, soma, axon, synapse.]
A MODEL FOR A NEURON
6
[Diagram: input signals x1, x2, …, xn arrive over weights w1, w2, …, wn (the synapses), are combined in the neuron body (the soma), and produce the output signal Y (the axon).]
HOW DOES THE NEURON DETERMINE ITS OUTPUT?
Y = sign( Σ(i=1..n) xi·wi − θ )
7
ACTIVATION FUNCTION
8
[Plot of the hard-limit (step) activation function: output Y against weighted input X.]
MEET FRANK
9
PERCEPTRON LEARNING RULE
℮(p) = Yd(p) − Y(p)
wi(p + 1) = wi(p) + α · xi(p) · ℮(p)
10
PERCEPTRON TRAINING ALGORITHM
11
1. Set the weights and threshold to random values in [-0.5, 0.5].
2. Activate the perceptron and apply the weight-training rule.
3. If the weights have converged, stop; otherwise go back to step 2.
LOGIC GATES
12
x1  x2 | x1 AND x2 | x1 OR x2 | x1 XOR x2
0   0  |     0     |    0     |     0
0   1  |     0     |    1     |     1
1   0  |     0     |    1     |     1
1   1  |     1     |    1     |     0
TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION
13
Epoch 1 (threshold θ = 0.2; learning rate α = 0.1)

inputs    desired    initial         actual    error    final
x1  x2    output Yd  weights w1 w2   output Y  ℮        weights w1 w2
0   0     0          0.3  -0.1       0          0       0.3  -0.1
0   1     0          0.3  -0.1       0          0       0.3  -0.1
1   0     0          0.3  -0.1       1         -1       0.2  -0.1
1   1     1          0.2  -0.1       0          1       0.3   0.0
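The learning rule and the table above can be reproduced with a short sketch (in Python for brevity; the deck's own system is written in Erlang). The fixed starting weights (0.3, -0.1) are taken from the slides so the run matches the table; the algorithm proper draws them at random from [-0.5, 0.5].

```python
def step(x, w, theta):
    # Hard-limit activation: fire iff the weighted sum reaches the threshold.
    return 1 if sum(xi * wi for xi, wi in zip(x, w)) - theta >= 0 else 0

def train(samples, w, theta=0.2, alpha=0.1, max_epochs=100):
    for _ in range(max_epochs):
        converged = True
        for x, yd in samples:
            y = step(x, w, theta)
            e = yd - y                      # perceptron error
            if e != 0:
                converged = False
                # wi(p+1) = wi(p) + alpha * xi(p) * e(p); rounding keeps
                # the float arithmetic aligned with the table's values.
                w = [round(wi + alpha * xi * e, 10) for wi, xi in zip(w, x)]
        if converged:
            return w
    return w

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights = train(AND, [0.3, -0.1])           # converges to (0.1, 0.1)
```

As in the tables that follow, the run converges after five epochs with both weights at 0.1.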
TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION
14
Epoch 2 (threshold θ = 0.2; learning rate α = 0.1)

inputs    desired    initial         actual    error    final
x1  x2    output Yd  weights w1 w2   output Y  ℮        weights w1 w2
0   0     0          0.3   0.0       0          0       0.3   0.0
0   1     0          0.3   0.0       0          0       0.3   0.0
1   0     0          0.3   0.0       1         -1       0.2   0.0
1   1     1          0.2   0.0       1          0       0.2   0.0
TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION
15
Epoch 3 (threshold θ = 0.2; learning rate α = 0.1)

inputs    desired    initial         actual    error    final
x1  x2    output Yd  weights w1 w2   output Y  ℮        weights w1 w2
0   0     0          0.2   0.0       0          0       0.2   0.0
0   1     0          0.2   0.0       0          0       0.2   0.0
1   0     0          0.2   0.0       1         -1       0.1   0.0
1   1     1          0.1   0.0       0          1       0.2   0.1
TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION
16
Epoch 4 (threshold θ = 0.2; learning rate α = 0.1)

inputs    desired    initial         actual    error    final
x1  x2    output Yd  weights w1 w2   output Y  ℮        weights w1 w2
0   0     0          0.2   0.1       0          0       0.2   0.1
0   1     0          0.2   0.1       0          0       0.2   0.1
1   0     0          0.2   0.1       1         -1       0.1   0.1
1   1     1          0.1   0.1       1          0       0.1   0.1
TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION
17
Epoch 5 (threshold θ = 0.2; learning rate α = 0.1)

inputs    desired    initial         actual    error    final
x1  x2    output Yd  weights w1 w2   output Y  ℮        weights w1 w2
0   0     0          0.1   0.1       0          0       0.1   0.1
0   1     0          0.1   0.1       0          0       0.1   0.1
1   0     0          0.1   0.1       0          0       0.1   0.1
1   1     1          0.1   0.1       1          0       0.1   0.1

All errors are zero, so the weights have converged at (0.1, 0.1).
A LITTLE GEOMETRY…
18
[Plots in the x1–x2 plane: a single straight line separates the 1s from the 0s for x1 AND x2 and for x1 OR x2, but no single line can do so for x1 XOR x2 — XOR is not linearly separable.]
WE NEED MORE LAYERS
19
[Network diagram: inputs x1, x2 feed an input layer (neurons 1, 2), a hidden layer (neurons 3, 4), and an output layer (neuron 5) producing Y.]
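A network with one hidden layer can compute XOR. The sketch below (Python for brevity) uses the same hard-limit neuron as before with hand-picked weights and thresholds chosen for illustration; in practice these weights would be learned, not hand-wired.

```python
def neuron(inputs, weights, theta):
    # Hard-limit threshold unit, as in the perceptron slides.
    return 1 if sum(i * w for i, w in zip(inputs, weights)) - theta >= 0 else 0

def xor_net(x1, x2):
    h1 = neuron((x1, x2), (1, 1), 0.5)     # hidden neuron: fires for x1 OR x2
    h2 = neuron((x1, x2), (1, 1), 1.5)     # hidden neuron: fires for x1 AND x2
    return neuron((h1, h2), (1, -1), 0.5)  # output: OR but not AND = XOR
```

The hidden layer carves the plane with two lines, which is exactly what a single perceptron cannot do.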
PROBLEMS WITH BACK PROPAGATION
• a training set of sufficient size is required
• the topology of the network needs to be known in advance
• no recurrent connections are allowed
• the activation function must be differentiable
Back propagation does not emulate the biological world.
20
EVOLUTIONARY COMPUTATION
21
MEET JOHN
22
THE CHROMOSOME
23
[A chromosome is a fixed-length bit string, e.g. 1 0 1 1 1 0 1 0]
CROSSOVER
24
[Two parent bit strings are cut at the same point and their tails are swapped, producing two offspring.]
MUTATION
25
[Mutation flips a randomly chosen bit in the chromosome.]
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
26
EVOLUTIONARY ALGORITHM
27
1. Generate a population and calculate each chromosome's fitness.
2. If the termination criteria are satisfied, stop.
3. Otherwise, select pairs for mating, apply crossover and mutation, and add the offspring to a new population.
4. When the new population reaches size N, replace the old population and go back to step 1.
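The generational loop above can be sketched on a toy problem (Python for brevity). The fitness function here — count the 1-bits — and the choices of N, Pc and Pm are illustrative, not from the deck; tournament selection stands in for the unspecified selection scheme.

```python
import random

random.seed(1)
N, LENGTH, PC, PM = 20, 16, 0.9, 0.02   # illustrative parameter choices

def fitness(chrom):
    return sum(chrom)                    # toy objective: maximise 1-bits

def select(pop):
    # Tournament selection: the fitter of two random chromosomes.
    a, b = random.sample(pop, 2)
    return max(a, b, key=fitness)

def crossover(p1, p2):
    if random.random() < PC:
        cut = random.randrange(1, LENGTH)    # single cut point
        return p1[:cut] + p2[cut:]
    return p1[:]

def mutate(chrom):
    return [1 - g if random.random() < PM else g for g in chrom]

pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(N)]
for generation in range(100):
    if max(map(fitness, pop)) == LENGTH:     # termination criterion
        break
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(N)]

best = max(pop, key=fitness)
```

Selection, crossover, mutation and replacement map one-to-one onto the boxes in the flowchart.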
TRAVELING SALESMAN
28
[Map of eight cities, A through H, to be visited in a single round trip.]
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
29
THE CHROMOSOME
30
[For the traveling salesman, a chromosome is an ordering of the cities, e.g. H G F E A B C D]
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
31
FITNESS FUNCTION
Fitness = 1 / total distance
32
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
33
CROSSOVER
34
[Order crossover: both parents are cut in two places; the segment between the cuts is copied from the first parent into the offspring, and the remaining cities are filled in, in the order in which they appear in the second parent.]
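Order crossover can be sketched directly (Python for brevity). The parent tours and cut points below are illustrative stand-ins for the ones animated on the slide.

```python
# Order crossover (OX): copy the segment between the cut points from one
# parent, then fill the remaining cities in the order they appear in the
# other parent, wrapping around past the end of the tour.
def order_crossover(p1, p2, lo, hi):
    segment = p1[lo:hi]
    rest = [city for city in p2 if city not in segment]
    child = [None] * len(p1)
    child[lo:hi] = segment
    # Remaining positions, starting just after the segment and wrapping.
    positions = list(range(hi, len(p1))) + list(range(0, lo))
    for pos, city in zip(positions, rest):
        child[pos] = city
    return child

p1 = list("GEBCHFAD")
p2 = list("AFEBHDCG")
child = order_crossover(p1, p2, 5, 8)   # keeps the segment F, A, D from p1
```

Unlike single-point crossover on bit strings, this operator always yields a valid permutation — every city appears exactly once.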
MUTATION
35
[Mutation swaps two randomly chosen cities in the tour.]
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
36
DEMO
37
NEUROEVOLUTION
38
MEET GENE
39
SIMULATION
• inputs (sensors)
• outputs (actuators)
• fitness function
40
CLEANING ROBOT
41
FOREX TRADING
42
AND LOTS MORE…
• data compression
• training NPCs in a video game
• cyber warfare
43
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
44
THE CHROMOSOME
45
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
46
FITNESS FUNCTION
Fitness = performance of network on an actual problem
47
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
48
CROSSOVER
Crossover doesn’t work for large neural nets!
49
MUTATE ACTIVATION FUNCTION
50
ADD CONNECTION
51
ADD NEURON
52
OUTSPLICE
53
MUTATION OPERATORS
and lots more…
54
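Two of the mutation operators above can be sketched on a toy genotype (Python for brevity). The dict-of-neurons representation here is purely illustrative, not the record format the Erlang system actually uses.

```python
import random

random.seed(4)

# Toy genotype: a sensor, one neuron, one actuator, and weighted connections.
genotype = {
    "neurons": {"s1": "sensor", "n1": "tanh", "a1": "actuator"},
    "connections": {("s1", "n1"): 0.5, ("n1", "a1"): -0.3},
}

def add_connection(g):
    # Connect two previously unconnected elements with a random weight.
    pairs = [(a, b) for a in g["neurons"] for b in g["neurons"]
             if a != b and (a, b) not in g["connections"]]
    if pairs:
        g["connections"][random.choice(pairs)] = random.uniform(-1, 1)

def add_neuron(g):
    # Splice a new neuron into an existing connection (an outsplice).
    (src, dst), w = random.choice(list(g["connections"].items()))
    new_id = "n%d" % (len(g["neurons"]) + 1)
    g["neurons"][new_id] = "tanh"
    del g["connections"][(src, dst)]
    g["connections"][(src, new_id)] = w
    g["connections"][(new_id, dst)] = random.uniform(-1, 1)

before = len(genotype["neurons"])
add_neuron(genotype)
add_connection(genotype)
```

Each operator makes one small, structurally valid change to the genotype, which is why large batches of them can stand in for crossover.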
EVOLUTIONARY ALGORITHM
• represent the candidate solution as a chromosome
• define a fitness function to measure the performance of the chromosome
• define the genetic operators for the chromosome
• choose the initial population size N, crossover probability Pc and mutation probability Pm
55
RANDOM IMPACT MUTATION
Number of mutations = random(1, network size)
56
MEMETIC ALGORITHM
57
[Flow: generate a population → a local search (hill climber) applies each organism to the problem and calculates its effective fitness → select fit organisms → create offspring → repeat.]
STOCHASTIC HILL CLIMBER (LOCAL SEARCH)
58
[Flow: apply the NN to a problem → if the new fitness beats the old, keep the current weights, otherwise restore the backed-up weights → back up and perturb the weights → repeat until the stopping condition is reached, then stop.]
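The backup/perturb/restore loop can be sketched on a weight vector (Python for brevity). The stand-in fitness function below is invented for the demo; in the real system the fitness comes from applying the NN to a problem via its scape.

```python
import random

random.seed(2)

def fitness(w):
    # Toy objective (higher is better): best when every weight equals 0.5.
    return -sum((wi - 0.5) ** 2 for wi in w)

weights = [random.uniform(-1, 1) for _ in range(5)]
initial_fitness = fitness(weights)
best = initial_fitness
for attempt in range(2000):
    backup = weights[:]                    # back up the weights
    i = random.randrange(len(weights))
    weights[i] += random.gauss(0, 0.1)     # perturb one weight
    new = fitness(weights)
    if new > best:                         # keep the improvement...
        best = new
    else:
        weights = backup                   # ...else restore the backup
```

Only improvements are kept, so fitness is monotonically non-decreasing over the run.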
A LANGUAGE FOR NEUROEVOLUTION
• The system must be able to handle very large numbers of concurrent activities
• Actions must be performed at a certain point in time or within a certain time
• Systems may be distributed over several computers
• The system is used to control hardware
• The software systems are very large
59
A LANGUAGE FOR NEUROEVOLUTION
59
• The system exhibits complex functionality such as feature interaction.
• The systems should be in continuous operation for many years.
• Software maintenance (reconfiguration, etc.) should be performed without stopping the system.
• There are stringent quality and reliability requirements.
• Fault tolerance
Bjarne Däcker. Erlang - A New Programming Language. Ericsson Review, no. 2, 1993.
MEET JOE
60
1:1 MAPPING
61
[Each neuron in the genotype maps to exactly one Erlang process.]
THE NEURAL NETWORK
62
[Diagram: a cortex process drives a sensor, the neurons, and an actuator; the sensor and actuator interact with a scape. Message flow for one cycle: the cortex sends sync to the sensor; the sensor sends sense to the scape and receives a percept; the signal is forwarded through the neurons to the actuator; the actuator sends an action to the scape and receives {fitness, halt_flag}; the actuator then syncs back to the cortex.]
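One sense-think-act cycle from the message diagram can be sketched synchronously (Python for brevity). In the Erlang system the sensor, neurons, actuator and cortex are separate processes exchanging these messages; here they are plain function calls, and the scape and its fitness scheme are invented for the demo.

```python
import math

def scape_sense():
    # scape -> sensor: the percept for this cycle (made-up values).
    return [0.3, 0.9]

def scape_act(action):
    # scape -> actuator: reward actions close to 0.6, halt immediately.
    fitness = -abs(action - 0.6)
    return fitness, True               # the {fitness, halt_flag} message

def neuron(percept, weights, bias):
    # sensor -> neuron -> actuator: a single tanh neuron forwards the signal.
    return math.tanh(sum(p * w for p, w in zip(percept, weights)) + bias)

def cortex_cycle(weights, bias):
    percept = scape_sense()            # sync / sense / percept
    action = neuron(percept, weights, bias)   # forward
    return scape_act(action)           # action / {fitness, halt_flag}

fitness, halt = cortex_cycle([0.5, 0.5], 0.0)
```

The cortex repeats this cycle until the scape raises the halt flag, then reports the accumulated fitness upward.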
THE EXOSELF
63
[Diagram: an exoself process supervises the network. The cortex reports {evaluation_completed, fitness}; if the fitness beats the best so far, the exoself sends backup_weights to the neurons, otherwise the neurons restore their backed-up weights; the exoself then sends perturb_weights to a subset of neurons and tells the cortex to reactivate for the next evaluation.]
THE POPULATION MONITOR
64
[Diagram: a population monitor reads genotypes from a database and starts an agent, each with its own private scape. When the agents report terminated, it selects the fit ones, writes their offspring to the database, and starts the next generation.]
THE DEVIL IS IN THE DETAILS
• recurrent connections
• newer generations get a higher chance of mutation
• neural plasticity
• public scapes and steady-state evolution
65
POLE BALANCING
66
[Diagram: the agent receives percepts from, and sends actions to, a cart balancing a pole.]
DEMO
67
BENCHMARK RESULTS
68
Method       Single-Pole /            Double-Pole / Partial     Double-Pole
             Incomplete State Info    Info w/o Damping          w/ Damping
RWG          8557                     415209                    1232296
SANE         1212                     262700                    451612
CNE*         724                      76906*                    87623*
ESP          589                      7374                      26342
NEAT         -                        -                         6929
CMA-ES*      -                        3521*                     6061*
CoSyNE*      127*                     1249*                     3416*
DXNN         not performed            2359                      2313
Our System   647                      5184                      4792
THE HANDBOOK
69
Feel free to reach out at:
e. jsoeters@thoughtworks.com
t. @JeroenSoeters
THANK YOU
DATA SCIENCE AND ENGINEERING
71
Backpropagation Algorithm forward and backward passBackpropagation Algorithm forward and backward pass
Backpropagation Algorithm forward and backward pass
 
Multilayer perceptron
Multilayer perceptronMultilayer perceptron
Multilayer perceptron
 
Convolution Neural Network
Convolution Neural NetworkConvolution Neural Network
Convolution Neural Network
 
Jst part3
Jst part3Jst part3
Jst part3
 
DNN.pptx
DNN.pptxDNN.pptx
DNN.pptx
 
Introduction to Neural networks (under graduate course) Lecture 4 of 9
Introduction to Neural networks (under graduate course) Lecture 4 of 9Introduction to Neural networks (under graduate course) Lecture 4 of 9
Introduction to Neural networks (under graduate course) Lecture 4 of 9
 
Neural Network Back Propagation Algorithm
Neural Network Back Propagation AlgorithmNeural Network Back Propagation Algorithm
Neural Network Back Propagation Algorithm
 

Plus de Thoughtworks

Design System as a Product
Design System as a ProductDesign System as a Product
Design System as a ProductThoughtworks
 
Designers, Developers & Dogs
Designers, Developers & DogsDesigners, Developers & Dogs
Designers, Developers & DogsThoughtworks
 
Cloud-first for fast innovation
Cloud-first for fast innovationCloud-first for fast innovation
Cloud-first for fast innovationThoughtworks
 
More impact with flexible teams
More impact with flexible teamsMore impact with flexible teams
More impact with flexible teamsThoughtworks
 
Culture of Innovation
Culture of InnovationCulture of Innovation
Culture of InnovationThoughtworks
 
Developer Experience
Developer ExperienceDeveloper Experience
Developer ExperienceThoughtworks
 
When we design together
When we design togetherWhen we design together
When we design togetherThoughtworks
 
Hardware is hard(er)
Hardware is hard(er)Hardware is hard(er)
Hardware is hard(er)Thoughtworks
 
Customer-centric innovation enabled by cloud
 Customer-centric innovation enabled by cloud Customer-centric innovation enabled by cloud
Customer-centric innovation enabled by cloudThoughtworks
 
Amazon's Culture of Innovation
Amazon's Culture of InnovationAmazon's Culture of Innovation
Amazon's Culture of InnovationThoughtworks
 
When in doubt, go live
When in doubt, go liveWhen in doubt, go live
When in doubt, go liveThoughtworks
 
Don't cross the Rubicon
Don't cross the RubiconDon't cross the Rubicon
Don't cross the RubiconThoughtworks
 
Your test coverage is a lie!
Your test coverage is a lie!Your test coverage is a lie!
Your test coverage is a lie!Thoughtworks
 
Docker container security
Docker container securityDocker container security
Docker container securityThoughtworks
 
Redefining the unit
Redefining the unitRedefining the unit
Redefining the unitThoughtworks
 
Technology Radar Webinar UK - Vol. 22
Technology Radar Webinar UK - Vol. 22Technology Radar Webinar UK - Vol. 22
Technology Radar Webinar UK - Vol. 22Thoughtworks
 
A Tribute to Turing
A Tribute to TuringA Tribute to Turing
A Tribute to TuringThoughtworks
 
Rsa maths worked out
Rsa maths worked outRsa maths worked out
Rsa maths worked outThoughtworks
 

Plus de Thoughtworks (20)

Design System as a Product
Design System as a ProductDesign System as a Product
Design System as a Product
 
Designers, Developers & Dogs
Designers, Developers & DogsDesigners, Developers & Dogs
Designers, Developers & Dogs
 
Cloud-first for fast innovation
Cloud-first for fast innovationCloud-first for fast innovation
Cloud-first for fast innovation
 
More impact with flexible teams
More impact with flexible teamsMore impact with flexible teams
More impact with flexible teams
 
Culture of Innovation
Culture of InnovationCulture of Innovation
Culture of Innovation
 
Dual-Track Agile
Dual-Track AgileDual-Track Agile
Dual-Track Agile
 
Developer Experience
Developer ExperienceDeveloper Experience
Developer Experience
 
When we design together
When we design togetherWhen we design together
When we design together
 
Hardware is hard(er)
Hardware is hard(er)Hardware is hard(er)
Hardware is hard(er)
 
Customer-centric innovation enabled by cloud
 Customer-centric innovation enabled by cloud Customer-centric innovation enabled by cloud
Customer-centric innovation enabled by cloud
 
Amazon's Culture of Innovation
Amazon's Culture of InnovationAmazon's Culture of Innovation
Amazon's Culture of Innovation
 
When in doubt, go live
When in doubt, go liveWhen in doubt, go live
When in doubt, go live
 
Don't cross the Rubicon
Don't cross the RubiconDon't cross the Rubicon
Don't cross the Rubicon
 
Error handling
Error handlingError handling
Error handling
 
Your test coverage is a lie!
Your test coverage is a lie!Your test coverage is a lie!
Your test coverage is a lie!
 
Docker container security
Docker container securityDocker container security
Docker container security
 
Redefining the unit
Redefining the unitRedefining the unit
Redefining the unit
 
Technology Radar Webinar UK - Vol. 22
Technology Radar Webinar UK - Vol. 22Technology Radar Webinar UK - Vol. 22
Technology Radar Webinar UK - Vol. 22
 
A Tribute to Turing
A Tribute to TuringA Tribute to Turing
A Tribute to Turing
 
Rsa maths worked out
Rsa maths worked outRsa maths worked out
Rsa maths worked out
 

Dernier

MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesMuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesManik S Magar
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch TuesdayIvanti
 
Data governance with Unity Catalog Presentation
Data governance with Unity Catalog PresentationData governance with Unity Catalog Presentation
Data governance with Unity Catalog PresentationKnoldus Inc.
 
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxfnnc6jmgwh
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...itnewsafrica
 
Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024TopCSSGallery
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI AgeCprime
 
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...Wes McKinney
 
QCon London: Mastering long-running processes in modern architectures
QCon London: Mastering long-running processes in modern architecturesQCon London: Mastering long-running processes in modern architectures
QCon London: Mastering long-running processes in modern architecturesBernd Ruecker
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfLoriGlavin3
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxLoriGlavin3
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesThousandEyes
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity PlanDatabarracks
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxLoriGlavin3
 
A Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software DevelopersA Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software DevelopersNicole Novielli
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.Curtis Poe
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfpanagenda
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)Kaya Weers
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPathCommunity
 

Dernier (20)

MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotesMuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
MuleSoft Online Meetup Group - B2B Crash Course: Release SparkNotes
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch Tuesday
 
Data governance with Unity Catalog Presentation
Data governance with Unity Catalog PresentationData governance with Unity Catalog Presentation
Data governance with Unity Catalog Presentation
 
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
 
Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024Top 10 Hubspot Development Companies in 2024
Top 10 Hubspot Development Companies in 2024
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI Age
 
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
 
QCon London: Mastering long-running processes in modern architectures
QCon London: Mastering long-running processes in modern architecturesQCon London: Mastering long-running processes in modern architectures
QCon London: Mastering long-running processes in modern architectures
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdf
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity Plan
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
 
A Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software DevelopersA Journey Into the Emotions of Software Developers
A Journey Into the Emotions of Software Developers
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)
 
UiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to HeroUiPath Community: Communication Mining from Zero to Hero
UiPath Community: Communication Mining from Zero to Hero
 

A hitchhiker’s guide to neuroevolution in Erlang

  • 1. Jeroen Soeters — NEUROEVOLUTION: A hitchhiker’s guide to neuroevolution in Erlang.
  • 3. WHAT IS MACHINE LEARNING? • Artificial Neural Networks • Genetic Algorithms 3
  • 6. A MODEL FOR A NEURON [diagram: inputs x1…xn, each with a weight w1…wn, feed the neuron, which emits a single output Y; the parts are labeled Dendrites, Synapses, Axon, Soma]
  • 7. A MODEL FOR A NEURON [the same diagram relabeled in computational terms: input signals, weights, neuron, output signal]
  • 8. HOW DOES THE NEURON DETERMINE ITS OUTPUT? Y = sign( ∑ᵢ₌₁ⁿ xᵢwᵢ − ⍬ )
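Slide 8's step rule can be sketched in a few lines (Python here for brevity; the talk's own implementation language is Erlang, and `neuron_output` is a name chosen for this sketch, not from the talk):

```python
def neuron_output(inputs, weights, theta):
    """Step ("sign") activation: fire 1 when the weighted input sum
    reaches the threshold theta, otherwise 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum - theta >= 0 else 0
```

With the worked-example weights w = (0.3, -0.1) and ⍬ = 0.2, the input (1, 0) fires while (0, 1) does not.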
  • 11. PERCEPTRON LEARNING RULE ℮(p) = Yd(p) − Y(p) ; wi(p + 1) = wi(p) + α • xi(p) • ℮(p)
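The learning rule as a single update step (again a Python sketch with names of my choosing). Note that the update multiplies the error by the input xi, which is why a weight never changes when its input is 0:

```python
def train_step(inputs, weights, theta, desired, alpha=0.1):
    """One application of the perceptron learning rule."""
    # actual output Y(p) with the current weights
    y = 1 if sum(x * w for x, w in zip(inputs, weights)) - theta >= 0 else 0
    error = desired - y                                   # e(p) = Yd(p) - Y(p)
    # w_i(p+1) = w_i(p) + alpha * x_i(p) * e(p)
    new_weights = [w + alpha * x * error for x, w in zip(inputs, weights)]
    return new_weights, error
```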
  • 12. PERCEPTRON TRAINING ALGORITHM [flowchart] — start: set the weights and threshold to random values in [-0.5, 0.5]; activate the perceptron; train the weights; if the weights have converged, stop, otherwise repeat.
  • 13. LOGIC GATES
  x1 x2 | x1 AND x2 | x1 OR x2 | x1 XOR x2
   0  0 |     0     |    0     |     0
   0  1 |     0     |    1     |     1
   1  0 |     0     |    1     |     1
   1  1 |     1     |    1     |     0
  • 14. TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION — epoch 1 (threshold ⍬ = 0.2; learning rate α = 0.1)
  inputs x1 x2 | desired Yd | initial w1 w2 | actual Y | error ℮ | final w1 w2
   0 0 | 0 | 0.3 -0.1 | 0 |  0 | 0.3 -0.1
   0 1 | 0 | 0.3 -0.1 | 0 |  0 | 0.3 -0.1
   1 0 | 0 | 0.3 -0.1 | 1 | -1 | 0.2 -0.1
   1 1 | 1 | 0.2 -0.1 | 0 |  1 | 0.3  0.0
  • 42. TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION — epoch 2 (threshold ⍬ = 0.2; learning rate α = 0.1)
  inputs x1 x2 | desired Yd | initial w1 w2 | actual Y | error ℮ | final w1 w2
   0 0 | 0 | 0.3 0.0 | 0 |  0 | 0.3 0.0
   0 1 | 0 | 0.3 0.0 | 0 |  0 | 0.3 0.0
   1 0 | 0 | 0.3 0.0 | 1 | -1 | 0.2 0.0
   1 1 | 1 | 0.2 0.0 | 1 |  0 | 0.2 0.0
  • 46. TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION — epoch 3 (threshold ⍬ = 0.2; learning rate α = 0.1)
  inputs x1 x2 | desired Yd | initial w1 w2 | actual Y | error ℮ | final w1 w2
   0 0 | 0 | 0.2 0.0 | 0 |  0 | 0.2 0.0
   0 1 | 0 | 0.2 0.0 | 0 |  0 | 0.2 0.0
   1 0 | 0 | 0.2 0.0 | 1 | -1 | 0.1 0.0
   1 1 | 1 | 0.1 0.0 | 0 |  1 | 0.2 0.1
  • 50. TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION — epoch 4 (threshold ⍬ = 0.2; learning rate α = 0.1)
  inputs x1 x2 | desired Yd | initial w1 w2 | actual Y | error ℮ | final w1 w2
   0 0 | 0 | 0.2 0.1 | 0 |  0 | 0.2 0.1
   0 1 | 0 | 0.2 0.1 | 0 |  0 | 0.2 0.1
   1 0 | 0 | 0.2 0.1 | 1 | -1 | 0.1 0.1
   1 1 | 1 | 0.1 0.1 | 1 |  0 | 0.1 0.1
  • 54. TRAINING A PERCEPTRON TO PERFORM THE AND OPERATION — epoch 5 (threshold ⍬ = 0.2; learning rate α = 0.1)
  inputs x1 x2 | desired Yd | initial w1 w2 | actual Y | error ℮ | final w1 w2
   0 0 | 0 | 0.1 0.1 | 0 | 0 | 0.1 0.1
   0 1 | 0 | 0.1 0.1 | 0 | 0 | 0.1 0.1
   1 0 | 0 | 0.1 0.1 | 0 | 0 | 0.1 0.1
   1 1 | 1 | 0.1 0.1 | 1 | 0 | 0.1 0.1
  No weight changes in epoch 5, so the weights have converged and training stops.
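The five hand-worked epochs can be reproduced end to end. A minimal sketch (Python for brevity rather than the talk's Erlang; weights are rounded after each update so the decimal arithmetic matches the tables, and all names are mine):

```python
# AND truth table as (inputs, desired output) pairs
AND_SET = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def activate(inputs, weights, theta):
    """Step activation: 1 if the weighted sum reaches the threshold."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) - theta >= 0 else 0

def train(examples, weights, theta=0.2, alpha=0.1, max_epochs=100):
    """Run the perceptron training algorithm until an epoch makes no updates."""
    for epoch in range(1, max_epochs + 1):
        changed = False
        for inputs, desired in examples:
            error = desired - activate(inputs, weights, theta)
            if error != 0:
                weights = [round(w + alpha * x * error, 10)
                           for x, w in zip(inputs, weights)]
                changed = True
        if not changed:          # a full pass with no updates: converged
            return weights, epoch
    return weights, max_epochs

weights, epochs = train(AND_SET, [0.3, -0.1])
```

Starting from w = (0.3, -0.1) this converges in 5 epochs to w = (0.1, 0.1), exactly as in the tables.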
  • 58. A LITTLE GEOMETRY… [three plots of the (x1, x2) unit square: AND and OR can each be separated by a single straight line, but XOR cannot — it is not linearly separable]
  • 59. WE NEED MORE LAYERS [network diagram: inputs x1, x2 feed an input layer (neurons 1, 2), which feeds a hidden layer (neurons 3, 4), which feeds an output layer (neuron 5) producing Y]
  • 60. PROBLEMS WITH BACKPROPAGATION • a training set of sufficient size is required • the topology of the network needs to be known in advance • no recurrent connections are allowed • the activation function must be differentiable — in short, it does not emulate the biological world.
  • 64. CROSSOVER 24 [diagram: single-point crossover of two parent bit strings; the tails after the cut point are swapped to produce two offspring]
  • 68. MUTATION 25 [diagram: a randomly selected gene in the offspring's chromosome is flipped]
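The two operators in the diagrams can be sketched as follows (Python for illustration; the function names and the example parents are my own):

```python
import random

def crossover(parent1, parent2, point):
    # single-point crossover: swap the tails after the cut point
    return (parent1[:point] + parent2[point:],
            parent2[:point] + parent1[point:])

def mutate(chromosome, pm, rng):
    # flip each gene independently with mutation probability pm
    return [1 - g if rng.random() < pm else g for g in chromosome]

c1, c2 = crossover([1, 1, 0, 0, 0, 1, 1, 0], [0, 1, 1, 1, 0, 1, 0, 1], 4)
print(c1, c2)  # → [1, 1, 0, 0, 0, 1, 0, 1] [0, 1, 1, 1, 0, 1, 1, 0]
m = mutate(c1, 0.05, random.Random(42))
```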
  • 70. EVOLUTIONARY ALGORITHM 26
        • represent the candidate solution as a chromosome
        • choose the initial population size N, crossover probability (Pc) and mutation probability (Pm)
        • define a fitness function to measure the performance of the chromosome
        • define the genetic operators for the chromosome
  • 71. EVOLUTIONARY ALGORITHM 27 [flowchart: start → generate a population → calculate fitness → if the termination criteria are satisfied, stop; otherwise select a pair for mating, apply crossover and mutation, and add the offspring to the new population until it reaches size N, then replace the population and recalculate fitness]
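The flowchart can be sketched as one generational loop (Python for illustration). The OneMax fitness used here, counting the ones in a bit string, is a stand-in problem I chose for the demo, and size-2 tournaments are one common way to select a pair for mating:

```python
import random

def evolve(fitness, length, n=20, pc=0.7, pm=0.02, generations=100, seed=0):
    rng = random.Random(seed)
    # generate a population of random bit strings
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(n)]
    for _ in range(generations):
        # termination criterion: an all-ones (optimal OneMax) string exists
        if max(map(fitness, pop)) == length:
            break
        new_pop = []
        while len(new_pop) < n:
            # select a pair for mating (size-2 tournaments)
            p1 = max(rng.sample(pop, 2), key=fitness)
            p2 = max(rng.sample(pop, 2), key=fitness)
            if rng.random() < pc:                 # crossover with probability Pc
                cut = rng.randrange(1, length)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):                # mutate each gene with probability Pm
                new_pop.append([1 - g if rng.random() < pm else g for g in child])
        pop = new_pop[:n]                         # replace the population
    return max(pop, key=fitness)

best = evolve(fitness=sum, length=20)
print(sum(best))
```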
  • 73. EVOLUTIONARY ALGORITHM 29
        • represent the candidate solution as a chromosome
        • define a fitness function to measure the performance of the chromosome
        • define the genetic operators for the chromosome
        • choose the initial population size N, crossover probability Pc and mutation probability Pm
  • 76. FITNESS FUNCTION Fitness = 1 / total distance 32
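For the travelling salesman example this fitness can be computed directly from the tour length; the city coordinates below are made up for illustration:

```python
import math

def tour_distance(tour, coords):
    # total length of the closed tour: each city to the next, last back to first
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def fitness(tour, coords):
    # Fitness = 1 / total distance, so shorter tours score higher
    return 1.0 / tour_distance(tour, coords)

coords = {"A": (0, 0), "B": (0, 1), "C": (1, 1), "D": (1, 0)}
print(fitness(["A", "B", "C", "D"], coords))  # unit-square tour, distance 4 → 0.25
```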
  • 78. CROSSOVER 34 [diagram: order-preserving crossover for tours over cities A–H; a segment between two cut points is copied from one parent, and the remaining cities are filled in the order they appear in the other parent, so the offspring is always a valid tour]
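The build copies a slice from one parent and fills the rest from the other, so every child is a valid tour. The slides don't name the operator in the extracted text; order crossover (OX) is one standard variant that behaves this way, and the cut points and parents below are illustrative:

```python
def order_crossover(p1, p2, i, j):
    # copy the segment p1[i:j] into the child unchanged, then fill the
    # remaining positions with the missing cities in the order they
    # appear in p2, so no city is duplicated or dropped
    segment = p1[i:j]
    rest = [city for city in p2 if city not in segment]
    return rest[:i] + segment + rest[i:]

p1 = ["G", "E", "B", "C", "H", "F", "A", "D"]
p2 = ["A", "F", "E", "B", "H", "D", "C", "G"]
child = order_crossover(p1, p2, 3, 6)
print(child)  # → ['A', 'E', 'B', 'C', 'H', 'F', 'D', 'G']
```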
  • 97. AND LOTS MORE… 43
        • data compression
        • training NPCs in a video game
        • cyber warfare
  • 101. FITNESS FUNCTION Fitness = performance of network on an actual problem 47
  • 103. CROSSOVER Crossover doesn’t work for large neural nets! 49
  • 114. RANDOM IMPACT MUTATION Number of mutations = random(1, network size) 56
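A sketch of this mutation operator (Python for illustration; treating the network as a flat weight list and the ±0.3 perturbation range are my assumptions, the slide only fixes the mutation count):

```python
import random

def random_impact_mutate(weights, rng, perturb=0.3):
    # number of mutations = random(1, network size), as on the slide;
    # each mutation perturbs one randomly chosen weight
    mutated = list(weights)
    for _ in range(rng.randint(1, len(mutated))):
        i = rng.randrange(len(mutated))
        mutated[i] += rng.uniform(-perturb, perturb)
    return mutated

rng = random.Random(7)
original = [0.5, -0.2, 0.1, 0.9]
mutant = random_impact_mutate(original, rng)
print(mutant)
```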
  • 115. MEMETIC ALGORITHM 57 [flowchart: start → generate a population → apply it to a problem → local search: Hill Climber → calculate effective fitness → select fit organisms → create offspring → repeat]
  • 116. STOCHASTIC HILL CLIMBER (LOCAL SEARCH) 58 [flowchart: start → apply the NN to a problem → backup and perturb the weights → apply again → if the new fitness > old fitness keep the perturbed weights, otherwise restore the backed-up weights → repeat until the stopping condition is reached → stop]
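The local-search loop can be sketched as follows (Python for illustration). The Gaussian perturbation size and the toy fitness function, which peaks when every weight reaches 1.0, are my assumptions:

```python
import random

def hill_climb(weights, evaluate, rng, steps=200, sigma=0.1):
    # stochastic hill climber: perturb a backup of the weights and keep
    # the perturbed copy only when it scores better
    best = list(weights)
    best_fitness = evaluate(best)
    for _ in range(steps):
        candidate = [w + rng.gauss(0, sigma) for w in best]  # perturb a backup
        f = evaluate(candidate)
        if f > best_fitness:        # new fitness > old fitness: keep it
            best, best_fitness = candidate, f
        # otherwise `best` still holds the backed-up weights (restore)
    return best, best_fitness

evaluate = lambda ws: -sum((w - 1.0) ** 2 for w in ws)  # toy fitness, optimum 0.0
best, f = hill_climb([0.0, 0.0, 0.0], evaluate, random.Random(1))
print(f)
```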
  • 117. A LANGUAGE FOR NEUROEVOLUTION 59
        • The system must be able to handle very large numbers of concurrent activities
        • Actions must be performed at a certain point in time or within a certain time
        • Systems may be distributed over several computers
        • The system is used to control hardware
        • The software systems are very large
  • 119. A LANGUAGE FOR NEUROEVOLUTION 59
        • The system exhibits complex functionality such as feature interaction
        • The systems should be in continuous operation for many years
        • Software maintenance (reconfiguration, etc.) should be performed without stopping the system
        • There are stringent quality and reliability requirements
        • Fault tolerance
        Bjarne Däcker. Erlang - A New Programming Language. Ericsson Review, no. 2, 1993.
  • 121. 1:1 MAPPING 61 [diagram: each neuron in the genotype maps one-to-one onto an Erlang process]
  • 141. THE POPULATION MONITOR 64 [diagram sequence: the population monitor, backed by the database, sends start to each agent process and its private scape; the agents run and report terminated back to the monitor; unfit agents are dropped, offspring are created from the fit ones, and the monitor starts the new population]
  • 149. THE DEVIL IS IN THE DETAILS 65
        • recurrent connections
        • newer generations get a higher chance of mutation
        • neural plasticity
        • public scapes and steady-state evolution
  • 153. BENCHMARK RESULTS 68
        Method     | Single-Pole / Incomplete State | Double-Pole / Partial Info, No Damping | Double-Pole w/ Damping
        RWG        |          8557 |  415209 | 1232296
        SANE       |          1212 |  262700 |  451612
        CNE*       |           724 |  76906* |  87623*
        ESP        |           589 |    7374 |   26342
        NEAT       |             - |       - |    6929
        CMA-ES*    |             - |   3521* |   6061*
        CoSyNE*    |          127* |   1249* |   3416*
        DXNN       | not performed |    2359 |    2313
        Our System |           647 |    5184 |    4792
  • 155. Feel free to reach out at: e. jsoeters@thoughtworks.com t. @JeroenSoeters THANK YOU
  • 156. DATA SCIENCE AND ENGINEERING 71