Commit f2614ad

Merge pull request #15 from jwblangley/documentation: Documentation
2 parents 143f525 + 812eba8

72 files changed: +12,701 −18 lines

Diff for: README.md

+130-9
@@ -1,12 +1,21 @@
# neat-ml

An implementation of the NEAT (NeuroEvolution of Augmenting Topologies) algorithm in Java. Based on the original paper at http://nn.cs.utexas.edu/downloads/papers/stanley.ec02.pdf

## Features
* *Fully* documented code base, ideal for learning and for people new to the subject
* Super-friendly, highly abstracted interfaces that hide the implementation details so you can focus on your use case
* Friendly implementations of both genotypes and phenotypes
  * All you need to worry about is inputs, outputs and fitness!
* Multi-threading support for concurrent genotype evaluation just by specifying the number of worker threads
* Neural network visualiser to create images so you can see what networks are being created
* Full serialising and deserialising support using [Google's protobuf](https://developers.google.com/protocol-buffers), allowing you to save interesting genotypes (networks) or even save your training progress to disk!

## Quick start
This library has been built with the intention of making it as easy as possible to add NEAT to any project - without the need for a deep understanding of how NEAT works.

Take the following quick example that shows how to use this library to learn the XOR function:

**TLDR**
* a little bit of config
* create an evolution (all the hard work has been done for you!)
* write the evaluator for a genotype
@@ -34,17 +43,17 @@ public class LearnXor {
    final int populationSize = 100;
    final int targetNumSpecies = 5;
    final int numProcessingThreads = 8;

    // Create a new optimisation with 2 inputs and 1 output
    Evolution evolution = EvolutionFactory.createOptimisation(2, 1, populationSize,
        targetNumSpecies, numProcessingThreads, networkGenotype -> {

          // Build a neural network from the genotype. This is all done for you with this method!
          // There is an equivalent method for building linear output networks
          Network network = Network.createSigmoidOutputNetworkFromGenotype(networkGenotype);

          // Now that a network has been created, evaluate it however you see fit!
          // To prevent over-fitting to a particular problem, we want to actually evaluate it
          // several times and aggregate a score (for XOR there are only 4 possible combinations,
          // but this is a good habit to get into for the general case).
          // N.B: If you are trying to solve a non-generalised problem, this is not needed
@@ -58,15 +67,15 @@ public class LearnXor {
            // We get the 0th index item as we want the first (and in this case only) output
            final double output = network.calculateOutputs(a, b).get(0);

            // Using "> 0.5" here is a good way to turn a sigmoid output (0-1) into a binary output!
            final boolean expected = (a > 0.5) ^ (b > 0.5);
            final boolean actual = output > 0.5;

            if (expected == actual) {
              numCorrect++;
            }
          }

          // Return a score that tells the program how well the network did! Higher is better!
          return numCorrect * 100d / testsInEvaluate;
        });
@@ -83,7 +92,7 @@ public class LearnXor {
    // After the evolution is done (you will need to experiment to know how much training you need)
    // Get the best genotype from the population
    NetworkGenotype bestInPop = evolution.getFittestGenotype();

    // Create the network from this genotype as we did before
    Network bestNetwork = Network.createSigmoidOutputNetworkFromGenotype(bestInPop);
@@ -105,5 +114,117 @@ public class LearnXor {
    // true XOR true = false
  }
}
```
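The `> 0.5` thresholding trick used in the evaluator above can be checked in isolation. A minimal, library-free sketch (plain JDK, no neat-ml required; the class and method names are illustrative) over the four XOR input combinations:

```java
public class XorThreshold {

  // "> 0.5" turns a continuous sigmoid-style output (0-1) into a binary one
  static boolean threshold(double x) {
    return x > 0.5;
  }

  // The "expected" computation from the evaluator: threshold both inputs, then XOR
  static boolean xor(double a, double b) {
    return threshold(a) ^ threshold(b);
  }

  public static void main(String[] args) {
    // The four XOR input combinations, as the doubles a network would receive
    double[][] inputs = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    for (double[] in : inputs) {
      System.out.println((int) in[0] + " XOR " + (int) in[1] + " = " + xor(in[0], in[1]));
    }
  }
}
```

The same thresholding is applied to the network's sigmoid output before comparing `expected` with `actual`.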
### Visualiser

Visualising a neural network is a great way to understand a bit about what is going on, and it also looks great!
The visualiser in this library has the following features:
* Different colours for positive and negative weights
  * Orange for positive
  * Purple for negative
* Different thicknesses based on connection weight
  * Thicker connections indicate greater (absolute) weights
* Tapers to show direction
  * Each connection is tapered from thicker to thinner to indicate connection direction

Visualising a network is as simple as:
```java
// network is a genotype - for example from evolution.getFittestGenotype()
BufferedImage image = Visualiser.visualiseNetwork(network);
```

You are then free to handle the standard Java `BufferedImage` as you see fit. However, if you want to write the image to your disk you can utilise this library's function:
```java
Visualiser.saveImageToFile(image, new File("network.png"), true);
```

This can achieve results like these:
![xor-small](xor-small.png)
![xor-large](xor-large.png)
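If you prefer to handle the `BufferedImage` yourself rather than use the library helper, writing it to disk takes only the JDK. A minimal sketch with `javax.imageio.ImageIO` (the drawn content and filename are illustrative stand-ins, not what the visualiser produces):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class SaveImage {

  public static void main(String[] args) throws IOException {
    // Stand-in for the image returned by Visualiser.visualiseNetwork(...)
    BufferedImage image = new BufferedImage(100, 100, BufferedImage.TYPE_INT_ARGB);
    Graphics2D g = image.createGraphics();
    g.setColor(Color.ORANGE); // the visualiser's colour for positive weights
    g.drawLine(10, 50, 90, 50);
    g.dispose();

    // Write the image out as a PNG using only the JDK
    ImageIO.write(image, "png", new File("network.png"));
  }
}
```

`ImageIO.write` picks the encoder from the format name, so swapping `"png"` for `"jpg"` changes the output format without any other code changes.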
### Serialise
#### Networks
You can save a particular network genotype for later use. This could be so that you can keep track of the best/most interesting individuals and have them evaluate without the need for training. This is a great utility that has many uses: for example, embedding in games as the AI!

To write a network genotype to a file:
```java
ProtoIO.toFile(networkGenotype, new File("network.geno"));
```

To read a network genotype from a file:
```java
NetworkGenotype fromProto = ProtoIO.networkFromFile(new File("network.geno"));
```

Note that these networks are genotypes, not phenotypes. You will need to create them *in the same way as you did when training* before you can use them:
```java
// Or linear output!
Network network = Network.createSigmoidOutputNetworkFromGenotype(fromProto);

network.calculateOutputs(/* inputs here */);
```
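Under the hood, saving and loading a genotype is a bytes-to-file round trip. A minimal stand-in using plain JDK file I/O (this is not the library's protobuf encoding; the byte values and class name are purely illustrative) shows the shape of the pattern:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class GenotypeRoundTrip {

  public static void main(String[] args) throws IOException {
    // Stand-in bytes: ProtoIO would produce the genotype's protobuf encoding here
    byte[] genotypeBytes = {1, 2, 3, 4};

    // Save: the same shape as ProtoIO.toFile(networkGenotype, file)
    Path file = Path.of("network.geno");
    Files.write(file, genotypeBytes);

    // Load: the same shape as ProtoIO.networkFromFile(file)
    byte[] loaded = Files.readAllBytes(file);
    System.out.println(Arrays.equals(genotypeBytes, loaded)); // prints "true"
  }
}
```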
#### Evolution
With this library comes the very powerful ability to stop training (evolving), save your progress and resume training again at a later date without losing progress.

```java
Evaluator evaluator = networkGenotype -> {
  /* evaluate here */
  return /* fitness */;
};

int populationSize = 100;
int targetNumSpecies = 5;
int numThreads = 8;

Evolution evolution = EvolutionFactory.createOptimisation(
    /* numInputs */,
    /* numOutputs */,
    populationSize,
    targetNumSpecies,
    numThreads,
    evaluator);

for (int i = 0; i < 100; i++) {
  evolution.evolve();
}

/*
.
.
.
Get interrupted here
.
.
.
*/

// Write to disk
ProtoIO.toFile(evolution, new File("evolution.evo"));

/*
.
.
.
Want to resume here
.
.
.
*/

Evolution loadedEvolution = ProtoIO.evolutionFromFile(
    new File("evolution.evo"),
    targetNumSpecies,
    numThreads,
    evaluator
);

// Continue evolution
for (int i = 0; i < 100; i++) {
  loadedEvolution.evolve();
}
```

This does actually give you the ability to define a new `Evaluator` (and the other params passed into `evolutionFromFile`) midway through training. This is normally not recommended, but can be very useful for some incremental learning techniques.

Diff for: docs/allclasses-frame.html

+36
@@ -0,0 +1,36 @@
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<!-- NewPage -->
<html lang="en">
<head>
<!-- Generated by javadoc (1.8.0_241) on Sat Jun 13 17:53:20 BST 2020 -->
<title>All Classes</title>
<meta name="date" content="2020-06-13">
<link rel="stylesheet" type="text/css" href="stylesheet.css" title="Style">
<script type="text/javascript" src="script.js"></script>
</head>
<body>
<h1 class="bar">All&nbsp;Classes</h1>
<div class="indexContainer">
<ul>
<li><a href="jwblangley/neat/phenotype/Activation.html" title="enum in jwblangley.neat.phenotype" target="classFrame">Activation</a></li>
<li><a href="jwblangley/neat/genotype/ConnectionGenotype.html" title="class in jwblangley.neat.genotype" target="classFrame">ConnectionGenotype</a></li>
<li><a href="jwblangley/neat/util/DisjointExcess.html" title="class in jwblangley.neat.util" target="classFrame">DisjointExcess</a></li>
<li><a href="jwblangley/neat/evolution/Evaluator.html" title="interface in jwblangley.neat.evolution" target="classFrame"><span class="interfaceName">Evaluator</span></a></li>
<li><a href="jwblangley/neat/evolution/Evolution.html" title="class in jwblangley.neat.evolution" target="classFrame">Evolution</a></li>
<li><a href="jwblangley/neat/evolution/EvolutionFactory.html" title="class in jwblangley.neat.evolution" target="classFrame">EvolutionFactory</a></li>
<li><a href="jwblangley/neat/util/ImmutableHomogeneousPair.html" title="class in jwblangley.neat.util" target="classFrame">ImmutableHomogeneousPair</a></li>
<li><a href="jwblangley/neat/evolution/InnovationGenerator.html" title="class in jwblangley.neat.evolution" target="classFrame">InnovationGenerator</a></li>
<li><a href="jwblangley/neat/phenotype/InputNeuron.html" title="class in jwblangley.neat.phenotype" target="classFrame">InputNeuron</a></li>
<li><a href="jwblangley/neat/phenotype/Network.html" title="class in jwblangley.neat.phenotype" target="classFrame">Network</a></li>
<li><a href="jwblangley/neat/genotype/NetworkGenotype.html" title="class in jwblangley.neat.genotype" target="classFrame">NetworkGenotype</a></li>
<li><a href="jwblangley/neat/phenotype/Neuron.html" title="class in jwblangley.neat.phenotype" target="classFrame">Neuron</a></li>
<li><a href="jwblangley/neat/genotype/NeuronGenotype.html" title="class in jwblangley.neat.genotype" target="classFrame">NeuronGenotype</a></li>
<li><a href="jwblangley/neat/genotype/NeuronLayer.html" title="enum in jwblangley.neat.genotype" target="classFrame">NeuronLayer</a></li>
<li><a href="jwblangley/neat/proto/ProtoEquivalent.html" title="interface in jwblangley.neat.proto" target="classFrame"><span class="interfaceName">ProtoEquivalent</span></a></li>
<li><a href="jwblangley/neat/proto/ProtoIO.html" title="class in jwblangley.neat.proto" target="classFrame">ProtoIO</a></li>
<li><a href="jwblangley/neat/evolution/Species.html" title="class in jwblangley.neat.evolution" target="classFrame">Species</a></li>
<li><a href="jwblangley/neat/visualiser/Visualiser.html" title="class in jwblangley.neat.visualiser" target="classFrame">Visualiser</a></li>
</ul>
</div>
</body>
</html>

Diff for: docs/allclasses-noframe.html

+36
@@ -0,0 +1,36 @@
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<!-- NewPage -->
<html lang="en">
<head>
<!-- Generated by javadoc (1.8.0_241) on Sat Jun 13 17:53:20 BST 2020 -->
<title>All Classes</title>
<meta name="date" content="2020-06-13">
<link rel="stylesheet" type="text/css" href="stylesheet.css" title="Style">
<script type="text/javascript" src="script.js"></script>
</head>
<body>
<h1 class="bar">All&nbsp;Classes</h1>
<div class="indexContainer">
<ul>
<li><a href="jwblangley/neat/phenotype/Activation.html" title="enum in jwblangley.neat.phenotype">Activation</a></li>
<li><a href="jwblangley/neat/genotype/ConnectionGenotype.html" title="class in jwblangley.neat.genotype">ConnectionGenotype</a></li>
<li><a href="jwblangley/neat/util/DisjointExcess.html" title="class in jwblangley.neat.util">DisjointExcess</a></li>
<li><a href="jwblangley/neat/evolution/Evaluator.html" title="interface in jwblangley.neat.evolution"><span class="interfaceName">Evaluator</span></a></li>
<li><a href="jwblangley/neat/evolution/Evolution.html" title="class in jwblangley.neat.evolution">Evolution</a></li>
<li><a href="jwblangley/neat/evolution/EvolutionFactory.html" title="class in jwblangley.neat.evolution">EvolutionFactory</a></li>
<li><a href="jwblangley/neat/util/ImmutableHomogeneousPair.html" title="class in jwblangley.neat.util">ImmutableHomogeneousPair</a></li>
<li><a href="jwblangley/neat/evolution/InnovationGenerator.html" title="class in jwblangley.neat.evolution">InnovationGenerator</a></li>
<li><a href="jwblangley/neat/phenotype/InputNeuron.html" title="class in jwblangley.neat.phenotype">InputNeuron</a></li>
<li><a href="jwblangley/neat/phenotype/Network.html" title="class in jwblangley.neat.phenotype">Network</a></li>
<li><a href="jwblangley/neat/genotype/NetworkGenotype.html" title="class in jwblangley.neat.genotype">NetworkGenotype</a></li>
<li><a href="jwblangley/neat/phenotype/Neuron.html" title="class in jwblangley.neat.phenotype">Neuron</a></li>
<li><a href="jwblangley/neat/genotype/NeuronGenotype.html" title="class in jwblangley.neat.genotype">NeuronGenotype</a></li>
<li><a href="jwblangley/neat/genotype/NeuronLayer.html" title="enum in jwblangley.neat.genotype">NeuronLayer</a></li>
<li><a href="jwblangley/neat/proto/ProtoEquivalent.html" title="interface in jwblangley.neat.proto"><span class="interfaceName">ProtoEquivalent</span></a></li>
<li><a href="jwblangley/neat/proto/ProtoIO.html" title="class in jwblangley.neat.proto">ProtoIO</a></li>
<li><a href="jwblangley/neat/evolution/Species.html" title="class in jwblangley.neat.evolution">Species</a></li>
<li><a href="jwblangley/neat/visualiser/Visualiser.html" title="class in jwblangley.neat.visualiser">Visualiser</a></li>
</ul>
</div>
</body>
</html>
