
Commit 8293118 (authored May 19, 2023)

Contributing guide (#109)

* Begin work on a contributing guide
* Small updates and link in README

1 parent d052bce

File tree: 2 files changed, +90 −0 lines
 

CONTRIBUTING.md

Lines changed: 84 additions & 0 deletions
# Contributing guide

This document describes the organization of the neural-fortran codebase to help
guide code contributors.

## Overall code organization

The source code organization follows the usual `fpm` convention:
the library code is in [src/](src/), test programs are in [test/](test/),
and example programs are in [example/](example/).

The top-level module that defines the public, user-facing API is in
[src/nf.f90](src/nf.f90).
All other library source files are in [src/nf/](src/nf/).

Most of the library code defines interfaces in modules and implementations in
submodules.
If you only want to know the interfaces, that is, how to call procedures
and what they return, you can read just the module source files and
not worry about the implementation.
When you want to know more about an implementation, you can find it in the
source file that defines the corresponding submodule.
Each library source file contains either one module or one submodule.
Source files that define a submodule end with `_submodule.f90`.
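
This module/submodule split can be sketched as follows; `nf_example` and
`double_it` are hypothetical names for illustration, not part of the library:

```fortran
! nf_example.f90: the module declares only the interface.
module nf_example
  implicit none
  private
  public :: double_it

  interface
    module function double_it(x) result(res)
      real, intent(in) :: x
      real :: res
    end function double_it
  end interface

end module nf_example

! nf_example_submodule.f90: the submodule holds the implementation.
submodule(nf_example) nf_example_submodule

contains

  module function double_it(x) result(res)
    real, intent(in) :: x
    real :: res
    res = 2 * x
  end function double_it

end submodule nf_example_submodule
```

Reading only the module tells you the full calling convention; the submodule
can be edited and recompiled without forcing recompilation of the module's
users.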

## Components

Neural-fortran defines several components, described here in a roughly top-down order:

* Networks
* Layers
  - Layer constructor functions
  - Concrete layer implementations
* Optimizers
* Activation functions

### Networks

A network is the main component that the user works with,
and the highest-level container in neural-fortran.
A network is a collection of layers.

The network container is defined by the `network` derived type
in the `nf_network` module, in the [nf_network.f90](src/nf/nf_network.f90)
source file.

In a nutshell, the `network` type defines an allocatable array of `type(layer)`
instances, and several type-bound methods for training and inference.
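
In user code, constructing such a container might look like the sketch below;
the `input` and `dense` constructor names and their argument lists are
assumptions for illustration:

```fortran
program construct_network
  use nf, only: dense, input, network
  implicit none
  type(network) :: net

  ! Build a network from an array of layer instances produced
  ! by layer constructor functions (assumed names).
  net = network([ &
    input(784), &
    dense(30), &
    dense(10) &
  ])

  ! Training and inference then go through the type-bound
  ! methods on net, e.g. net % train(...) and net % predict(...).
end program construct_network
```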

### Layers

Layers are the main building blocks of neural-fortran and of neural networks in
general.
A common, high-level layer type maintains the flow of data in and out and calls
the specific layer implementations of the forward and backward pass methods.

When introducing a new layer type, study how the [dense](src/nf/nf_dense_layer.f90)
and [convolutional](src/nf/nf_conv2d_layer.f90) concrete types are defined and
implemented in their respective submodules.
You will also need to follow the same pattern in the
[high-level layer type](src/nf/nf_layer.f90) and its corresponding submodule.
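
A skeleton for a new concrete layer might look like this; the type name,
components, and procedure signatures are illustrative guesses, so check the
dense and conv2d sources for the real pattern:

```fortran
module nf_my_layer
  implicit none
  private
  public :: my_layer

  ! Hypothetical concrete layer: holds its parameters and state,
  ! and exposes forward and backward passes.
  type :: my_layer
    real, allocatable :: weights(:,:)
    real, allocatable :: output(:)
  contains
    procedure :: forward
    procedure :: backward
  end type my_layer

  interface
    module subroutine forward(self, input)
      class(my_layer), intent(in out) :: self
      real, intent(in) :: input(:)
    end subroutine forward
    module subroutine backward(self, input, gradient)
      class(my_layer), intent(in out) :: self
      real, intent(in) :: input(:)
      real, intent(in) :: gradient(:)
    end subroutine backward
  end interface

end module nf_my_layer
```

The implementations of `forward` and `backward` would then live in a
`nf_my_layer_submodule.f90` file, following the convention described above.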

### Optimizers

Optimizers are the algorithms that determine how the model parameters are
updated during training.

Optimizers are currently implemented in the [nf_optimizers.f90](src/nf/nf_optimizers.f90)
source file and the corresponding module.
An optimizer instance is passed to the network at the `network % train()` call
site.
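
Passing an optimizer at the training call site might then look like this
fragment; the `sgd` constructor and the keyword arguments are assumed for
illustration and should be checked against nf_optimizers.f90:

```fortran
! Fragment: supply an optimizer instance to network % train().
! The sgd constructor and argument names are hypothetical.
call net % train( &
  training_inputs, training_outputs, &
  batch_size=100, &
  epochs=10, &
  optimizer=sgd(learning_rate=0.01) &
)
```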

### Activation functions

Activation functions and their derivatives are defined in the
[nf_activation.f90](src/nf/nf_activation.f90) source file and the corresponding
types.
They are implemented using an abstract base activation type and concrete types
for each activation function.
When implementing a new activation function in the library, you need to define
a new concrete type that extends the abstract activation function type.
The concrete type must have `eval` and `eval_prime` methods that evaluate the
function and its derivative, respectively.
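
A new activation could then follow this extension pattern; the abstract type
name `activation_function` and the procedure signatures below are assumptions
that should be checked against nf_activation.f90:

```fortran
module nf_my_activation
  use nf_activation, only: activation_function  ! assumed abstract type name
  implicit none
  private
  public :: my_relu

  ! Hypothetical concrete activation extending the abstract base type.
  type, extends(activation_function) :: my_relu
  contains
    procedure :: eval => eval_my_relu
    procedure :: eval_prime => eval_prime_my_relu
  end type my_relu

contains

  ! Evaluate the function: max(0, x), applied elementwise.
  pure function eval_my_relu(self, x) result(res)
    class(my_relu), intent(in) :: self
    real, intent(in) :: x(:)
    real :: res(size(x))
    res = max(0., x)
  end function eval_my_relu

  ! Evaluate the derivative: 1 where x > 0, else 0.
  pure function eval_prime_my_relu(self, x) result(res)
    class(my_relu), intent(in) :: self
    real, intent(in) :: x(:)
    real :: res(size(x))
    where (x > 0)
      res = 1
    elsewhere
      res = 0
    end where
  end function eval_prime_my_relu

end module nf_my_activation
```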

README.md

Lines changed: 6 additions & 0 deletions

@@ -11,6 +11,7 @@ Read the paper [here](https://arxiv.org/abs/1902.06714).
 - [Building with CMake](https://github.com/modern-fortran/neural-fortran#building-with-cmake)
 * [Examples](https://github.com/modern-fortran/neural-fortran#examples)
 * [API documentation](https://github.com/modern-fortran/neural-fortran#api-documentation)
+* [Contributing](https://github.com/modern-fortran/neural-fortran#contributing)
 * [Acknowledgement](https://github.com/modern-fortran/neural-fortran#acknowledgement)
 * [Related projects](https://github.com/modern-fortran/neural-fortran#related-projects)
@@ -235,6 +236,11 @@ ford ford.md
 from the neural-fortran top-level directory to generate the API documentation in doc/html.
 Point your browser to doc/html/index.html to read it.

+## Contributing
+
+This [Contributing guide](CONTRIBUTING.md) briefly describes the code organization.
+It may be useful to read if you want to contribute a new feature to neural-fortran.
+
 ## Acknowledgement

 Thanks to all open-source contributors to neural-fortran: