@@ -21,8 +21,9 @@ Read the paper [here](https://arxiv.org/abs/1902.06714).
RMSProp, Adagrad, Adam, AdamW
* More than a dozen activation functions and their derivatives
* Loss functions and metrics: Quadratic, Mean Squared Error, Pearson Correlation etc.
- * Loading dense and convolutional models from Keras HDF5 (.h5) files
* Data-based parallelism
+ * Loading dense and convolutional models from Keras HDF5 (.h5) files
+   (see the [nf-keras-hdf5](https://github.com/neural-fortran/nf-keras-hdf5) add-on)

### Available layers
@@ -51,14 +52,8 @@ cd neural-fortran
Required dependencies are:

* A Fortran compiler
- * [HDF5](https://www.hdfgroup.org/downloads/hdf5/)
-   (must be provided by the OS package manager or your own build from source)
- * [functional-fortran](https://github.com/wavebitscientific/functional-fortran),
-   [h5fortran](https://github.com/geospace-code/h5fortran),
-   [json-fortran](https://github.com/jacobwilliams/json-fortran)
-   (all handled by neural-fortran's build systems, no need for a manual install)
* [fpm](https://github.com/fortran-lang/fpm) or
-   [CMake](https://cmake.org) for building the code
+   [CMake](https://cmake.org) to build the code

Optional dependencies are:
@@ -79,23 +74,7 @@ Compilers tested include:
With gfortran, the following will create an optimized build of neural-fortran:

```
- fpm build \
-   --profile release \
-   --flag "-I$HDF5INC -L$HDF5LIB"
- ```
-
- HDF5 is now a required dependency, so you have to provide it to fpm.
- The above command assumes that the `HDF5INC` and `HDF5LIB` environment
- variables are set to the include and library paths, respectively, of your
- HDF5 install.
-
- If you use Conda, the following instructions work:
-
- ```
- conda create -n nf hdf5
- conda activate nf
- fpm build --profile release --flag "-I$CONDA_PREFIX/include -L$CONDA_PREFIX/lib -Wl,-rpath -Wl,$CONDA_PREFIX/lib"
- fpm test --profile release --flag "-I$CONDA_PREFIX/include -L$CONDA_PREFIX/lib -Wl,-rpath -Wl,$CONDA_PREFIX/lib"
+ fpm build --profile release
```

#### Building in parallel mode
@@ -106,25 +85,20 @@ Once installed, use the compiler wrappers `caf` and `cafrun` to build and execut
in parallel, respectively:

```
- fpm build \
-   --compiler caf \
-   --profile release \
-   --flag "-I$HDF5INC -L$HDF5LIB"
+ fpm build --compiler caf --profile release
```

#### Testing with fpm

```
- fpm test \
-   --profile release \
-   --flag "-I$HDF5INC -L$HDF5LIB"
+ fpm test --profile release
```

For the time being, you need to specify the same compiler flags to `fpm test`
as you did in `fpm build` so that fpm knows it should use the same build
profile.

- See [Fortran Package Manager](https://github.com/fortran-lang/fpm) for more info on fpm.
+ See the [Fortran Package Manager](https://github.com/fortran-lang/fpm) for more info on fpm.

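As a minimal sketch of the point above (the `FPM_OPTS` variable and the guard are illustrative conveniences, not part of the project's documented workflow):

```shell
# Reuse one options string so that `fpm build` and `fpm test`
# resolve to the same build profile directory. The guard only
# invokes fpm when it is installed and a manifest is present.
FPM_OPTS="--profile release"
if command -v fpm >/dev/null 2>&1 && [ -f fpm.toml ]; then
  fpm build $FPM_OPTS
  fpm test $FPM_OPTS
fi
```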
### Building with CMake
@@ -156,8 +130,7 @@ cafrun -n 4 bin/mnist # run MNIST example on 4 cores
#### Building with a different compiler

If you want to build with a different compiler, such as Intel Fortran,
- set the `HDF5_ROOT` environment variable to the root path of your
- Intel HDF5 build, and specify `FC` when issuing `cmake`:
+ specify `FC` when issuing `cmake`:

```
FC=ifort cmake ..
@@ -213,6 +186,7 @@ You can configure neural-fortran by setting the appropriate options before
including the subproject.

The following should be added in the CMake file of your directory:
+

```cmake
if(NOT TARGET "neural-fortran::neural-fortran")
  find_package("neural-fortran" REQUIRED)
@@ -230,11 +204,7 @@ examples, in increasing level of complexity:
3. [dense_mnist](example/dense_mnist.f90): Hand-written digit recognition
   (MNIST dataset) using a dense (fully-connected) network
4. [cnn_mnist](example/cnn_mnist.f90): Training a CNN on the MNIST dataset
- 5. [dense_from_keras](example/dense_from_keras.f90): Creating a pre-trained
-    dense model from a Keras HDF5 file and running the inference.
- 6. [cnn_from_keras](example/cnn_from_keras.f90): Creating a pre-trained
-    convolutional model from a Keras HDF5 file and running the inference.
- 7. [get_set_network_params](example/get_set_network_params.f90): Getting and
+ 5. [get_set_network_params](example/get_set_network_params.f90): Getting and
   setting hyperparameters of a network.

The examples also show you the extent of the public API that's meant to be