RecurrentLayers.jl extends Flux.jl's recurrent layer offering by providing implementations of additional recurrent layers not available in base deep learning libraries.
The package currently provides 20+ cells, together with multiple higher-level implementations:
- Modifications of vanilla RNNs: Independently recurrent neural networks, Structurally constrained recurrent neural networks, FastRNN, and Typed RNNs.
- Variations over gated architectures: Minimal gated unit, Light gated recurrent networks, Recurrent additive networks, Light recurrent networks, Neural architecture search networks, Evolving recurrent neural networks, Peephole long short term memory, FastGRNN, Just another network, Chaos free network, Typed gated recurrent unit, and Typed long short term memory.
- Discretized ordinary differential equation formulations of RNNs: Long expressive memory networks, Coupled oscillatory recurrent neural unit, Antisymmetric recurrent neural network with its gated version, and Undamped independent controlled oscillatory recurrent neural network.
- Additional more complex architectures: Recurrent highway networks, and FastSlow RNNs.
- Additional wrappers: Stacked RNNs.
You can install RecurrentLayers using either of:

```julia
using Pkg
Pkg.add("RecurrentLayers")
```

or:

```julia-repl
julia> ]
pkg> add RecurrentLayers
```
The workflow is identical to any Flux recurrent layer: just plug one of the provided layers into your model and test it out!
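As a minimal sketch of that workflow, assuming the package exports an `MGU` (Minimal Gated Unit) layer that follows the same `input => hidden` constructor and `(features, timesteps, batch)` input convention as Flux's built-in `RNN`:

```julia
using Flux, RecurrentLayers

# Assumed example: MGU is used here in place of Flux.RNN; any other
# layer from the package should drop in the same way.
model = Chain(
    MGU(2 => 8),            # 2 input features, 8 hidden units
    x -> x[:, end, :],      # keep the last hidden state of each sequence
    Dense(8 => 1)           # map to a single output
)

x = rand(Float32, 2, 10, 4) # (features, timesteps, batch)
y = model(x)                # output of size (1, 4)
```

Training then proceeds exactly as with any other Flux model, for example with `Flux.train!` or an explicit gradient loop.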
This project is licensed under the MIT License, except for `nas_cell.jl`, which is licensed under the Apache License, Version 2.0.

- `nas_cell.jl` is a reimplementation of the NASCell from TensorFlow and is licensed under the Apache License 2.0. See the file header and `LICENSE-APACHE` for details.
- All other files are licensed under the MIT License. See `LICENSE-MIT` for details.
- LuxRecurrentLayers.jl: Equivalent library, providing recurrent layers for Lux.jl.
- ReservoirComputing.jl: Reservoir computing utilities for scientific machine learning; essentially neural networks trained without gradients.