diff --git a/.github/workflows/main.yml b/.github/workflows/main.yml
index fe1ed24..ad3be85 100644
--- a/.github/workflows/main.yml
+++ b/.github/workflows/main.yml
@@ -1,7 +1,11 @@
name: CI
+
on:
+ push:
+ branches: [main]
pull_request:
branches: [main]
+
jobs:
Awesome_Lint:
runs-on: ubuntu-latest
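For reference, a minimal sketch of how the workflow header reads once this change is applied (the `jobs` section below it is untouched): the Awesome_Lint job now runs on pushes to main as well as on pull requests targeting main.

```yaml
# Sketch of the merged workflow header after this change;
# only the trigger block is shown, the lint job itself is unchanged.
name: CI

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
```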
diff --git a/readme.md b/readme.md
index 394a240..133a331 100644
--- a/readme.md
+++ b/readme.md
@@ -18,20 +18,20 @@ This is a curated list of awesome JAX libraries, projects, and other resources.
## Libraries
- Neural Network Libraries
- - [Flax](https://github.com/google/flax) - a flexible library with the largest user base of all JAX NN libraries.
- - [Haiku](https://github.com/deepmind/dm-haiku) - focused on simplicity, created by the authors of Sonnet at DeepMind.
- - [Objax](https://github.com/google/objax) - has an object oriented design similar to PyTorch.
- - [Elegy](https://poets-ai.github.io/elegy/) - implements the Keras API with some improvements.
- - [RLax](https://github.com/deepmind/rlax) - library for implementing reinforcement learning agent.
- - [Trax](https://github.com/google/trax) - a "batteries included" deep learning library focused on providing solutions for common workloads.
- - [Jraph](https://github.com/deepmind/jraph) - a lightweight graph neural network library.
- - [Neural Tangents](https://github.com/google/neural-tangents) - high-level API for specifying neural networks of both finite and _infinite_ width.
-- [NumPyro](https://github.com/pyro-ppl/numpyro) - probabilistic programming based on the Pyro library.
-- [Chex](https://github.com/deepmind/chex) - utilities to write and test reliable JAX code.
-- [Optax](https://github.com/deepmind/optax) - a gradient processing and optimization library.
-- [JAX, M.D.](https://github.com/google/jax-md) - accelerated, differential molecular dynamics.
-- [Coax](https://github.com/microsoft/coax) - turn RL papers into code, the easy way.
-- [SymJAX](https://github.com/SymJAX/SymJAX) - symbolic CPU/GPU/TPU programming.
+ - [Flax](https://github.com/google/flax) - Centered on flexibility and clarity.
+ - [Haiku](https://github.com/deepmind/dm-haiku) - Focused on simplicity, created by the authors of Sonnet at DeepMind.
+ - [Objax](https://github.com/google/objax) - Has an object-oriented design similar to PyTorch.
+ - [Elegy](https://poets-ai.github.io/elegy/) - Implements the Keras API with some improvements.
+ - [RLax](https://github.com/deepmind/rlax) - Library for implementing reinforcement learning agents.
+ - [Trax](https://github.com/google/trax) - "Batteries included" deep learning library focused on providing solutions for common workloads.
+ - [Jraph](https://github.com/deepmind/jraph) - Lightweight graph neural network library.
+ - [Neural Tangents](https://github.com/google/neural-tangents) - High-level API for specifying neural networks of both finite and _infinite_ width.
+- [NumPyro](https://github.com/pyro-ppl/numpyro) - Probabilistic programming based on the Pyro library.
+- [Chex](https://github.com/deepmind/chex) - Utilities to write and test reliable JAX code.
+- [Optax](https://github.com/deepmind/optax) - Gradient processing and optimization library.
+- [JAX, M.D.](https://github.com/google/jax-md) - Accelerated, differentiable molecular dynamics.
+- [Coax](https://github.com/microsoft/coax) - Turn RL papers into code, the easy way.
+- [SymJAX](https://github.com/SymJAX/SymJAX) - Symbolic CPU/GPU/TPU programming.
- [mcx](https://github.com/rlouf/mcx) - Express & compile probabilistic programs for performant inference.
@@ -42,12 +42,12 @@ This is a curated list of awesome JAX libraries, projects, and other resources.
This section contains libraries that are well-made and useful, but have not necessarily been battle-tested by a large userbase yet.
- Neural Network Libraries
- - [Parallax](https://github.com/srush/parallax) - prototype immutable torch modules for JAX.
- - [FedJAX](https://github.com/google/fedjax) - federated learning in JAX, built on Optax and Haiku.
-- [jax-unirep](https://github.com/ElArkk/jax-unirep) - library implementing the [UniRep model](https://www.nature.com/articles/s41592-019-0598-1) for protein machine learning applications.
+ - [Parallax](https://github.com/srush/parallax) - Prototype immutable torch modules for JAX.
+ - [FedJAX](https://github.com/google/fedjax) - Federated learning in JAX, built on Optax and Haiku.
+- [jax-unirep](https://github.com/ElArkk/jax-unirep) - Library implementing the [UniRep model](https://www.nature.com/articles/s41592-019-0598-1) for protein machine learning applications.
- [jax-flows](https://github.com/ChrisWaites/jax-flows) - Normalizing flows in JAX.
- [sklearn-jax-kernels](https://github.com/ExpectationMax/sklearn-jax-kernels) - `scikit-learn` kernel matrices using JAX.
-- [jax-cosmo](https://github.com/DifferentiableUniverseInitiative/jax_cosmo) - a differentiable cosmology library.
+- [jax-cosmo](https://github.com/DifferentiableUniverseInitiative/jax_cosmo) - Differentiable cosmology library.
- [efax](https://github.com/NeilGirdhar/efax) - Exponential Families in JAX.
- [mpi4jax](https://github.com/PhilipVinc/mpi4jax) - Combine MPI operations with your JAX code on CPUs and GPUs.
@@ -56,8 +56,8 @@ This section contains libraries that are well-made and useful, but have not nece
## Models and Projects
-- [Performer](https://github.com/google-research/google-research/tree/master/performer/fast_attention/jax) - A Flax implementation of the Performer (linear transformer via FAVOR+) architecture.
-- [Reformer](https://github.com/google/trax/tree/master/trax/models/reformer) - An implementation of the Reformer (efficient transformer) architecture.
+- [Performer](https://github.com/google-research/google-research/tree/master/performer/fast_attention/jax) - Flax implementation of the Performer (linear transformer via FAVOR+) architecture.
+- [Reformer](https://github.com/google/trax/tree/master/trax/models/reformer) - Implementation of the Reformer (efficient transformer) architecture.
- [Vision Transformer](https://github.com/google-research/vision_transformer) - Official implementation in Flax of [_An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale_](https://arxiv.org/abs/2010.11929).
- [Fourier Feature Networks](https://github.com/tancik/fourier-feature-networks) - Official implementation of [_Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains_](https://people.eecs.berkeley.edu/~bmild/fourfeat).
- [Flax Models](https://github.com/google-research/google-research/tree/master/flax_models) - Collection of open-sourced Flax models.
@@ -73,7 +73,7 @@ This section contains libraries that are well-made and useful, but have not nece
## Videos
- [NeurIPS 2020: JAX Ecosystem Meetup](https://www.youtube.com/watch?v=iDxJxIyzSiM) - JAX, its use at DeepMind, and discussion between engineers, scientists, and JAX core team.
-- [Introduction to JAX](https://youtu.be/0mVmRHMaOJ4) - A simple neural network from scratch in JAX.
+- [Introduction to JAX](https://youtu.be/0mVmRHMaOJ4) - Simple neural network from scratch in JAX.
- [JAX: Accelerated Machine Learning Research | SciPy 2020 | VanderPlas](https://youtu.be/z-WSrQDXkuM) - JAX's core design, how it's powering new research, and how you can start using it.
- [Bayesian Programming with JAX + NumPyro — Andy Kitchen](https://youtu.be/CecuWGpoztw) - Introduction to Bayesian modelling using NumPyro.
- [JAX: Accelerated machine-learning research via composable function transformations in Python | NeurIPS 2019 | Skye Wanderman-Milne](https://slideslive.com/38923687/jax-accelerated-machinelearning-research-via-composable-function-transformations-in-python) - JAX intro presentation in [_Program Transformations for Machine Learning_](https://program-transformations.github.io) workshop.
@@ -87,7 +87,7 @@ This section contains libraries that are well-made and useful, but have not nece
This section contains papers focused on JAX (e.g. JAX-based library whitepapers, research on JAX, etc). Papers implemented in JAX are listed in the [Models/Projects](#projects) section.
-- [__Compiling machine learning programs via high-level tracing__. Roy Frostig, Matthew James Johnson, Chris Leary. _MLSys 2018_.](https://mlsys.org/Conferences/doc/2018/146.pdf) - This white paper describes an early version of JAX, detailing how computation is traced and compiled.
+- [__Compiling machine learning programs via high-level tracing__. Roy Frostig, Matthew James Johnson, Chris Leary. _MLSys 2018_.](https://mlsys.org/Conferences/doc/2018/146.pdf) - White paper describing an early version of JAX, detailing how computation is traced and compiled.
- [__JAX, M.D.: A Framework for Differentiable Physics__. Samuel S. Schoenholz, Ekin D. Cubuk. _NeurIPS 2020_.](https://arxiv.org/abs/1912.04232) - Introduces JAX, M.D., a differentiable physics library which includes simulation environments, interaction potentials, neural networks, and more.
- [__Enabling Fast Differentially Private SGD via Just-in-Time Compilation and Vectorization__. Pranav Subramani, Nicholas Vadivelu, Gautam Kamath. _arXiv 2020_.](https://arxiv.org/abs/2010.09063) - Uses JAX's JIT and VMAP to achieve faster differentially private SGD than existing libraries.
@@ -99,7 +99,7 @@ This section contains papers focused on JAX (e.g. JAX-based library whitepapers,
- [Using JAX to accelerate our research by David Budden and Matteo Hessel](https://deepmind.com/blog/article/using-jax-to-accelerate-our-research) - Describes the state of JAX and the JAX ecosystem at DeepMind.
- [Getting started with JAX (MLPs, CNNs & RNNs) by Robert Lange](https://roberttlange.github.io/posts/2020/03/blog-post-10/) - Neural network building blocks from scratch with the basic JAX operators.
- [Tutorial: image classification with JAX and Flax Linen by 8bitmp3](https://github.com/8bitmp3/JAX-Flax-Tutorial-Image-Classification-with-Linen) - Learn how to create a simple convolutional network with the Linen API by Flax and train it to recognize handwritten digits.
-- [Plugging Into JAX by Nick Doiron](https://medium.com/swlh/plugging-into-jax-16c120ec3302) - Compared Flax, Haiku, and Objax on the Kaggle flower classification challenge.
+- [Plugging Into JAX by Nick Doiron](https://medium.com/swlh/plugging-into-jax-16c120ec3302) - Compares Flax, Haiku, and Objax on the Kaggle flower classification challenge.
- [Meta-Learning in 50 Lines of JAX by Eric Jang](https://blog.evjang.com/2019/02/maml-jax.html) - Introduction to both JAX and Meta-Learning.
- [Normalizing Flows in 100 Lines of JAX by Eric Jang](https://blog.evjang.com/2019/07/nf-jax.html) - Concise implementation of [RealNVP](https://arxiv.org/abs/1605.08803).
- [Differentiable Path Tracing on the GPU/TPU by Eric Jang](https://blog.evjang.com/2019/11/jaxpt.html) - Tutorial on implementing path tracing.