Commit 560d98c (0 parents)

Showing 264 changed files with 23,045 additions and 0 deletions.
@@ -0,0 +1,30 @@

```yaml
name: Tests

on: [push, pull_request]

jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - name: Run pre-commit hooks
        uses: pre-commit/action@v3.0.1
        with:
          extra_args: --all-files

  pytest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
          cache: "pip"
      - name: Install the_well
        run: pip install .[benchmark,dev] --extra-index-url https://download.pytorch.org/whl/cpu
      - name: Run tests
        env:
          PYTHONPATH: ${{ github.workspace }}
          PY_COLORS: "1"
        run: pytest tests
```
@@ -0,0 +1,100 @@

```gitignore
# Ignore files generated by the build process
build/
dist/
*.egg-info/

# Ignore system and IDE files
.DS_Store
Thumbs.db
.idea/

# Ignore the data
datasets/active_matter/data/
datasets/active_matter_v2/

datasets/euler_multi_quadrants_openBC/data/
datasets/euler_multi_quadrants_periodicBC/data/

datasets/gray_scott_reaction_diffusion/data/
datasets/datasets/

output_slurm/
datasets/helmholtz_staircase/data/
datasets/viscoelastic_instability/data/

2D/neutron_star_disks/
2D/planetswe/data/

datasets/rayleigh_benard/data/
datasets/acoustic_scattering_inclusions/old_and_problematic_data/
datasets/shear_flow/data/
datasets/supernova_explosion_128/data/
datasets/supernova_explosion_64/data/
datasets/turbulence_gravity_cooling/data/
datasets/rayleigh_taylor_instability/data/
datasets/turbulent_radiative_layer_3D/data/
datasets/split_turbulent_radiative_layer_3D/
datasets/turbulent_radiative_layer_2D/data/
datasets/acoustic_scattering_discontinuous/data/
datasets/acoustic_scattering_inclusions/data/
datasets/acoustic_scattering_maze/data/
datasets/planetswe/data/
datasets/post_neutron_star_merger/data/
datasets/acoustic_scattering_discontinuous/gif/
datasets/acoustic_scattering_inclusions/gif/
datasets/acoustic_scattering_maze/gif/
the_well/benchmark/scripts_to_launch/
the_well/benchmark/write_bash_script.ipynb
the_well/benchmark/checkpoints/
datasets/convective_envelope_rsg/data/
datasets/MHD_64/data/
datasets/MHD_256/data/
datasets/convective_envelope_rsg/sim.mp4
testing_before_adding/
viz/
venv_benchmark_well/
wellbench/
benchmarking_results/

# Ignore logs and temporary files
*.log
*.tmp
*.pt
*.gif

# Ignore HDF5 files
*.hdf5
*.h5

# Ignore compiled binaries and libraries
*.exe
*.dll
*.so

# Ignore package manager directories
node_modules/
vendor/

# Ignore environment-specific files
.env
.env.local
.env.*.local

# Ignore sensitive or private information
secrets.txt
credentials.json

# Ignore backup files
*.bak
*.swp

# Ignore generated files
*.min.js
*.min.css
__pycache__

# Ignore run generated output
outputs/
wandb/
datasets/rt_experimental
check_well_data_4059043.out
```
@@ -0,0 +1,18 @@

```yaml
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.6.4
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0
    hooks:
      - id: check-merge-conflict
      - id: check-toml
      - id: check-yaml
        args: [--unsafe]
      - id: end-of-file-fixer
      - id: mixed-line-ending
        args: [--fix=lf]
      - id: trailing-whitespace
```
@@ -0,0 +1,14 @@

BSD 3-Clause License

Copyright (c) 2024 Polymathic AI.
All rights reserved.

Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.

3. Neither the name of Polymathic AI nor the names of the Well contributors may be used to endorse or promote products derived from this software without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS “AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
@@ -0,0 +1,149 @@

<div align="center">
<img src="https://raw.githubusercontent.com/PolymathicAI/the_well/master/docs/assets/images/the_well_color.svg" width="60%"/>
</div>

<br>

# The Well: 15TB of Physics Simulations

Welcome to the Well, a large-scale collection of machine learning datasets containing numerical simulations of a wide variety of spatiotemporal physical systems. The Well draws from domain scientists and numerical software developers to provide 15TB of data across 16 datasets covering diverse domains such as biological systems, fluid dynamics, acoustic scattering, and magneto-hydrodynamic simulations of extragalactic fluids and supernova explosions. These datasets can be used individually or as part of a broader benchmark suite for accelerating research in machine learning and computational sciences.

## Tap into the Well

Once the Well package is installed and the data downloaded, you can use them in your training pipeline.

```python
from the_well.data import WellDataset
from torch.utils.data import DataLoader

trainset = WellDataset(
    well_base_path="path/to/base",
    well_dataset_name="name_of_the_dataset",
    well_split_name="train",
)
train_loader = DataLoader(trainset)

for batch in train_loader:
    ...
```

For more information regarding the interface, please refer to the [API](https://github.com/PolymathicAI/the_well/tree/master/docs/api.md) and the [tutorials](https://github.com/PolymathicAI/the_well/blob/master/docs/tutorials/dataset.ipynb).

### Installation

If you plan to use the Well datasets to train or evaluate deep learning models, we recommend using a machine with sufficient computing resources. We also recommend creating a new Python (>=3.10) environment to install the Well. For instance, with [venv](https://docs.python.org/3/library/venv.html):

```
python -m venv path/to/env
source path/to/env/bin/activate
```

#### From PyPI

The Well package can be installed directly from PyPI.

```
pip install the_well
```

#### From Source

It can also be installed from source. To do so, clone the [repository](https://github.com/PolymathicAI/the_well) and install the package with its dependencies.

```
git clone https://github.com/PolymathicAI/the_well
cd the_well
pip install .
```

Depending on your acceleration hardware, you can specify `--extra-index-url` to install the relevant PyTorch version. For example, use

```
pip install . --extra-index-url https://download.pytorch.org/whl/cu121
```

to install the dependencies built for CUDA 12.1.

#### Benchmark Dependencies

If you want to run the benchmarks, install the additional dependencies.

```
pip install the_well[benchmark]
```

### Downloading the Data

The Well datasets range between 6.9GB and 5.1TB each, for a total of 15TB for the full collection. Ensure that your system has enough free disk space to accommodate the datasets you wish to download.
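As a small standard-library sketch (not part of the repository), you can check available disk space before starting a download; the path below is a placeholder for wherever you intend to store the data:

```python
import shutil

# Query disk usage for the target download location (placeholder path).
usage = shutil.disk_usage("/")
free_gb = usage.free / 1e9

# Compare against the size of the dataset you plan to download.
print(f"Free space: {free_gb:.1f} GB")
```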
Once `the_well` is installed, you can use the `the-well-download` command to download any dataset of the Well.

```
the-well-download --base-path path/to/base --dataset active_matter --split train
```

If `--dataset` and `--split` are omitted, all datasets and splits will be downloaded. This could take a while!

### Streaming from Hugging Face

Most of the Well datasets are also hosted on [Hugging Face](https://huggingface.co/polymathic-ai). Data can be streamed directly from the hub using the following code.

```python
from the_well.data import WellDataset
from torch.utils.data import DataLoader

# The following line may take a couple of minutes to instantiate the datamodule
trainset = WellDataset(
    well_base_path="hf://datasets/polymathic-ai/",  # access from the HF hub
    well_dataset_name="active_matter",
    well_split_name="train",
)
train_loader = DataLoader(trainset)

for batch in train_loader:
    ...
```

For better performance in large training runs, we advise [downloading the data locally](#downloading-the-data) instead of streaming it over the network.

## Benchmark

The repository allows benchmarking surrogate models on the different datasets that compose the Well. Some state-of-the-art models are already implemented in [`models`](https://github.com/PolymathicAI/the_well/tree/master/the_well/benchmark/models), while [dataset classes](https://github.com/PolymathicAI/the_well/tree/master/the_well/data) handle the raw data of the Well.
The benchmark relies on [a training script](https://github.com/PolymathicAI/the_well/blob/master/the_well/benchmark/train.py) that uses [hydra](https://hydra.cc/) to instantiate various classes (e.g. dataset, model, optimizer) from [configuration files](https://github.com/PolymathicAI/the_well/tree/master/the_well/benchmark/configs).

For instance, to run the training script with the default FNO architecture on the active matter dataset, launch the following commands:

```bash
cd the_well/benchmark
python train.py experiment=fno server=local data=active_matter
```

Each argument corresponds to a specific configuration file. In the command above, `server=local` tells the training script to use [`local.yaml`](https://github.com/PolymathicAI/the_well/tree/master/the_well/benchmark/configs/server/local.yaml), which simply declares the relative path to the data. The configuration can be overridden directly on the command line or edited with new YAML files. Please refer to the [hydra documentation](https://hydra.cc/) for editing configurations.

You can use this command within an `sbatch` script to launch the training with Slurm.
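As a hedged illustration (not shipped with the repository), a minimal Slurm batch script wrapping the command above might look like the following; the job name, resource requests, and environment path are placeholders to adapt to your cluster:

```shell
# Write a minimal, hypothetical sbatch script for the FNO benchmark run.
cat > train_fno.sbatch <<'EOF'
#!/bin/bash
#SBATCH --job-name=fno_active_matter
#SBATCH --gpus=1
#SBATCH --time=24:00:00
#SBATCH --output=train_%j.out

# Activate the environment where the_well[benchmark] is installed (placeholder path).
source path/to/env/bin/activate

cd the_well/benchmark
python train.py experiment=fno server=local data=active_matter
EOF

echo "wrote train_fno.sbatch"
```

You would then submit it with `sbatch train_fno.sbatch`.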
## Citation

This project has been led by the <a href="https://polymathic-ai.org/">Polymathic AI</a> organization, in collaboration with researchers from the Flatiron Institute, University of Colorado Boulder, University of Cambridge, New York University, Rutgers University, Cornell University, University of Tokyo, Los Alamos National Laboratory, University of California, Berkeley, Princeton University, CEA DAM, and University of Liège.

If you find this project useful for your research, please consider citing

```
@inproceedings{ohana2024thewell,
  title={The Well: a Large-Scale Collection of Diverse Physics Simulations for Machine Learning},
  author={Ruben Ohana and Michael McCabe and Lucas Thibaut Meyer and Rudy Morel and Fruzsina Julia Agocs and Miguel Beneitez and Marsha Berger and Blakesley Burkhart and Stuart B. Dalziel and Drummond Buschman Fielding and Daniel Fortunato and Jared A. Goldberg and Keiya Hirashima and Yan-Fei Jiang and Rich Kerswell and Suryanarayana Maddu and Jonah M. Miller and Payel Mukhopadhyay and Stefan S. Nixon and Jeff Shen and Romain Watteaux and Bruno R{\'e}galdo-Saint Blancard and Fran{\c{c}}ois Rozet and Liam Holden Parker and Miles Cranmer and Shirley Ho},
  booktitle={The Thirty-eight Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2024},
  url={https://openreview.net/forum?id=00Sx577BT3}
}
```

## Contact

For questions regarding this project, please contact [Ruben Ohana](https://rubenohana.github.io/) and [Michael McCabe](https://mikemccabe210.github.io/) at $\small\texttt{\{rohana,mmcabe\}@flatironinstitute.org}$.

## Bug Reports and Feature Requests

To report a bug (in the data or the code), request a feature, or simply ask a question, you can [open an issue](https://github.com/PolymathicAI/the_well/issues) on the [repository](https://github.com/PolymathicAI/the_well).
@@ -0,0 +1,51 @@

```yaml
dataset_name: MHD_256
n_spatial_dims: 3
spatial_resolution:
  - 256
  - 256
  - 256
scalar_names: []
constant_scalar_names:
  - Ma
  - Ms
field_names:
  0:
    - density
  1:
    - magnetic_field_x
    - magnetic_field_y
    - magnetic_field_z
    - velocity_x
    - velocity_y
    - velocity_z
  2: []
constant_field_names:
  0: []
  1: []
  2: []
boundary_condition_types:
  - PERIODIC
n_files: 10
n_trajectories_per_file:
  - 1
  - 1
  - 1
  - 1
  - 1
  - 1
  - 1
  - 1
  - 1
  - 1
n_steps_per_trajectory:
  - 100
  - 100
  - 100
  - 100
  - 100
  - 100
  - 100
  - 100
  - 100
  - 100
grid_type: cartesian
```
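As an illustrative sketch (not part of the commit), metadata like the above can be summarized programmatically; the dictionary below mirrors the YAML fields rather than loading the file:

```python
# Summarize the MHD_256 dataset metadata (values mirror the YAML above).
metadata = {
    "dataset_name": "MHD_256",
    "n_spatial_dims": 3,
    "spatial_resolution": [256, 256, 256],
    "n_files": 10,
    "n_trajectories_per_file": [1] * 10,
    "n_steps_per_trajectory": [100] * 10,
    "field_names": {
        0: ["density"],
        1: ["magnetic_field_x", "magnetic_field_y", "magnetic_field_z",
            "velocity_x", "velocity_y", "velocity_z"],
        2: [],
    },
}

# Total trajectories and snapshots across all files.
n_trajectories = sum(metadata["n_trajectories_per_file"])
n_snapshots = sum(metadata["n_steps_per_trajectory"])

# Number of physical fields stored per snapshot, over all tensor orders.
n_fields = sum(len(names) for names in metadata["field_names"].values())

# Grid cells per snapshot: product of the spatial resolution.
cells_per_snapshot = 1
for r in metadata["spatial_resolution"]:
    cells_per_snapshot *= r

print(n_trajectories, n_snapshots, n_fields, cells_per_snapshot)
```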