Merge pull request #107 from rwegener2/main
fix small typos
daavid00 authored Jan 24, 2025
2 parents 066b7a8 + 18d29f1 commit 6fcd7c4
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions paper/paper.md
@@ -23,17 +23,17 @@ bibliography: paper.bib

# Summary

The imperative to achieve climate change goals and the increasing worldwide demand for energy have made geological carbon storage (GCS) technology more relevant today. Since computational models are essential for planning large-scale GCS projects, it is crucial to benchmark simulation tools to enhance confidence in their results. Inspired by a recent validation study for laboratory-scale CO$_2$ storage [@Flemisch:2024], a new comparative solution project (CSP) was launched to simulate both lab- and field-scale CO$_2$ storage [@Nordbotten:2024]. This project is called the 11th Society of Petroleum Engineers CSP, and we refer to it as the SPE11 benchmark. The main objective of the SPE11 benchmark is to provide a common platform and reference case for numerical simulation of GCS. A community effort was run by the "Early Access Team" to create utility scripts and input files for popular simulators to make participation more accessible. As part of the "Early Access Team", we have developed and made open the `pyopmspe11` tool, which facilitates reproducible solutions to the SPE11 benchmark. This tool serves as a common starting point for developing and testing new GCS simulation technology. Due to its user-friendly functionality (e.g., generation of different types of grids at different grid resolutions, flexibility to choose different rock and fluid properties, and flexibility to define well/source locations and operation schedules), its impact is expected to extend far beyond the initial benchmark study (e.g., to studies focusing on grid refinement, upscaling/coarsening approaches, numerical solvers, and optimization/history-matching techniques).

![Model generated by the configuration file `spe11c_cp_ca20e6cells.txt` in the [examples folder](https://github.com/OPM/pyopmspe11/tree/main/examples) (the model corresponds to the SPE11C case using a corner-point grid with 21729920 active cells).](paper.png){ width=100% }

# Statement of need

Geological carbon storage (GCS) applications benefit from both commercial and open-source simulators. OPM Flow is an open-source simulator for subsurface applications such as hydrocarbon recovery, CO$_2$ storage, and H$_2$ storage. The typical workflow in GCS simulations starts with defining the simulation model (e.g., grid, heterogeneity, physics, fluid properties, boundary conditions, wells), then setting the simulator parameters (e.g., tolerances, linear solvers, partition algorithms), then running the simulation, and finally visualizing/analyzing the simulation results (e.g., CO$_2$ plume distance to the boundaries, caprock integrity). Here we refer to the first two steps as preprocessing and the final step as postprocessing. Notable works are available in JOSS for pre-/postprocessing of simulation data, e.g., @Beucher:2019, @Sullivan:2019, @Fraters:2024, @Kaus:2024. However, preprocessing and postprocessing can be challenging even for experienced users. Additionally, setting up and running simulations requires computational expertise. To bridge this gap, developers can simplify the setup of numerical studies by using user-friendly approaches, such as configuration files. This not only ensures reproducibility of results but also facilitates flexible testing of different simulator parameters and allows for easy extension to further studies.
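The configuration-file approach described above can be sketched as follows. This is a minimal illustration of the idea only: the section and key names below are hypothetical placeholders, not `pyopmspe11`'s actual input format.

```python
# Illustrative sketch of a config-driven simulation setup. The section and
# key names here are hypothetical, not pyopmspe11's actual input format.
import configparser

CONFIG_TEXT = """
[model]
case = spe11b
grid = corner-point
cells_x = 840
cells_z = 120

[simulator]
tolerance_cnv = 0.01
linear_solver = cprw
"""

def load_setup(text):
    """Parse a configuration file into (model, simulator) parameter dicts."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return dict(cfg["model"]), dict(cfg["simulator"])

model, simulator = load_setup(CONFIG_TEXT)
```

Because every model and simulator choice lives in one text file, rerunning or varying a study reduces to editing and re-parsing that file, which is what makes the results reproducible and easy to extend.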



Building on knowledge acquired by contributing to other open-source projects such as OPM Flow, we have developed and made open the `pyopmspe11` tool, which facilitates reproducible solutions to the SPE11 benchmark, a study focusing on GCS at different scales [@Nordbotten:2024]. A previous benchmark study for GCS can be found in @Class:2009. One key difference of the SPE11 benchmark from the benchmark in @Class:2009 is that no specific size or type of grid was given in the description, i.e., one of the main tasks for the SPE11 benchmark participants was to create suitable computational grids (e.g., structured grids such as Cartesian grids or unstructured grids such as corner-point grids) to run the cases. To ease the comparison of results between participants, the SPE11 benchmark organizers requested that the reported spatial maps follow a specific format using Cartesian grids with fixed cell sizes, i.e., reporting grids. The participants were encouraged to share data (e.g., input decks, code, submitted results), with the opportunity to store the data for open access. This is where tools that make all steps reproducible (i.e., preprocessing and postprocessing) come in handy, and for this benchmark study, one available tool is `pyopmspe11`. Examples of pre-/postprocessing simulation tools that also have applications in GCS and rely on the OPM Flow simulator include `pyopmnearwell` [@Landa-Marbán:2023] and `expreccs` [@Landa-Marbán:2024]. The former focuses on near-well dynamics, while the latter focuses on seamless, dynamic, and non-invasive exchange of pressure-related information between local and regional scales.
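As a minimal illustration of the reporting-grid idea, the sketch below averages a field from a fine, uniform computational grid onto a coarser fixed Cartesian reporting grid. The grid sizes are arbitrary, and real corner-point grids require a geometric intersection step rather than the plain block averaging shown here.

```python
# Illustrative only: map a field from a fine uniform 2D grid onto a coarser
# Cartesian reporting grid by block averaging. Corner-point computational
# grids would instead need cell-by-cell geometric intersection weights.
import numpy as np

def to_reporting_grid(field, rep_shape):
    """Average a fine (ny, nx) field onto a coarser reporting grid.

    Assumes the fine resolution is an integer multiple of the reporting
    resolution, so each reporting cell covers a whole block of fine cells.
    """
    ny, nx = field.shape
    ry, rx = rep_shape
    fy, fx = ny // ry, nx // rx
    blocks = field[: ry * fy, : rx * fx].reshape(ry, fy, rx, fx)
    return blocks.mean(axis=(1, 3))

fine = np.arange(16, dtype=float).reshape(4, 4)  # toy "saturation" field
coarse = to_reporting_grid(fine, (2, 2))         # 2x2 reporting grid
```

Fixing the reporting resolution this way lets results from very different computational grids be compared cell by cell across participants.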



