### Checks
<!--- Put an `x` in all the boxes that apply, and remove the not
applicable items -->
- [ ] Avoid including large-size files in the PR.
- [ ] Clean up long text outputs from code cells in the notebook.
- [ ] For security purposes, please check the contents and remove any
sensitive info such as user names and private keys.
- [ ] Ensure (1) hyperlinks and markdown anchors are working, (2) relative
paths are used for tutorial repo files, and (3) figures and graphs are placed
in the `./figure` folder.
- [ ] Notebook runs automatically `./runner.sh -t <path to .ipynb file>`
Signed-off-by: Wenqi Li <[email protected]>
Changed file: `nnunet/README.md` (10 additions, 10 deletions)
@@ -10,19 +10,19 @@ Overall, the integration between nnU-Net and MONAI can offer significant benefit
## What's New in nnU-Net V2
-nnU-Net has release a newer version, nnU-Net V2, recently. Some changes have been made as follows.
+nnU-Net has released a newer version, nnU-Net V2, recently. Some changes have been made as follows.
- Refactored repository: nnU-Net v2 has undergone significant changes in the repository structure, making it easier to navigate and understand. The codebase has been modularized, and the documentation has been improved, allowing for easier integration with other tools and frameworks.
- New features: nnU-Net v2 has introduced several new [features](https://github.com/MIC-DKFZ/nnUNet/blob/master/documentation/changelog.md), including:
-  - Regionbased formulation with sigmoid activation;
+  - Region-based formulation with sigmoid activation;
  - Cross-platform support;
  - Multi-GPU training support.
Overall, nnU-Net v2 has introduced significant improvements and new features, making it a powerful and flexible deep learning framework for medical image segmentation. With its easy-to-use interface, modularized codebase, and advanced features, nnU-Net v2 is poised to advance the field of medical image analysis and improve patient outcomes.
-## How does the integration works?
+## How does the integration work?
As part of the integration, we have introduced a new class called the `nnUNetV2Runner`, which utilizes the Python APIs available in the official nnU-Net repository. The `nnUNetV2Runner` provides several key features that are useful for general users of MONAI.
- The new class offers Python APIs at a high level to facilitate most of the components in nnU-Net, such as model training, validation, and ensembling;
-- Users are only required to provide the minimum input, as specified in most of the MONAI tutorials for 3D medical image segmentation. The new class will automatically handle data conversion to prepare data that meets the requirements of nnU-Net, which will largely save time for users to prepare the datatsets;
+- Users are only required to provide the minimum input, as specified in most of the MONAI tutorials for 3D medical image segmentation. The new class will automatically handle data conversion to prepare data that meets the requirements of nnU-Net, which will largely save time for users to prepare the datasets;
- Additionally, we have enabled users with more GPU resources to automatically allocate model training jobs in parallel. As nnU-Net requires the training of 20 segmentation models by default, distributing model training to larger resources can significantly improve overall efficiency. For instance, users with 8 GPUs can increase model training speed by 6x to 8x automatically using the new class.
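To make the high-level API listed above concrete, here is a minimal usage sketch. It assumes the `nnUNetV2Runner` class exposed under `monai.apps.nnunet`, with an `input_config` argument and a `run()` method, mirroring the command-line call `python -m monai.apps.nnunet nnUNetV2Runner run --input_config='./input.yaml'` shown in a hunk further down; treat it as an illustration rather than content of this PR.

```
# Minimal sketch (not part of the diff): drive the nnU-Net V2 pipeline from Python.
# Assumes MONAI is installed together with the nnU-Net V2 dependencies.
from monai.apps.nnunet import nnUNetV2Runner

# The same "input.yaml" consumed by the command-line entry point.
runner = nnUNetV2Runner(input_config="./input.yaml")

# One-stop execution: dataset conversion, experiment planning/pre-processing,
# training of the default set of models, and ensembling/validation.
runner.run()
```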
## Benchmarking Results on Public Datasets
@@ -34,7 +34,7 @@ In this session, we present the results of our `nnUNetV2Runner` and results from
1. [BraTS21](http://braintumorsegmentation.org/): The RSNA-ASNR-MICCAI BraTS 2021 Challenge utilizes multi-institutional preoperative baseline multiparametric magnetic resonance imaging (mpMRI) scans and focuses on evaluating (Task 1) state-of-the-art methods for the segmentation of intrinsically heterogeneous brain glioblastoma tumor sub-regions in mpMRI scans.
2. [AMOS22](https://amos22.grand-challenge.org/): Task 1 focuses on the segmentation of abdominal organs using CT scans. The goal is to evaluate the performance of different segmentation methods on a diverse set of 500 cases, with annotations for 15 organs. Task 2 extends the scope of Task 1 by including MRI scans in addition to CT scans. Under this “Cross Modality” setting, a single algorithm must segment abdominal organs from both CT and MRI scans. This task provides an additional 100 MRI scans with the same type of annotation.
-The table below shows the results of full-resolution 3D U-Net on fold 0 for each dataset. We can see that the performance of `nnUNetV2Runner` meets expectation.
+The table below shows the results of full-resolution 3D U-Net on fold 0 for each dataset. We can see that the performance of `nnUNetV2Runner` meets expectations.
@@ -46,19 +46,19 @@ The table below shows the results of full-resolution 3D U-Net on fold 0 for each
### 1. nnU-Net v2 installation
-THe installation instruction is described [here](docs/install.md).
+The installation instruction is described [here](docs/install.md).
### 2. Run with Minimal Input using ```nnUNetV2Runner```
-User needs to provide a data list (".json" file) for the new task and data root. In general, a valid data list needs to follow the format of ones in [Medical Segmentaiton Decathlon](https://drive.google.com/drive/folders/1HqEgzS8BV2c7xYNrZdEAnrHk7osJJ--2). After creating the data list, the user can create a simple "input.yaml" file (shown below) as the minimum input for **nnUNetV2Runner**.
+The user needs to provide a data list (".json" file) for the new task and data root. In general, a valid data list needs to follow the format of the ones in [Medical Segmentation Decathlon](https://drive.google.com/drive/folders/1HqEgzS8BV2c7xYNrZdEAnrHk7osJJ--2). After creating the data list, the user can create a simple "input.yaml" file (shown below) as the minimum input for **nnUNetV2Runner**.
```
modality: CT
datalist: "./msd_task09_spleen_folds.json"
dataroot: "/workspace/data/nnunet_test/test09"
```
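For illustration only, a data list in the MSD-style format referenced above might look like the sketch below. The file names, fold assignments, and key names (`training`/`testing`) are assumptions based on typical MONAI fold-split data lists such as `msd_task09_spleen_folds.json`, not content taken from this PR.

```
{
    "training": [
        {"image": "./imagesTr/spleen_10.nii.gz", "label": "./labelsTr/spleen_10.nii.gz", "fold": 0},
        {"image": "./imagesTr/spleen_12.nii.gz", "label": "./labelsTr/spleen_12.nii.gz", "fold": 1}
    ],
    "testing": [
        {"image": "./imagesTs/spleen_1.nii.gz"}
    ]
}
```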
-User can also set values of directory variables as options in "input.yaml" if any directory needs to be specified.
+Users can also set values of directory variables as options in "input.yaml" if any directory needs to be specified.
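As a hedged example of such directory options, an extended "input.yaml" could look like the following; the key names (`nnunet_raw`, `nnunet_preprocessed`, `nnunet_results`) and paths are assumptions for illustration and should be checked against the `nnUNetV2Runner` documentation.

```
modality: CT
datalist: "./msd_task09_spleen_folds.json"
dataroot: "/workspace/data/nnunet_test/test09"
# optional working directories (key names assumed for illustration)
nnunet_raw: "./work_dir/nnUNet_raw_data_base"
nnunet_preprocessed: "./work_dir/nnUNet_preprocessed"
nnunet_results: "./work_dir/nnUNet_trained_models"
```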
@@ -74,7 +74,7 @@ python -m monai.apps.nnunet nnUNetV2Runner run --input_config='./input.yaml'
### 2. Run nnU-Net modules using ```nnUNetV2Runner```
-```nnUNetV2Runner``` offers the one-stop API to execute the pipeline, as well as the APIs to access the underlying components of nnU-Net V2. Below are the command for different components.
+```nnUNetV2Runner``` offers the one-stop API to execute the pipeline, as well as the APIs to access the underlying components of nnU-Net V2. Below is the command for different components.
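As a rough sketch of what such component-level calls look like, individual stages can be invoked in the same style as the one-stop `run` command; the method names used here are assumptions and may differ from the actual API, so consult the `nnUNetV2Runner` reference before use.

```
# dataset conversion only (method names assumed for illustration)
python -m monai.apps.nnunet nnUNetV2Runner convert_dataset --input_config "./input.yaml"

# experiment planning and data pre-processing
python -m monai.apps.nnunet nnUNetV2Runner plan_and_process --input_config "./input.yaml"

# model training for all folds/configurations
python -m monai.apps.nnunet nnUNetV2Runner train --input_config "./input.yaml"

# find the best configuration and run ensembling/post-processing
python -m monai.apps.nnunet nnUNetV2Runner find_best_configuration --input_config "./input.yaml"
```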