
Commit 96b1a14
Merge: [TSPP/PyT] Updated quick start guide
2 parents 81b9010 + c93f225

File tree: Tools/PyTorch/TimeSeriesPredictionPlatform
1 file changed: +12 −8


Tools/PyTorch/TimeSeriesPredictionPlatform/README.md (12 additions, 8 deletions)
## Quick start guide

### Getting Started
1. Clone the NVIDIA Deep Learning Examples repository:
```
git clone https://github.com/NVIDIA/DeepLearningExamples.git
```
2. Create a dataset directory. The directory can be placed anywhere, but it is recommended not to put it inside the TimeSeriesPredictionPlatform directory. It will be mounted into the TSPP container later; the following steps refer to it as /your/datasets/.

3. Enter the Deep Learning Examples TSPP repository:
```
cd DeepLearningExamples/Tools/PyTorch/TimeSeriesPredictionPlatform
```
4. Run the repository setup script:
```
source scripts/setup.sh
```

5. Build the Docker image:
```
docker build -t tspp .
```

6. Start the container and mount the dataset directory so that /workspace/datasets/ points to /your/datasets/. Any changes made to this folder inside the container are reflected in the original directory and vice versa. To mount additional folders, add `-v /path/on/local/:/path/in/container/` to the run command; this is useful for saving outputs from training or inference after the container is closed. To start the Docker container:
```
docker run -it --gpus all --ipc=host --network=host -v /your/datasets/:/workspace/datasets/ tspp bash
```

7. The previous command places you inside the container in the /workspace directory. Inside the container, download either the electricity or the traffic dataset:
```
python data/script_download_data.py --dataset {dataset_name} --output_dir /workspace/datasets/
```
The raw electricity dataset contains the 15-minute electricity consumption of 370 customers from the UCI Electricity Load Diagrams dataset. We aggregate it to an hourly series and use the previous week to predict the following day.
The raw traffic dataset contains the 10-minute occupancy rates of San Francisco freeways from 440 sensors in the UCI PEMS-SF dataset. We again aggregate to an hourly series and use the previous week to predict the following day.
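The hourly aggregation described above can be illustrated with a short pandas sketch. The data here is synthetic and illustrative only; this is not the platform's actual download or preprocessing code:

```python
import pandas as pd

# Synthetic 15-minute consumption readings for one customer (illustrative only).
idx = pd.date_range("2014-01-01", periods=8, freq="15min")
load = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0], index=idx)

# Aggregate the sub-hourly readings into an hourly series.
hourly = load.resample("1h").sum()
print(hourly.tolist())  # [10.0, 26.0]
```

Each hourly value is the sum of the four 15-minute readings that fall inside that hour.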

8. Preprocess the dataset:
```
python launch_preproc.py dataset={dataset}
```
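The "previous week to predict the following day" setup used by both datasets can be sketched as a sliding-window split over an hourly series. This is an illustration under assumed window sizes (168 hours of history, 24 hours of horizon), not the platform's actual preprocessing code:

```python
# Illustrative sliding-window split: one week (168 h) of history
# to predict the next day (24 h). Not TSPP's implementation.
def make_windows(series, history=168, horizon=24):
    windows = []
    for start in range(len(series) - history - horizon + 1):
        past = series[start : start + history]
        future = series[start + history : start + history + horizon]
        windows.append((past, future))
    return windows

hourly = list(range(200))  # 200 hourly observations
windows = make_windows(hourly)
print(len(windows))                            # 9
print(len(windows[0][0]), len(windows[0][1]))  # 168 24
```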

9. Launch the training, validation, and testing process using the Temporal Fusion Transformer model:
```
python launch_tspp.py model=tft dataset={dataset} criterion=quantile
```
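The `criterion=quantile` option selects a quantile (pinball) loss. A minimal sketch of the idea for a single prediction, not TSPP's implementation:

```python
# Minimal pinball (quantile) loss for a single prediction -- a sketch,
# not TSPP's implementation.
def pinball_loss(y_true, y_pred, q):
    diff = y_true - y_pred
    return max(q * diff, (q - 1.0) * diff)

# For a high quantile (q=0.9), under-prediction costs more than
# over-prediction of the same magnitude:
print(pinball_loss(10.0, 8.0, 0.9))   # under-predict by 2
print(pinball_loss(10.0, 12.0, 0.9))  # over-predict by 2
```

Training against several quantiles at once yields a prediction interval rather than a single point forecast, which is how TFT is typically used.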
