
Commit cc88e64

chore: update image links and weight (#66)
1 parent cf5cbd1 · commit cc88e64

File tree

1 file changed: 5 additions & 5 deletions

tutorials/advanced/01-collective-communication.ipynb

@@ -43,7 +43,7 @@
 "source": [
 "<!-- ---\n",
 "title: Collective Communication with Ignite\n",
-"weight: 5\n",
+"weight: 1\n",
 "date: 2021-10-5\n",
 "downloads: true\n",
 "sidebar: true\n",
@@ -122,7 +122,7 @@
 "source": [
 "## All Reduce\n",
 "\n",
-"![All Reduce Diagram](https://github.com/pytorch-ignite/examples/blob/add-collective-comm-nb/tutorials/assets/all-reduce.png?raw=1)\n",
+"![All Reduce Diagram](https://github.com/pytorch-ignite/examples/blob/main/tutorials/assets/all-reduce.png?raw=1)\n",
 "\n",
 "The [`all_reduce()`](https://pytorch.org/ignite/distributed.html#ignite.distributed.utils.all_reduce) method is used to collect specified tensors from each process and make them available on every node then perform a specified operation (sum, product, min, max, etc) on them. Let's spawn 3 processes with ranks 0, 1 and 2 and define a `tensor` on all of them. If we performed `all_reduce` with the operation SUM on `tensor` then `tensor` on all ranks will be gathered, added and stored in `tensor` as shown below:"
 ]
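
As context for the hunk above: a minimal sketch of that `all_reduce` flow, not part of this commit, assuming `torch` and `pytorch-ignite` are installed and using the `gloo` backend to spawn 3 processes.

```python
# Minimal sketch (not part of this commit) of the all_reduce flow described
# in the hunk above; assumes torch and pytorch-ignite, and the gloo backend.
import torch
import ignite.distributed as idist


def compute(local_rank):
    # rank 0 holds tensor([1.]), rank 1 holds tensor([2.]), rank 2 holds tensor([3.])
    tensor = torch.tensor([float(idist.get_rank() + 1)])
    # SUM-reduce across the group: every rank ends up with tensor([6.])
    tensor = idist.all_reduce(tensor, op="SUM")
    print(f"rank {idist.get_rank()}: {tensor}")


if __name__ == "__main__":
    idist.spawn("gloo", compute, args=(), nproc_per_node=3)
```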
@@ -224,7 +224,7 @@
 "source": [
 "## All Gather\n",
 "\n",
-"![All Gather Diagram](https://github.com/pytorch-ignite/examples/blob/add-collective-comm-nb/tutorials/assets/all-gather.png?raw=1)\n",
+"![All Gather Diagram](https://github.com/pytorch-ignite/examples/blob/main/tutorials/assets/all-gather.png?raw=1)\n",
 "\n",
 "The [`all_gather()`](https://pytorch.org/ignite/distributed.html#ignite.distributed.utils.all_gather) method is used when you just want to collect a tensor, number or string across all participating processes. As a basic example, suppose you have to collect all the different values stored in `num` on all ranks. You can achieve this by using `all_gather` as below:"
 ]
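
Similarly, a minimal sketch of `all_gather`, not part of this commit (same assumptions: `pytorch-ignite` installed, `gloo` backend):

```python
# Minimal sketch (not part of this commit) of the all_gather flow described
# in the hunk above; assumes pytorch-ignite and the gloo backend.
import ignite.distributed as idist


def compute(local_rank):
    num = float(idist.get_rank() * 10)  # a different value on each rank: 0., 10., 20.
    # gathering a number returns a tensor of shape (world_size,) on every rank
    values = idist.all_gather(num)
    print(f"rank {idist.get_rank()}: {values}")  # tensor([ 0., 10., 20.]) everywhere


if __name__ == "__main__":
    idist.spawn("gloo", compute, args=(), nproc_per_node=3)
```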
@@ -315,7 +315,7 @@
 "source": [
 "## Broadcast\n",
 "\n",
-"![Broadcast Diagram](https://github.com/pytorch-ignite/examples/blob/add-collective-comm-nb/tutorials/assets/broadcast.png?raw=1)\n",
+"![Broadcast Diagram](https://github.com/pytorch-ignite/examples/blob/main/tutorials/assets/broadcast.png?raw=1)\n",
 "\n",
 "The [`broadcast()`](https://pytorch.org/ignite/distributed.html#ignite.distributed.utils.broadcast) method copies a tensor, float or string from a source process to all the other processes. For example, you need to send a message from rank 0 to all other ranks. You can do this by creating the actual message on rank 0 and a placeholder on all other ranks, then broadcast the message mentioning a source rank. You can also use `safe_mode=True` in case the placeholder is not defined on all ranks. "
 ]
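
And a minimal sketch of that `broadcast` pattern with `safe_mode=True`, not part of this commit (same assumptions as above):

```python
# Minimal sketch (not part of this commit) of the broadcast flow described
# in the hunk above; assumes pytorch-ignite and the gloo backend.
import ignite.distributed as idist


def compute(local_rank):
    if idist.get_rank() == 0:
        message = "hello from rank 0"  # the real payload lives on the source rank
    else:
        message = None  # placeholder; safe_mode=True tolerates None on non-source ranks
    message = idist.broadcast(message, src=0, safe_mode=True)
    print(f"rank {idist.get_rank()}: {message}")  # every rank prints the message


if __name__ == "__main__":
    idist.spawn("gloo", compute, args=(), nproc_per_node=3)
```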
@@ -427,4 +427,4 @@
 "outputs": []
 }
]
-}
+}
