
Commit 10f1fab

recipes_source/torch_compile_backend_ipex.rst 번역 (#901)
1 parent b8d92ec commit 10f1fab


recipes_source/torch_compile_backend_ipex.rst (+24 -27)
@@ -1,19 +1,20 @@
-Intel® Extension for PyTorch* Backend
+Intel® Extension for PyTorch* 백엔드
 =====================================
 
-To work better with `torch.compile`, Intel® Extension for PyTorch* implements a backend ``ipex``.
-It targets to improve hardware resource usage efficiency on Intel platforms for better performance.
-The `ipex` backend is implemented with further customizations designed in Intel® Extension for
-PyTorch* for the model compilation.
+**저자**: `Hamid Shojanazeri <https://github.com/jingxu10>`_
+**번역:**: `김재현 <https://github.com/jh941213>`_
 
-Usage Example
+- `torch.compile` 과 더 잘 작동하도록, Intel® Extension for PyTorch는 ``ipex`` 라는 백엔드를 구현했습니다.
+- 이 백엔드는 Intel 플랫폼에서 하드웨어 자원 사용 효율성을 개선하여 성능을 향상시키는 것을 목표로 합니다.
+- 모델 컴파일을 위한 Intel® Extension for PyTorch에 설계된 추가 커스터마이징을 통해, `ipex` 백엔드가 구현되었습니다.
+
+사용 예시
 ~~~~~~~~~~~~~
 
-Train FP32
+FP32 학습
 ----------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with FP32 data type.
-
+아래 예제를 통해, 여러분은 FP32 데이터 타입으로 모델을 학습할 때 `torch.compile` 과 함께 `ipex` 백엔드를 사용하는 방법을 배울 수 있습니다.
 .. code:: python
 
     import torch
@@ -44,10 +45,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
     model.train()
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
     model, optimizer = ipex.optimize(model, optimizer=optimizer)
 
     compile_model = torch.compile(model, backend="ipex")
@@ -61,10 +61,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
         optimizer.step()
 
 
-Train BF16
+BF16 학습
 ----------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with BFloat16 data type.
+아래 예시를 통해 BFloat16 데이터 타입으로 모델 학습을 위해 `torch.compile` 와 함께 `ipex` 백엔드를 활용하는 방법을 알아보세요.
 
 .. code:: python
 
@@ -96,10 +96,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
     model.train()
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
     model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)
 
     compile_model = torch.compile(model, backend="ipex")
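
The training hunks above change only the translated comment lines inside the recipe's FP32 and BF16 training examples. For orientation, below is a minimal, self-contained sketch of the BF16 training pattern those examples follow. Only the ipex.optimize(..., dtype=torch.bfloat16, optimizer=...) and torch.compile(model, backend="ipex") calls are taken from the lines shown in the diff; the toy model, loss, random batch, and the CPU autocast context are illustrative assumptions, and intel_extension_for_pytorch must be installed for the "ipex" backend to be registered.

.. code:: python

    import torch
    import torch.nn as nn

    import intel_extension_for_pytorch as ipex  # registers the "ipex" backend for torch.compile

    # Illustrative stand-in for the recipe's model and data (assumption, not shown in the diff).
    model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 10))
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
    model.train()

    # Code changes from the recipe: optional frontend optimizations, then compile with the ipex backend.
    model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)
    compile_model = torch.compile(model, backend="ipex")

    data = torch.rand(4, 3, 224, 224)      # random batch, illustration only
    target = torch.randint(0, 10, (4,))

    optimizer.zero_grad()
    # Run the forward pass in BF16 on CPU via autocast (an assumption; the loop body is not part of the diff).
    with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
        output = compile_model(data)
        loss = criterion(output, target)
    loss.backward()
    optimizer.step()
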
@@ -114,10 +113,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
         optimizer.step()
 
 
-Inference FP32
+FP32 추론
 --------------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with FP32 data type.
+아래 예시를 통해 `ipex` 백엔드를 `torch.compile` 와 함께 활용하여 FP32 데이터 타입으로 모델을 추론하는 방법을 알아보세요.
 
 .. code:: python
 
@@ -128,10 +127,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     model.eval()
     data = torch.rand(1, 3, 224, 224)
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
    model = ipex.optimize(model, weights_prepack=False)
 
     compile_model = torch.compile(model, backend="ipex")
@@ -141,10 +139,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
         compile_model(data)
 
 
-Inference BF16
+BF16 추론
 --------------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with BFloat16 data type.
+아래 예시를 통해 `ipex` 백엔드를 `torch.compile`와 함께 활용하여 BFloat16 데이터 타입으로 모델을 추론하는 방법을 알아보세요.
 
 .. code:: python
 
@@ -155,10 +153,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     model.eval()
     data = torch.rand(1, 3, 224, 224)
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
     model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)
 
     compile_model = torch.compile(model, backend="ipex")
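
The inference hunks mirror the training ones and again touch only the comment lines. As a companion to the sketch above, here is a minimal BF16 inference sketch under the same assumptions: the toy model and random input are illustrative, the torch.no_grad() and CPU autocast contexts are assumptions about the surrounding code, and only the ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False) and torch.compile(model, backend="ipex") calls come from the lines shown in the diff.

.. code:: python

    import torch
    import torch.nn as nn

    import intel_extension_for_pytorch as ipex  # registers the "ipex" backend for torch.compile

    # Illustrative stand-in for the recipe's model (assumption, not shown in the diff).
    model = nn.Sequential(
        nn.Conv2d(3, 8, kernel_size=3),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),
        nn.Flatten(),
        nn.Linear(8, 10),
    )
    model.eval()
    data = torch.rand(1, 3, 224, 224)

    # Code changes from the recipe: optional frontend optimizations with weight prepacking
    # disabled, then compile with the ipex backend.
    model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)
    compile_model = torch.compile(model, backend="ipex")

    # BF16 inference under no_grad; the autocast context is an assumption for the BF16 case.
    with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
        output = compile_model(data)
    print(output.shape)  # torch.Size([1, 10]) for this toy model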
