Intel® Extension for PyTorch* Backend
=====================================

**Author**: `Hamid Shojanazeri <https://github.com/jingxu10>`_
**Translator**: `jh941213 <https://github.com/jh941213>`_

To work better with `torch.compile`, Intel® Extension for PyTorch* implements a backend named ``ipex``.
It aims to improve hardware resource usage efficiency on Intel platforms for better performance.
The `ipex` backend is implemented with further customizations designed in Intel® Extension for PyTorch* for model compilation.
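
In the examples that follow, the backend is selected by importing ``intel_extension_for_pytorch`` and passing ``backend="ipex"`` to `torch.compile`. As a quick sanity check, the minimal sketch below (an illustration assuming the extension is installed, not part of the original examples) lists the registered TorchDynamo backends, where ``"ipex"`` should appear after the import:

.. code:: python

    import torch
    import intel_extension_for_pytorch as ipex  # makes the "ipex" backend available

    # "ipex" should show up among the registered torch.compile backends
    print(torch._dynamo.list_backends())

    model = torch.nn.Linear(8, 8)
    compiled_model = torch.compile(model, backend="ipex")
    print(compiled_model(torch.rand(2, 8)).shape)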

Usage Example
~~~~~~~~~~~~~

Train FP32
----------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with the FP32 data type.

.. code:: python

    import torch
    ...
    optimizer = torch.optim.SGD(model.parameters(), lr=LR, momentum=0.9)
    model.train()

    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
    # Invoke the following API optionally, to apply frontend optimizations
    model, optimizer = ipex.optimize(model, optimizer=optimizer)

    compile_model = torch.compile(model, backend="ipex")
    ...
    optimizer.step()
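
The excerpt above omits the data pipeline, the model definition, and the training loop. For reference, the following is a minimal, self-contained sketch of the same recipe; the ResNet-50 model, the random stand-in batch, and the learning rate are assumptions of this sketch rather than part of the excerpt:

.. code:: python

    import torch
    import torchvision

    LR = 0.001  # assumed value for this sketch

    # Random stand-in batch instead of a real dataset/dataloader
    data = torch.rand(16, 3, 224, 224)
    target = torch.randint(0, 1000, (16,))

    model = torchvision.models.resnet50()
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=LR, momentum=0.9)
    model.train()

    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
    # Invoke the following API optionally, to apply frontend optimizations
    model, optimizer = ipex.optimize(model, optimizer=optimizer)

    compile_model = torch.compile(model, backend="ipex")
    ######################################################

    # One FP32 training step through the compiled model
    optimizer.zero_grad()
    output = compile_model(data)
    loss = criterion(output, target)
    loss.backward()
    optimizer.step()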

Train BF16
----------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with the BFloat16 data type.

.. code:: python

    ...
    optimizer = torch.optim.SGD(model.parameters(), lr=LR, momentum=0.9)
    model.train()

    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
    # Invoke the following API optionally, to apply frontend optimizations
    model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)

    compile_model = torch.compile(model, backend="ipex")
    ...
    optimizer.step()
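
Relative to the FP32 sketch, the only change required by this recipe is passing ``dtype=torch.bfloat16`` to ``ipex.optimize``. The sketch below additionally runs the forward pass and loss under ``torch.cpu.amp.autocast()`` so the math is actually done in BFloat16; that autocast context, like the model and data, is an assumption of this sketch and is not shown in the excerpt:

.. code:: python

    import torch
    import torchvision
    import intel_extension_for_pytorch as ipex

    model = torchvision.models.resnet50()  # assumed model for this sketch
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
    model.train()

    # dtype=torch.bfloat16 is the only change to the optimize() call
    model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)
    compile_model = torch.compile(model, backend="ipex")

    data = torch.rand(16, 3, 224, 224)    # random stand-in batch
    target = torch.randint(0, 1000, (16,))

    optimizer.zero_grad()
    with torch.cpu.amp.autocast():        # assumed BF16 autocast region
        output = compile_model(data)
        loss = criterion(output, target)
    loss.backward()
    optimizer.step()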

Inference FP32
--------------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with the FP32 data type.

.. code:: python

    ...
    model.eval()
    data = torch.rand(1, 3, 224, 224)

    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
    # Invoke the following API optionally, to apply frontend optimizations
    model = ipex.optimize(model, weights_prepack=False)

    compile_model = torch.compile(model, backend="ipex")
    ...
    compile_model(data)
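
The excerpt above leaves out the model definition and shows the call into the compiled model without its surrounding context. A minimal, self-contained sketch follows; the ResNet-50 model and the ``torch.no_grad()`` context are assumptions of this sketch:

.. code:: python

    import torch
    import torchvision

    model = torchvision.models.resnet50()  # assumed model for this sketch
    model.eval()
    data = torch.rand(1, 3, 224, 224)

    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
    # Invoke the following API optionally, to apply frontend optimizations
    model = ipex.optimize(model, weights_prepack=False)

    compile_model = torch.compile(model, backend="ipex")
    ######################################################

    # Assumed for this sketch: disable autograd during inference
    with torch.no_grad():
        output = compile_model(data)
    print(output.shape)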

Inference BF16
--------------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with the BFloat16 data type.

.. code:: python

    ...
    model.eval()
    data = torch.rand(1, 3, 224, 224)

    #################### code changes ####################
    import intel_extension_for_pytorch as ipex
    # Invoke the following API optionally, to apply frontend optimizations
    model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)

    compile_model = torch.compile(model, backend="ipex")
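
As with training, the BFloat16 inference path differs from FP32 only in the ``dtype`` argument passed to ``ipex.optimize``. The sketch below makes the same assumptions as the FP32 inference sketch above and additionally runs the compiled model under CPU autocast; the autocast and ``torch.no_grad()`` contexts are assumptions of this sketch rather than part of the excerpt:

.. code:: python

    import torch
    import torchvision
    import intel_extension_for_pytorch as ipex

    model = torchvision.models.resnet50()  # assumed model for this sketch
    model.eval()
    data = torch.rand(1, 3, 224, 224)

    # dtype=torch.bfloat16 is the only change to the optimize() call
    model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)
    compile_model = torch.compile(model, backend="ipex")

    # Assumed for this sketch: BFloat16 autocast, no autograd during inference
    with torch.no_grad(), torch.cpu.amp.autocast():
        output = compile_model(data)
    print(output.dtype)  # typically torch.bfloat16 under autocast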