torch_compile_backend_ipex.rst translation #901

Merged
51 changes: 24 additions & 27 deletions recipes_source/torch_compile_backend_ipex.rst
@@ -1,19 +1,20 @@
Intel® Extension for PyTorch* Backend
Intel® Extension for PyTorch* Backend
=====================================

To work better with `torch.compile`, Intel® Extension for PyTorch* implements a backend ``ipex``.
It targets to improve hardware resource usage efficiency on Intel platforms for better performance.
The `ipex` backend is implemented with further customizations designed in Intel® Extension for
PyTorch* for the model compilation.
**Author**: `Jing Xu <https://github.com/jingxu10>`_
**Translator**: `김재현 <https://github.com/jh941213>`_

Usage Example
To work better with `torch.compile`, Intel® Extension for PyTorch* implements the ``ipex`` backend.
Please add the translator information.

It aims to improve hardware resource usage efficiency on Intel platforms for better performance.
The `ipex` backend is implemented with further customizations designed in Intel® Extension for PyTorch* for model compilation.

Usage Example
~~~~~~~~~~~~~

Train FP32
Train FP32
----------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with FP32 data type.

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with the FP32 data type.

.. code:: python

import torch
@@ -44,10 +45,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
model.train()

#################### code changes ####################
#################### code changes ####################
import intel_extension_for_pytorch as ipex

# Invoke the following API optionally, to apply frontend optimizations
# Optionally invoke the following API to apply frontend optimizations
model, optimizer = ipex.optimize(model, optimizer=optimizer)

compile_model = torch.compile(model, backend="ipex")
@@ -61,10 +61,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
optimizer.step()


Train BF16
Train BF16
----------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with BFloat16 data type.
Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with the BFloat16 data type.

.. code:: python

@@ -96,10 +96,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
model.train()

#################### code changes ####################
#################### code changes ####################
import intel_extension_for_pytorch as ipex

# Invoke the following API optionally, to apply frontend optimizations
# Optionally invoke the following API to apply frontend optimizations
model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)

compile_model = torch.compile(model, backend="ipex")
@@ -114,10 +113,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
optimizer.step()
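As background on why BF16 training works at all: BFloat16 keeps FP32's 8 exponent bits (so the same dynamic range) but only 7 mantissa bits, trading precision for speed and memory. A minimal pure-Python sketch of that precision loss, simulating BF16 by truncating an FP32 bit pattern (illustrative only; real hardware typically rounds to nearest even rather than truncating):

```python
import struct

def to_bf16(x: float) -> float:
    # Reinterpret x as 32-bit IEEE 754, then zero the low 16 bits:
    # what remains is the sign bit, the full 8-bit exponent, and the
    # top 7 mantissa bits -- exactly bfloat16's layout.
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    return struct.unpack("<f", struct.pack("<I", bits & 0xFFFF0000))[0]

print(to_bf16(1.0))         # 1.0 (exactly representable)
print(to_bf16(3.14159265))  # 3.140625 (mantissa cut to 7 bits)
```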


Inference FP32
Inference FP32
--------------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with FP32 data type.
Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with the FP32 data type.

.. code:: python

@@ -128,10 +127,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
model.eval()
data = torch.rand(1, 3, 224, 224)

#################### code changes ####################
#################### code changes ####################
import intel_extension_for_pytorch as ipex

# Invoke the following API optionally, to apply frontend optimizations
# Optionally invoke the following API to apply frontend optimizations
model = ipex.optimize(model, weights_prepack=False)

compile_model = torch.compile(model, backend="ipex")
@@ -141,10 +139,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
compile_model(data)


Inference BF16
Inference BF16
--------------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with BFloat16 data type.
Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with the BFloat16 data type.

.. code:: python

@@ -155,10 +153,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
model.eval()
data = torch.rand(1, 3, 224, 224)

#################### code changes ####################
#################### code changes ####################
import intel_extension_for_pytorch as ipex

# Invoke the following API optionally, to apply frontend optimizations
# Optionally invoke the following API to apply frontend optimizations
model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)

compile_model = torch.compile(model, backend="ipex")