recipes_source/torch_compile_backend_ipex.rst translation (#901)
jh941213 authored Oct 15, 2024
1 parent b8d92ec commit 10f1fab
Showing 1 changed file with 24 additions and 27 deletions.
51 changes: 24 additions & 27 deletions recipes_source/torch_compile_backend_ipex.rst
@@ -1,19 +1,20 @@
Intel® Extension for PyTorch* Backend
=====================================

**Author**: `Hamid Shojanazeri <https://github.com/jingxu10>`_
**Translator**: `김재현 <https://github.com/jh941213>`_

To work better with `torch.compile`, Intel® Extension for PyTorch* implements a backend ``ipex``.
It aims to improve hardware resource usage efficiency on Intel platforms for better performance.
The `ipex` backend is implemented with further customizations designed in Intel® Extension for
PyTorch* for model compilation.

Usage Example
~~~~~~~~~~~~~
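
Before running the examples, it can help to confirm that the ``ipex`` backend is actually visible to TorchDynamo. The check below is a minimal sketch that is not part of the original recipe; it assumes ``intel_extension_for_pytorch`` is installed and that your PyTorch build exposes ``torch._dynamo.list_backends()``.

.. code:: python

   # Minimal sanity check (not part of the recipe): is the "ipex" backend registered?
   import torch
   import intel_extension_for_pytorch as ipex  # importing IPEX makes its backend available

   # With IPEX installed, "ipex" is expected to appear among the registered backend names.
   print(torch._dynamo.list_backends())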

Train FP32
----------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with FP32 data type.

.. code:: python

   import torch
@@ -44,10 +45,9 @@
   optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
   model.train()

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   # Invoke the following API optionally, to apply frontend optimizations
   model, optimizer = ipex.optimize(model, optimizer=optimizer)
   compile_model = torch.compile(model, backend="ipex")
@@ -61,10 +61,10 @@
   optimizer.step()
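
Since parts of the example are collapsed in the diff above, here is a self-contained sketch of the same FP32 training flow. The model (``torchvision.models.resnet50``), loss, learning rate, and random input/target tensors are assumptions standing in for whatever you actually train, not the recipe's exact code.

.. code:: python

   # Hypothetical, self-contained FP32 training sketch.
   import torch
   import torchvision

   LR = 0.001  # assumed learning rate

   model = torchvision.models.resnet50()
   criterion = torch.nn.CrossEntropyLoss()
   optimizer = torch.optim.SGD(model.parameters(), lr=LR, momentum=0.9)
   model.train()

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   # Invoke the following API optionally, to apply frontend optimizations
   model, optimizer = ipex.optimize(model, optimizer=optimizer)
   compile_model = torch.compile(model, backend="ipex")
   ######################################################

   # One training step on random data standing in for a real dataloader.
   data = torch.rand(16, 3, 224, 224)
   target = torch.randint(0, 1000, (16,))

   optimizer.zero_grad()
   output = compile_model(data)
   loss = criterion(output, target)
   loss.backward()
   optimizer.step()
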
Train BF16
----------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with BFloat16 data type.

.. code:: python
@@ -96,10 +96,9 @@
   optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
   model.train()

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   # Invoke the following API optionally, to apply frontend optimizations
   model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)
   compile_model = torch.compile(model, backend="ipex")
@@ -114,10 +113,10 @@
   optimizer.step()
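
The BFloat16 variant follows the same skeleton. The sketch below again uses assumed stand-ins (torchvision ResNet-50, random tensors); the parts that actually change are the ``dtype=torch.bfloat16`` argument to ``ipex.optimize`` and running the forward pass and loss under a CPU autocast context.

.. code:: python

   # Hypothetical BF16 training sketch; only dtype and autocast differ from the FP32 case.
   import torch
   import torchvision

   LR = 0.001  # assumed learning rate

   model = torchvision.models.resnet50()
   criterion = torch.nn.CrossEntropyLoss()
   optimizer = torch.optim.SGD(model.parameters(), lr=LR, momentum=0.9)
   model.train()

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)
   compile_model = torch.compile(model, backend="ipex")
   ######################################################

   data = torch.rand(16, 3, 224, 224)
   target = torch.randint(0, 1000, (16,))

   optimizer.zero_grad()
   # Run the forward pass and loss in BFloat16 via autocast on CPU.
   with torch.autocast("cpu", dtype=torch.bfloat16):
       output = compile_model(data)
       loss = criterion(output, target)
   loss.backward()
   optimizer.step()
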
Inference FP32
--------------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with FP32 data type.

.. code:: python
@@ -128,10 +127,9 @@
   model.eval()
   data = torch.rand(1, 3, 224, 224)

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   # Invoke the following API optionally, to apply frontend optimizations
   model = ipex.optimize(model, weights_prepack=False)
   compile_model = torch.compile(model, backend="ipex")
@@ -141,10 +139,10 @@
   compile_model(data)
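
For completeness, a self-contained sketch of the FP32 inference flow, with a torchvision ResNet-50 assumed as the model. The recipe passes ``weights_prepack=False`` to ``ipex.optimize`` when pairing it with ``torch.compile``; wrapping the call in ``torch.no_grad()`` is the usual way to run inference.

.. code:: python

   # Hypothetical FP32 inference sketch (the model choice is an assumption).
   import torch
   import torchvision

   model = torchvision.models.resnet50()
   model.eval()
   data = torch.rand(1, 3, 224, 224)

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   # weights_prepack=False, as in the recipe, when combining ipex.optimize with torch.compile
   model = ipex.optimize(model, weights_prepack=False)
   compile_model = torch.compile(model, backend="ipex")
   ######################################################

   # Run inference without tracking gradients.
   with torch.no_grad():
       compile_model(data)
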
Inference BF16
--------------

Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with BFloat16 data type.

.. code:: python
@@ -155,10 +153,9 @@
   model.eval()
   data = torch.rand(1, 3, 224, 224)

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   # Invoke the following API optionally, to apply frontend optimizations
   model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)
   compile_model = torch.compile(model, backend="ipex")
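
A matching sketch for BFloat16 inference, again with an assumed torchvision model: the only additions over the FP32 case are ``dtype=torch.bfloat16`` in ``ipex.optimize`` and a CPU autocast context around the call.

.. code:: python

   # Hypothetical BF16 inference sketch.
   import torch
   import torchvision

   model = torchvision.models.resnet50()
   model.eval()
   data = torch.rand(1, 3, 224, 224)

   #################### code changes ####################
   import intel_extension_for_pytorch as ipex
   model = ipex.optimize(model, dtype=torch.bfloat16, weights_prepack=False)
   compile_model = torch.compile(model, backend="ipex")
   ######################################################

   # Run inference without gradients, under BFloat16 autocast on CPU.
   with torch.no_grad(), torch.autocast("cpu", dtype=torch.bfloat16):
       compile_model(data)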
