diff --git a/recipes_source/torch_compile_backend_ipex.rst b/recipes_source/torch_compile_backend_ipex.rst
index 8d38a689..6b7b49b1 100644
--- a/recipes_source/torch_compile_backend_ipex.rst
+++ b/recipes_source/torch_compile_backend_ipex.rst
@@ -1,19 +1,20 @@
-Intel® Extension for PyTorch* Backend
+Intel® Extension for PyTorch* 백엔드
 =====================================
 
-To work better with `torch.compile`, Intel® Extension for PyTorch* implements a backend ``ipex``.
-It targets to improve hardware resource usage efficiency on Intel platforms for better performance.
-The `ipex` backend is implemented with further customizations designed in Intel® Extension for
-PyTorch* for the model compilation.
+**저자**: `Hamid Shojanazeri `_
+**번역**: `김재현 `_
 
-Usage Example
+- `torch.compile` 과 더 잘 작동하도록, Intel® Extension for PyTorch* 는 ``ipex`` 라는 백엔드를 구현했습니다.
+- 이 백엔드는 Intel 플랫폼에서 하드웨어 자원 사용 효율성을 개선하여 성능을 향상시키는 것을 목표로 합니다.
+- 모델 컴파일을 위해 Intel® Extension for PyTorch* 에 설계된 추가 커스터마이징을 통해 `ipex` 백엔드가 구현되었습니다.
+
+사용 예시
 ~~~~~~~~~~~~~
 
-Train FP32
+FP32 학습
 ----------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with FP32 data type.
-
+아래 예제를 통해 FP32 데이터 타입으로 모델을 학습할 때 `torch.compile` 과 함께 `ipex` 백엔드를 사용하는 방법을 배울 수 있습니다.
 
 .. code:: python
 
     import torch
@@ -44,10 +45,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
     model.train()
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
     model, optimizer = ipex.optimize(model, optimizer=optimizer)
     compile_model = torch.compile(model, backend="ipex")
@@ -61,10 +61,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
         optimizer.step()
 
 
-Train BF16
+BF16 학습
 ----------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model training with BFloat16 data type.
+아래 예시를 통해 BFloat16 데이터 타입으로 모델을 학습할 때 `torch.compile` 과 함께 `ipex` 백엔드를 활용하는 방법을 알아보세요.
 
 .. code:: python
 
@@ -96,10 +96,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     optimizer = torch.optim.SGD(model.parameters(), lr = LR, momentum=0.9)
     model.train()
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
     model, optimizer = ipex.optimize(model, dtype=torch.bfloat16, optimizer=optimizer)
     compile_model = torch.compile(model, backend="ipex")
@@ -114,10 +113,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
         optimizer.step()
 
 
-Inference FP32
+FP32 추론
 --------------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with FP32 data type.
+아래 예시를 통해 `ipex` 백엔드를 `torch.compile` 과 함께 활용하여 FP32 데이터 타입으로 모델을 추론하는 방법을 알아보세요.
 
 .. code:: python
 
@@ -128,10 +127,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     model.eval()
     data = torch.rand(1, 3, 224, 224)
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
     model = ipex.optimize(model, weights_prepack=False)
     compile_model = torch.compile(model, backend="ipex")
@@ -141,10 +139,10 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
         compile_model(data)
 
 
-Inference BF16
+BF16 추론
 --------------
 
-Check the example below to learn how to utilize the `ipex` backend with `torch.compile` for model inference with BFloat16 data type.
+아래 예시를 통해 `ipex` 백엔드를 `torch.compile` 과 함께 활용하여 BFloat16 데이터 타입으로 모델을 추론하는 방법을 알아보세요.
 
 .. code:: python
 
@@ -155,10 +153,9 @@ Check the example below to learn how to utilize the `ipex` backend with `torch.c
     model.eval()
     data = torch.rand(1, 3, 224, 224)
 
-    #################### code changes ####################
+    #################### 코드 변경 부분 ####################
     import intel_extension_for_pytorch as ipex
-
-    # Invoke the following API optionally, to apply frontend optimizations
+    # 선택적으로 다음 API를 호출하여, 프론트엔드 최적화를 적용합니다.
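The inference pattern the hunks above translate (optional `ipex.optimize(...)`, then `torch.compile(..., backend="ipex")`, then a no-grad forward pass) can be sketched end to end as below. This is a minimal sketch, not the tutorial's code: it substitutes a tiny toy model for the tutorial's real network, and it falls back to PyTorch's built-in `eager` backend when `intel_extension_for_pytorch` is not installed, so it can run on non-Intel setups.

```python
import torch
import torch.nn as nn

# Toy stand-in for the tutorial's model (an assumption made for brevity;
# the recipe itself runs a full torchvision model on 224x224 input).
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),  # (1, 3, 8, 8) -> (1, 8, 6, 6)
    nn.ReLU(),
    nn.Flatten(),                    # -> (1, 288)
    nn.Linear(8 * 6 * 6, 10),        # -> (1, 10)
)
model.eval()
data = torch.rand(1, 3, 8, 8)

# Prefer the `ipex` backend when Intel® Extension for PyTorch* is available;
# otherwise fall back to the built-in eager backend so the sketch still runs.
try:
    import intel_extension_for_pytorch as ipex
    # Optional frontend optimization, as in the tutorial's inference examples.
    model = ipex.optimize(model, weights_prepack=False)
    backend = "ipex"
except ImportError:
    backend = "eager"

compile_model = torch.compile(model, backend=backend)
with torch.no_grad():
    out = compile_model(data)
print(out.shape)  # torch.Size([1, 10])
```

The try/except is only a convenience for portability; on an Intel machine with IPEX installed it takes the `ipex` path exactly as the recipe describes.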