PROJECT NOT UNDER ACTIVE MANAGEMENT
This project will no longer be maintained by Intel.
Intel has ceased development of and contributions to this project, including, but not limited to, maintenance, bug fixes, new releases, and updates.
Intel no longer accepts patches to this project.
If you have an ongoing need to use this project, are interested in independently developing it, or would like to maintain patches for the open source software community, please create your own fork of this project.
Contact: [email protected]
The Intel® Cloud Optimization Modules (ICOMs) for AWS are open-source codebases of codified Intel AI software optimizations and instructions built specifically for AWS. The modules are designed to help AI developers maximize the performance and productivity of industry-leading Python machine learning and deep learning libraries on Intel hardware. Each module, or reference architecture, includes a complete instruction set and all source code published on GitHub. You can check out the full suite of Intel Cloud Optimization Modules here.
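As a rough illustration of what a "codified Intel AI software optimization" can look like in practice, the sketch below patches stock scikit-learn with Intel® Extension for Scikit-learn (sklearnex) so that supported estimators run on the oneDAL backend. This snippet is not taken from the modules themselves (which target XGBoost and PyTorch); the dataset and estimator choices are illustrative only.

```python
# Minimal sketch, assuming Intel Extension for Scikit-learn (sklearnex) is installed.
# The general pattern: swap in an Intel-optimized backend for a stock Python library
# with a minimal code change.
from sklearnex import patch_sklearn

patch_sklearn()  # re-route supported scikit-learn estimators to the oneDAL backend

# Import scikit-learn after patching so the optimized implementations are picked up.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)
clf = RandomForestClassifier(n_estimators=100).fit(X, y)  # runs on the patched backend
print(clf.score(X, y))
```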
Here are the currently released modules for AWS:
- Intel® Cloud Optimization Modules for AWS*: GPT2-Small Distributed Training: Fine-tune a Large Language Model (LLM) in a distributed setup on Intel Xeon CPUs.
- Intel® Cloud Optimization Modules for AWS*: XGBoost* on Kubernetes*: Build an accelerated Kubernetes cluster on AWS with Intel optimizations for XGBoost (a sketch of this acceleration pattern follows the list).
- Intel® Cloud Optimization Modules for AWS*: XGBoost* on SageMaker*: Build an accelerated SageMaker model development pipeline and an AWS Lambda inference endpoint with Intel optimizations for XGBoost.
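For the two XGBoost modules, a common Intel CPU acceleration pattern is to train with stock XGBoost and then convert the trained booster to a daal4py (oneDAL) model for faster inference on Intel Xeon processors. The sketch below is a minimal, self-contained version of that pattern; the dataset and hyperparameters are illustrative, and the exact acceleration steps used inside the modules may differ.

```python
# Hedged sketch: train with stock XGBoost, then convert the booster to a
# daal4py (oneDAL) model for accelerated CPU inference. Dataset and parameters
# here are illustrative, not taken from the modules themselves.
import daal4py as d4p
import xgboost as xgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=50_000, n_features=30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a standard gradient-boosted classifier with XGBoost.
booster = xgb.train(
    {"objective": "binary:logistic", "tree_method": "hist", "max_depth": 6},
    xgb.DMatrix(X_train, label=y_train),
    num_boost_round=100,
)

# Convert the trained booster to a oneDAL model and run accelerated inference.
daal_model = d4p.get_gbt_model_from_xgboost(booster)
result = d4p.gbt_classification_prediction(nClasses=2).compute(X_test, daal_model)
print(result.prediction[:5].ravel())  # predicted class labels for the first 5 rows
```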