
doc: clarify the use case
meakbiyik committed Sep 25, 2023
1 parent 2a9193c commit bd9a25d
Showing 2 changed files with 4 additions and 0 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -4,6 +4,8 @@

Effortlessly cache PyTorch module outputs on-the-fly with `torchcache`.

Particularly useful for caching and serving the outputs of large, computationally expensive pre-trained PyTorch modules, such as vision transformers. Note that gradients will not flow through the cached outputs.
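The caching behavior described above can be illustrated with a minimal, framework-free sketch of hash-based output memoization. The class and method names below are hypothetical and simplified for illustration; they are not torchcache's actual API or implementation:

```python
import hashlib
import pickle

class CachedModule:
    """Toy stand-in for an expensive pre-trained module whose outputs
    are memoized by a hash of the input (illustrative only)."""

    def __init__(self):
        self._cache = {}
        self.forward_calls = 0

    def _expensive_forward(self, x):
        # Placeholder for e.g. a vision transformer forward pass.
        self.forward_calls += 1
        return [v * 2 for v in x]

    def __call__(self, x):
        # Hash the (picklable) input to obtain a cache key.
        key = hashlib.sha256(pickle.dumps(x)).hexdigest()
        if key not in self._cache:
            self._cache[key] = self._expensive_forward(x)
        # The stored value is returned as-is; in an autograd setting,
        # no gradient would flow back through a cached output.
        return self._cache[key]

module = CachedModule()
first = module([1, 2, 3])
second = module([1, 2, 3])  # served from cache, no recomputation
```

After the second call, `module.forward_calls` is still 1: the expensive forward ran once, and the repeated input was served from the cache.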

- [Features](#features)
- [Installation](#installation)
- [Basic usage](#basic-usage)
2 changes: 2 additions & 0 deletions docs/source/index.rst
@@ -5,6 +5,8 @@ Welcome to torchcache!

`torchcache` offers an effortless way to cache PyTorch module outputs on-the-fly. By caching the outputs of a module, you can save time and resources when running the same pre-trained model on the same inputs multiple times.

Note that gradients will not flow through the cached outputs.

.. toctree::
:maxdepth: 2
:caption: Table of Contents:
