docs: update readme and enable autogen
meakbiyik committed Aug 29, 2023
1 parent f3afd11 commit dd37ef3
Showing 6 changed files with 53 additions and 38 deletions.
9 changes: 9 additions & 0 deletions .pre-commit-config.yaml
Original file line number Diff line number Diff line change
@@ -1,11 +1,19 @@
default_language_version:
python: python3.10
repos:
- repo: local
hooks:
- id: generate-sphinx-docs
name: Generate Sphinx docs
entry: poetry run sphinx-build -b html docs/source docs/build
language: system
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v3.2.0
hooks:
- id: trailing-whitespace
exclude: ^docs/source/_autosummary/.*
- id: end-of-file-fixer
exclude: ^docs/source/_autosummary/.*
- id: check-yaml
- id: check-toml
- id: check-added-large-files
@@ -14,6 +22,7 @@ repos:
- id: check-merge-conflict
- id: mixed-line-ending
args: ['--fix=lf']
exclude: ^docs/source/_autosummary/.*
- repo: https://github.com/psf/black
rev: 23.1.0
hooks:
6 changes: 3 additions & 3 deletions README.md
@@ -1,10 +1,10 @@
# torchcache

[![Lint and Test](https://github.com/meakbiyik/torchcache/actions/workflows/ci.yaml/badge.svg?branch=main)](https://github.com/meakbiyik/torchcache/actions/workflows/ci.yaml) [![codecov](https://codecov.io/gh/meakbiyik/torchcache/graph/badge.svg?token=Oh6mNp0pc8)](https://codecov.io/gh/meakbiyik/torchcache)
[![Lint and Test](https://github.com/meakbiyik/torchcache/actions/workflows/ci.yaml/badge.svg?branch=main)](https://github.com/meakbiyik/torchcache/actions/workflows/ci.yaml) [![Codecov](https://codecov.io/gh/meakbiyik/torchcache/graph/badge.svg?token=Oh6mNp0pc8)](https://codecov.io/gh/meakbiyik/torchcache) [![Documentation Status](https://readthedocs.org/projects/torchcache/badge/?version=latest)](https://torchcache.readthedocs.io/en/latest/?badge=latest)

Effortlessly cache PyTorch module outputs on-the-fly with `torchcache`.
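As a rough, stdlib-only analogy of what such a caching wrapper does (the decorator below is hypothetical and far simpler than torchcache itself — no tensors, hashing, or persistence): it memoizes a class's `forward` method so repeated inputs skip recomputation.

```python
import functools


def cached_forward(cls):
    """Toy class decorator: memoize ``forward`` results per input value."""
    original = cls.forward

    @functools.wraps(original)
    def forward(self, x):
        if not hasattr(self, "_cache"):
            self._cache = {}
        if x not in self._cache:
            self._cache[x] = original(self, x)  # compute only on a miss
        return self._cache[x]

    cls.forward = forward
    return cls


@cached_forward
class Doubler:
    """Stand-in for an expensive module; counts real computations."""

    def __init__(self):
        self.compute_count = 0

    def forward(self, x):
        self.compute_count += 1
        return 2 * x
```

Calling `forward` twice with the same input runs the underlying computation only once; torchcache applies the same idea per batch item, on GPU tensors, with optional persistence to disk.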

The documentation will be available soon at [torchcache.readthedocs.io](https://torchcache.readthedocs.io/en/latest/).
The documentation is available at [torchcache.readthedocs.io](https://torchcache.readthedocs.io/en/latest/).

- [Features](#features)
- [Installation](#installation)
@@ -103,7 +103,7 @@ torchcache automatically manages the cache by hashing both:
1. The decorated module (including its source code obtained through `inspect.getsource`) and its args/kwargs.
2. The inputs provided to the module's forward method.
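The two-part key above can be sketched with the standard library alone. This is an illustration under assumptions, not torchcache's actual implementation: the hash algorithm (`blake2b`), the helper name, and the byte encoding of the input are all invented for the example.

```python
import hashlib
import inspect


class TinyModel:
    """Stand-in for a PyTorch module; torchcache hashes the real module's source."""

    scale = 2.0

    def forward(self, values):
        return [v * self.scale for v in values]


def cache_key(module, batch_item: bytes) -> str:
    """Illustrative key: combine the module's source code with one input item."""
    hasher = hashlib.blake2b()
    # Part 1: the module's identity, via its source code (inspect.getsource).
    hasher.update(inspect.getsource(type(module)).encode())
    # Part 2: the raw bytes of the input provided to forward.
    hasher.update(batch_item)
    return hasher.hexdigest()
```

The same module with the same input always yields the same key, so cached outputs can be looked up safely; changing either the module's code or the input changes the key.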

This hash serves as the cache key for the forward method's output per item in a batch. When our MRU (most-recently-used) cache fills up for the given session, the system continues running the forward method and dismisses the oldest output. This MRU strategy streamlines cache invalidation, aligning with the iterative nature of neural network training, without requiring any auxiliary record-keeping.
This hash serves as the cache key for the forward method's output per item in a batch. When our MRU (most-recently-used) cache fills up for the given session, the system continues running the forward method and dismisses the newest output. This MRU strategy streamlines cache invalidation, aligning with the iterative nature of neural network training, without requiring any auxiliary record-keeping.
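A toy sketch of that fill-up behavior (stdlib only; the class and method names are illustrative, not torchcache's code): once the per-session budget is reached, freshly computed outputs are still returned but not stored, so earlier entries survive untouched.

```python
class SessionCache:
    """Minimal illustration: a bounded cache that never evicts old entries."""

    def __init__(self, max_items):
        self.max_items = max_items
        self._store = {}

    def get_or_compute(self, key, compute):
        if key in self._store:  # cache hit: skip the forward pass
            return self._store[key]
        value = compute()  # forward still runs on a miss
        if len(self._store) < self.max_items:
            self._store[key] = value  # budget left: keep this output
        # Budget exhausted: dismiss this newest output, keep older entries.
        return value
```

Because entries are never evicted, no usage bookkeeping is needed; the cache simply stops growing, which suits training loops that revisit the same inputs every epoch.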

> :warning: **Warning**: To avoid having to calculate the directory size on every forward pass, `torchcache` measures and limits the size of the persistent data created only for the given session. To prevent the persistent cache from growing indefinitely, you should periodically clear the cache directory. Note that if you let `torchcache` create a temporary directory, it will be automatically deleted when the session ends.
12 changes: 6 additions & 6 deletions docs/source/_autosummary/torchcache.set_logger_config.rst
@@ -1,6 +1,6 @@
torchcache.set\_logger\_config
==============================

.. currentmodule:: torchcache

.. autofunction:: set_logger_config
torchcache.set\_logger\_config
==============================

.. currentmodule:: torchcache

.. autofunction:: set_logger_config
48 changes: 27 additions & 21 deletions docs/source/_autosummary/torchcache.torchcache._TorchCache.rst
@@ -1,21 +1,27 @@
torchcache.torchcache.\_TorchCache
==================================

.. currentmodule:: torchcache.torchcache

.. autoclass:: _TorchCache


.. automethod:: __init__


.. rubric:: Methods

.. autosummary::

~_TorchCache.__init__
~_TorchCache.cache_cleanup
~_TorchCache.forward_hook
~_TorchCache.forward_pre_hook
~_TorchCache.hash_tensor
~_TorchCache.wrap_module
torchcache.torchcache.\_TorchCache
==================================

.. currentmodule:: torchcache.torchcache

.. autoclass:: _TorchCache


.. automethod:: __init__


.. rubric:: Methods

.. autosummary::

~_TorchCache.__init__
~_TorchCache.cache_cleanup
~_TorchCache.forward_hook
~_TorchCache.forward_pre_hook
~_TorchCache.hash_tensor
~_TorchCache.wrap_module






12 changes: 6 additions & 6 deletions docs/source/_autosummary/torchcache.torchcache.rst
@@ -1,6 +1,6 @@
torchcache.torchcache
=====================

.. currentmodule:: torchcache

.. autofunction:: torchcache
torchcache.torchcache
=====================

.. currentmodule:: torchcache

.. autofunction:: torchcache
4 changes: 2 additions & 2 deletions docs/source/index.rst
@@ -1,7 +1,7 @@
.. torchcache documentation master file
Welcome to torchcache's documentation!
======================================
Welcome to torchcache!
======================

`torchcache` offers an effortless way to cache PyTorch module outputs on-the-fly. By caching the outputs of a module, you can save time and resources when running the same pre-trained model on the same inputs multiple times.

