README.md
</h3>
<br>

# About

EquiAdapt is a [PyTorch](https://pytorch.org) package that provides a flexible and efficient way to make *any* neural network architecture (including large foundation models) equivariant, instead of redesigning and training it from scratch. This is done by learning to canonicalize transformed inputs before feeding them to the prediction model. You can play with this concept in the provided [tutorial](tutorials/images/instance_segmentation_group_equivariant_canonicalization.ipynb) on equivariant adaptation of the Segment Anything Model (SAM, [Kirillov et al., 2023](https://arxiv.org/abs/2304.02643)) with images from the Microsoft COCO dataset ([Lin et al., 2014](https://arxiv.org/abs/1405.0312)) for instance segmentation.
To learn more, check out the blog post: [How to make your foundation model equivariant](https://mila.quebec/en/article/how-to-make-your-foundation-model-equivariant/)
## Equivariant adaptation with canonicalization

Read more about this approach in the paper [Equivariance with Learned Canonicalization Functions (ICML 2023)](https://proceedings.mlr.press/v202/kaba23a.html).

## Prior regularized canonicalization

Read more about this approach in the paper [Equivariant Adaptation of Large Pretrained Models (NeurIPS 2023)](https://proceedings.neurips.cc/paper_files/paper/2023/hash/9d5856318032ef3630cb580f4e24f823-Abstract-Conference.html).
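As a rough illustration of the idea (the function and argument names below are invented for this sketch, not equiadapt's API), prior regularization can be written as a loss term that pushes the canonicalization network to predict the identity group element on untransformed training data:

```python
import torch
import torch.nn.functional as F

def prior_regularization_loss(group_logits: torch.Tensor, identity_index: int = 0) -> torch.Tensor:
    """Cross-entropy that pulls the predicted group element toward the identity.

    `group_logits` has shape (batch, |G|): scores over a discrete group.
    `identity_index` marks the identity element. Both names are illustrative.
    """
    target = torch.full((group_logits.shape[0],), identity_index, dtype=torch.long)
    return F.cross_entropy(group_logits, target)

# A canonicalizer that already favors the identity incurs a smaller penalty
confident = prior_regularization_loss(torch.tensor([[4.0, 0.0, 0.0, 0.0]]))
uncertain = prior_regularization_loss(torch.tensor([[0.0, 4.0, 0.0, 0.0]]))
```

In training, such a term would be added to the task loss so the canonicalizer learns to leave already-canonical inputs alone.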

# How to use?

## Easy to integrate :rocket:

Equiadapt enables users to obtain equivariant versions of existing neural networks with only a few lines of code changed:

```diff
  optimizer.step()
```

## Details on using `equiadapt` library

1. Create a `canonicalization network` (or use one of our provided networks; for images, see `equiadapt/images/canonicalization_networks/`).

```
loss = canonicalizer.add_prior_regularizer(loss)
loss.backward()
```

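To make the canonicalize-then-predict flow concrete, here is a self-contained toy sketch (the classes below are stand-ins written for this example, not equiadapt's actual modules):

```python
import torch
import torch.nn as nn

class ToyCanonicalizer(nn.Module):
    """Stand-in canonicalizer: scores the four 90-degree rotations of a
    4x4 image and undoes the best-scoring one. (equiadapt's real
    canonicalizers also keep this step differentiable; omitted here.)"""
    def __init__(self) -> None:
        super().__init__()
        self.score = nn.Sequential(nn.Flatten(), nn.Linear(16, 4))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        k = self.score(x).argmax(dim=-1)  # chosen rotation per sample
        return torch.stack([torch.rot90(img, -int(r), dims=(-2, -1))
                            for img, r in zip(x, k)])

canonicalizer = ToyCanonicalizer()
predictor = nn.Sequential(nn.Flatten(), nn.Linear(16, 10))  # any standard model

x = torch.randn(2, 1, 4, 4)
logits = predictor(canonicalizer(x))  # canonicalize first, then predict
```

The prediction model never has to handle rotated inputs itself; the canonicalizer absorbs the group transformation before prediction.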
# Installation
## Using PyPI
You can install the latest [release](https://github.com/arnab39/equiadapt/releases) using:
```
pip install equiadapt
```
## Manual installation
You can clone this repository and manually install it with:
To create a conda environment with the necessary packages:

Note that this might not be a complete list of dependencies.
# Running equiadapt using example code
We provide example code to run equiadapt in different data domains and tasks to achieve equivariance.
Our examples use `hydra` to configure hyperparameters. Follow the hydra setup instructions to create a `.env` file with the paths to store all the data, wandb logs, and checkpoints.
Create a `.env` file in the root of the project with the following content:
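The exact keys are defined by the hydra configs; as a purely illustrative sketch (the variable names below are placeholders, not necessarily the ones the configs expect):

```
# Placeholder names -- substitute the keys your hydra configs reference
DATA_PATH=/path/to/data
CHECKPOINT_PATH=/path/to/checkpoints
WANDB_PATH=/path/to/wandb/logs
```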
You can also find [tutorials](tutorials) on how to use equiadapt with minimalistic changes to your own code.
# Related papers and citations
For more insights on this library, refer to our original paper on the idea: [Equivariance with Learned Canonicalization Functions (ICML 2023)](https://proceedings.mlr.press/v202/kaba23a.html), and on how to extend it to make any existing large pre-trained model equivariant: [Equivariant Adaptation of Large Pretrained Models (NeurIPS 2023)](https://proceedings.neurips.cc/paper_files/paper/2023/hash/9d5856318032ef3630cb580f4e24f823-Abstract-Conference.html).
If you find this library or the associated papers useful, please cite the following papers:

equiadapt/common/basecanonicalization.py

```python
"""
This module defines a base class for canonicalization and its subclasses for different types of canonicalization methods.

Canonicalization is a process that transforms the input data into a canonical (standard) form.
This can be a cheap alternative to building equivariant models: the input data is transformed into a canonical form, and a standard model is then used to make predictions.
Canonicalization allows you to use any existing architecture (even pre-trained ones) for your task without having to worry about equivariance.

The module contains the following classes:

- `BaseCanonicalization`: An abstract base class that defines the interface for all canonicalization methods.

- `IdentityCanonicalization`: An identity canonicalization method, which is a no-op; it doesn't change the input data.

- `DiscreteGroupCanonicalization`: Transforms the input data into a canonical form using a discrete group.

- `ContinuousGroupCanonicalization`: Transforms the input data into a canonical form using a continuous group.

Each class has methods to perform the canonicalization, invert it, and calculate the prior regularization loss and the identity metric.
"""

from typing import Any, Dict, List, Optional, Tuple, Union
```
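The canonicalization idea described in this docstring can be illustrated independently of the library. The toy below canonicalizes 1-D signals under cyclic shifts, so any downstream model becomes shift-invariant (the function is written for this example only, not part of equiadapt):

```python
import torch

def canonicalize_shift(x: torch.Tensor) -> torch.Tensor:
    """Roll the largest entry of a 1-D signal to position 0, giving a
    canonical representative of its cyclic-shift orbit."""
    return torch.roll(x, shifts=-int(x.argmax()), dims=0)

x = torch.tensor([1.0, 5.0, 2.0, 3.0])
shifted = torch.roll(x, shifts=2, dims=0)  # a cyclically shifted copy

# Both versions map to the same canonical form, so any model applied
# after canonicalization sees identical inputs.
assert torch.equal(canonicalize_shift(x), canonicalize_shift(shifted))
```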