Commit

[dev => main] Release 0.1.2 (#8)
PRs:
- #4 
- #6 
- #7 

Features:
- #3 

Bug fix:
- #5
dhkim0225 authored Jul 25, 2024
1 parent 78360d0 commit 7bc6b00
Showing 8 changed files with 66 additions and 11 deletions.
23 changes: 16 additions & 7 deletions README.md
```diff
@@ -1,6 +1,6 @@
-<h3> <p align="center"> 🎉🎉 Our paper has been accepted at ECCV 2024! Stay tuned for more updates here !! 🎉🎉</h3>
 
 <div align="center">
+<h3> 🎉🎉 Our paper has been accepted at ECCV 2024! Stay tuned for more updates !! 🎉🎉 </h3>
 
 <h2><a href="https://arxiv.org/abs/2403.19588">DenseNets Reloaded: Paradigm Shift Beyond ResNets and ViTs</a></h2>
 
 [Donghyun Kim](https://scholar.google.co.kr/citations?hl=en&user=EBC8BMAAAAAJ)<sup>1*</sup>, [Byeongho Heo](https://sites.google.com/view/byeongho-heo/home)<sup>2</sup>, [Dongyoon Han](https://dongyoonhan.github.io/)<sup>2*</sup>
@@ -16,11 +16,10 @@
     <a href="https://huggingface.co/naver-ai" alt="Huggingface">
     <img src="https://img.shields.io/badge/huggingface-NAVERAILab-F58336" /></a>
 </p>
-
-</p>
 
 <p align="center">
-  <img src="./resources/images/rdnet_reloaded.gif" alt="Densenet Reloaded">
+  <img src="./resources/images/rdnet_reloaded.gif" alt="Densenet Reloaded" width="46.5%" height="100%">
+  <img src="./resources/images/densenet_becomes_rdnet.gif" alt="Densenet becomes RDNet" width="51%" height="100%">
 </p>
```

We revitalize **Densely Connected Convolutional Networks (DenseNets)** and reveal their untapped potential to challenge the prevalent dominance of ResNet-style architectures. Our research indicates that DenseNets were previously underestimated, primarily due to conventional design choices and training methods that underexploited their full capabilities.
@@ -45,17 +44,27 @@ Our work aims to reignite interest in DenseNets by demonstrating their renewed r

*We believe that various architectural designs that have been popular recently would be combined with dense connections successfully.*

## Easy to use
Install the rdnet package with `pip install rdnet`!

```python
import timm
import rdnet # this will register the RDNet models to timm

model = timm.create_model('rdnet_large', pretrained=True)
```

For detailed usage, please refer to the [huggingface model card](https://huggingface.co/naver-ai/rdnet_tiny.nv_in1k).

## Updates
- **(2024.07.24)**: Pip-installable package added.
- **(2024.04.19)**: Initial release of the repository.
- **(2024.03.28)**: Paper is available on [arXiv](https://arxiv.org/abs/2403.19588).

## Coming Soon
- [ ] More ImageNet-22k Pretrained Models.
- [ ] More ImageNet-1k fine-tuned models.
- [x] Cascade Mask R-CNN with RDNet.
- [ ] DETR with RDNet.
- [ ] Self-Supervised Learning with RDNet.
- [ ] Transfer Learning with RDNet (with cifar10, cifar100, stanford cars, ...).

## RDNet for Image Classification
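The `import rdnet` line in the README snippet above works by registering model builders with timm at import time. The following is a minimal sketch of that registration pattern; the names `MODEL_REGISTRY`, `register_model`, and the toy `create_model` are illustrative stand-ins, not timm's or rdnet's actual internals.

```python
# Sketch of import-time model registration (illustrative names only).

MODEL_REGISTRY = {}

def register_model(fn):
    """Decorator: record a model builder under its function name."""
    MODEL_REGISTRY[fn.__name__] = fn
    return fn

@register_model
def rdnet_large(pretrained=False):
    # A real builder would construct the network; return a tag here.
    return f"rdnet_large(pretrained={pretrained})"

def create_model(name, pretrained=False):
    """Factory lookup, analogous in spirit to timm.create_model."""
    return MODEL_REGISTRY[name](pretrained=pretrained)

print(create_model("rdnet_large", pretrained=True))  # rdnet_large(pretrained=True)
```

Importing a module whose top level runs such decorators is what makes `timm.create_model('rdnet_large', ...)` succeed only after `import rdnet`.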
8 changes: 7 additions & 1 deletion detection/backbone/rdnet.py
```diff
@@ -116,7 +116,10 @@ def __init__(
         self.gamma = nn.Parameter(ls_init_value * torch.ones(growth_rate)) if ls_init_value > 0 else None
         growth_rate = int(growth_rate)
         inter_chs = int(num_input_features * bottleneck_width_ratio / 8) * 8
-        self.drop_path = DropPath(drop_path_rate)
+
+        if self.drop_path_rate > 0:
+            self.drop_path = DropPath(drop_path_rate)
+
         self.layers = eval(block_type)(
             in_chs=num_input_features,
             inter_chs=inter_chs,
@@ -130,6 +133,9 @@ def forward(self, x):
 
         if self.gamma is not None:
             x = x.mul(self.gamma.reshape(1, -1, 1, 1))
+
+        if self.drop_path_rate > 0 and self.training:
+            x = self.drop_path(x)
         return x
```


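The hunk above guards both the construction and the application of `DropPath` (stochastic depth): the module is only built when `drop_path_rate > 0`, and only applied in training mode. A toy scalar sketch of the behavior that guard controls, assuming the usual stochastic-depth formulation (timm's `DropPath` operates on tensors, per sample):

```python
import random

def drop_path(x, drop_prob=0.0, training=False):
    # Toy scalar version of stochastic depth. Mirrors the commit's
    # guard: inactive when drop_prob == 0 or when not training.
    if drop_prob == 0.0 or not training:
        return x
    keep_prob = 1.0 - drop_prob
    if random.random() < keep_prob:
        return x / keep_prob  # rescale so the expected value stays x
    return 0.0

print(drop_path(2.0, drop_prob=0.5, training=False))  # 2.0 (identity at eval)
```

At eval time the branch is an exact identity, which is why the `self.training` check matters for inference correctness.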
28 changes: 28 additions & 0 deletions pyproject.toml
@@ -0,0 +1,28 @@
```toml
[build-system]
requires = [
    "setuptools>=42",
    "wheel"
]
build-backend = "setuptools.build_meta"

[project]
name = "rdnet"
authors = [
    {name = "Donghyun Kim", email = "[email protected]"},
]
description = "RDNet (Densenets Reloaded: Paradigm shift beyond ResNets and ViTs)"
readme = "README.md"
requires-python = ">=3.8"
keywords = ["backbone", "classification", "densenet", "convolutional-neural-networks", "revisit", "dense-connections", "rdnet"]
license = {file = "LICENSE"}
classifiers = [
    "Programming Language :: Python :: 3",
]
dependencies = [
    "torch>=1.9.1",
    "timm~=0.9"
]
version = "0.1.2"

[tool.setuptools.packages.find]
include = ["rdnet", "rdnet.*"]
```
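The `timm~=0.9` dependency uses pip's compatible-release operator, which for this spec means `timm >= 0.9, < 1.0`. A toy check of that rule for simple two-part versions (a sketch only; real resolvers implement the full PEP 440 grammar):

```python
def satisfies_compatible_release(version, spec="0.9"):
    # Toy check of pip's "~=" operator for two-part specs:
    # "~=0.9" accepts versions >= 0.9 with the same major version
    # (i.e. < 1.0). Not a full packaging-spec implementation.
    major, minor = (int(p) for p in version.split(".")[:2])
    smaj, smin = (int(p) for p in spec.split(".")[:2])
    return (major, minor) >= (smaj, smin) and major == smaj

print(satisfies_compatible_release("0.9.16"))  # True
print(satisfies_compatible_release("1.0"))     # False
```

This is why the package stays on the timm 0.9 API line rather than picking up a future timm 1.x release automatically.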
1 change: 1 addition & 0 deletions rdnet/__init__.py
```diff
@@ -0,0 +1 @@
+from .rdnet import RDNet
```
8 changes: 7 additions & 1 deletion rdnet.py → rdnet/rdnet.py
```diff
@@ -116,7 +116,10 @@ def __init__(
         self.gamma = nn.Parameter(ls_init_value * torch.ones(growth_rate)) if ls_init_value > 0 else None
         growth_rate = int(growth_rate)
         inter_chs = int(num_input_features * bottleneck_width_ratio / 8) * 8
-        self.drop_path = DropPath(drop_path_rate)
+
+        if self.drop_path_rate > 0:
+            self.drop_path = DropPath(drop_path_rate)
+
         self.layers = eval(block_type)(
             in_chs=num_input_features,
             inter_chs=inter_chs,
@@ -130,6 +133,9 @@ def forward(self, x):
 
         if self.gamma is not None:
             x = x.mul(self.gamma.reshape(1, -1, 1, 1))
+
+        if self.drop_path_rate > 0 and self.training:
+            x = self.drop_path(x)
         return x
```


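The unchanged `inter_chs` line in the hunk above rounds the bottleneck width down to a multiple of 8; channel counts divisible by 8 tend to map better onto GPU kernels. Reproduced as a standalone helper (the function name is ours, not the repository's):

```python
def inter_channels(num_input_features, bottleneck_width_ratio):
    # Same arithmetic as the diff's line: scale the input channels,
    # then floor to a multiple of 8.
    return int(num_input_features * bottleneck_width_ratio / 8) * 8

print(inter_channels(64, 4.0))  # 256
print(inter_channels(72, 1.5))  # 104
```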
Binary file added resources/images/densenet_becomes_rdnet.gif
Binary file modified resources/images/rdnet_reloaded.gif
9 changes: 7 additions & 2 deletions segmentation/backbone/rdnet.py
```diff
@@ -116,7 +116,10 @@ def __init__(
         self.gamma = nn.Parameter(ls_init_value * torch.ones(growth_rate)) if ls_init_value > 0 else None
         growth_rate = int(growth_rate)
         inter_chs = int(num_input_features * bottleneck_width_ratio / 8) * 8
-        self.drop_path = DropPath(drop_path_rate)
+
+        if self.drop_path_rate > 0:
+            self.drop_path = DropPath(drop_path_rate)
+
         self.layers = eval(block_type)(
             in_chs=num_input_features,
             inter_chs=inter_chs,
@@ -130,8 +133,10 @@ def forward(self, x):
 
         if self.gamma is not None:
             x = x.mul(self.gamma.reshape(1, -1, 1, 1))
-        return x
+
+        if self.drop_path_rate > 0 and self.training:
+            x = self.drop_path(x)
+        return x
 
 class DenseStage(nn.Sequential):
     def __init__(self, num_block, num_input_features, drop_path_rates, growth_rate, **kwargs):
```
