E4E-Mangrove-Experimenting

11-20-2024

Made a smaller DenseNet U-Net and looked into LoRA for super-resolution.

  • U-Net/densenet_unet_3.py contains a model that removes the fourth dense block and uses a modified first layer; a sketch of the truncation follows this list
  • 9.7 million parameters vs. 15.9 million for the original ResNet-backbone version
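
A minimal sketch of what that truncation could look like on top of torchvision's DenseNet121 (module names come from torchvision; the 4-channel stem below is a placeholder assumption, not necessarily the actual first-layer change in densenet_unet_3.py):

```python
import torch
import torch.nn as nn
from torchvision.models import densenet121

features = densenet121(weights=None).features  # module names from torchvision

# Placeholder stem change: a conv0 that takes 4-band imagery instead of RGB
# (the actual first-layer modification in densenet_unet_3.py may differ).
features.conv0 = nn.Conv2d(4, 64, kernel_size=7, stride=2, padding=3, bias=False)

# Keep everything up to denseblock3; drop transition3, denseblock4, and norm5.
keep = ["conv0", "norm0", "relu0", "pool0",
        "denseblock1", "transition1",
        "denseblock2", "transition2",
        "denseblock3"]
encoder = nn.Sequential(*(getattr(features, name) for name in keep))

n_params = sum(p.numel() for p in encoder.parameters())
print(f"{n_params / 1e6:.1f}M encoder parameters")
print(encoder(torch.randn(1, 4, 256, 256)).shape)  # torch.Size([1, 1024, 16, 16])
```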

11-13-2024

Worked on exchanging the ResNet backbone in the U-Net for a DenseNet backbone.

  • DenseNet121 has four dense blocks, which map cleanly onto the four encoder layers in the existing ResNet U-Net implementation
  • The caveat is that I couldn't find DenseNet weights pretrained on any sort of geo data, so this model may take more epochs to converge than the ResNet version
  • ResNet halves the feature map size after every layer, but DenseNet layers 3 and 4 both produce 8x8 feature maps, since there is no transition after layer 4 (see the sketch after this list)
  • In U-Net/densenet_unet_1.py, I manually added a transition layer, but this shrinks the bottleneck to 4x4, which may be too small for our pixel segmentation task
  • In U-Net/densenet_unet_2.py, the center decode block does not upsample, which may reduce the effectiveness of the first skip connection
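
A short sketch of the resolution issue, assuming torchvision's DenseNet121 and a 256x256 input. Grouping each dense block with the transition that follows it (the repo's exact grouping may differ) shows layers 3 and 4 landing at the same 8x8 size:

```python
import torch
import torch.nn as nn
from torchvision.models import densenet121

f = densenet121(weights=None).features  # module names from torchvision

# Group each dense block with the transition that follows it, mirroring the
# four-layer split of the ResNet U-Net (the repo's exact grouping may differ).
layer1 = nn.Sequential(f.conv0, f.norm0, f.relu0, f.pool0, f.denseblock1, f.transition1)
layer2 = nn.Sequential(f.denseblock2, f.transition2)
layer3 = nn.Sequential(f.denseblock3, f.transition3)
layer4 = f.denseblock4  # no transition after block 4, so no further downsampling

x = torch.randn(1, 3, 256, 256)
for i, layer in enumerate([layer1, layer2, layer3, layer4], start=1):
    x = layer(x)
    print(f"layer{i}: {tuple(x.shape)}")
# layer1: (1, 128, 32, 32)
# layer2: (1, 256, 16, 16)
# layer3: (1, 512, 8, 8)
# layer4: (1, 1024, 8, 8)  <- same spatial size as layer 3
# densenet_unet_1.py adds a manual transition after block 4, which would
# halve this again to a 4x4 bottleneck.
```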

Further topics to explore for classification:

  • Worth looking into better weight initialization (section 3 of the original U-Net paper emphasizes this); see the sketch below
  • Could try a learning rate scheduler during training
  • Try a three-block DenseNet
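
A minimal sketch of the first two ideas in PyTorch: kaiming_normal_ matches the zero-mean Gaussian with std sqrt(2/N) that section 3 of the original U-Net paper recommends for ReLU networks, and ReduceLROnPlateau is one common scheduler choice. The tiny decoder stand-in is illustrative only, not the repo's model:

```python
import torch.nn as nn
import torch.optim as optim

def init_weights(m: nn.Module) -> None:
    # He initialization: zero-mean Gaussian with std sqrt(2/N), the scheme
    # section 3 of the original U-Net paper recommends for ReLU networks.
    if isinstance(m, nn.Conv2d):
        nn.init.kaiming_normal_(m.weight, nonlinearity="relu")
        if m.bias is not None:
            nn.init.zeros_(m.bias)

# Tiny stand-in for the randomly initialized decoder; a pretrained encoder
# should keep its own weights rather than being re-initialized.
decoder = nn.Sequential(nn.Conv2d(1024, 512, 3, padding=1), nn.ReLU(),
                        nn.Conv2d(512, 256, 3, padding=1), nn.ReLU())
decoder.apply(init_weights)

optimizer = optim.Adam(decoder.parameters(), lr=1e-3)
# One scheduler option: halve the learning rate when validation loss plateaus.
scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer, factor=0.5, patience=3)
# Call scheduler.step(val_loss) once per epoch after validation.
```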
