tamanna-a/list-of-ml-projects


# List of projects examining deep learning hyperparameters

| Project | Description | Link |
| --- | --- | --- |
| Learning rates | Compares gradient descent optimization methods such as Adam, Adagrad, and RMSProp, examining their impact on training loss and training time. Also implements a cyclical learning rate policy. | https://github.com/tamanna-a/learning-rates |
| Data augmentation | Examines classic data augmentation and cutout augmentation, which randomly masks square sections of input images, on CIFAR-10 with a ResNet-44 model. | https://github.com/tamanna-a/classic-data-augmentation |
| Activation functions | Peels back the layers and examines weights under various activation functions (sigmoid, tanh, ReLU, leaky ReLU). Examines their gradients, tries different initializations, and shows how leaky ReLU helps with the vanishing gradient problem. | https://github.com/tamanna-a/relu-activation |
| Neural net regularization | Batch normalization and dropout. | https://github.com/tamanna-a/neuralnet-regularization |
| Bias-variance tradeoff | Explores the bias-variance tradeoff. | https://github.com/tamanna-a/bias-variance |
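The cyclical learning rate policy mentioned in the learning-rates project can be sketched in a few lines. This is a minimal sketch of the triangular schedule from Smith's cyclical learning rate paper, not code from the repository; the function name and default values here are my own.

```python
def cyclical_lr(step, base_lr=1e-4, max_lr=1e-2, step_size=2000):
    """Triangular cyclical learning rate (Smith, 2015).

    The rate ramps linearly from base_lr up to max_lr over step_size
    iterations, then back down, and the cycle repeats indefinitely.
    """
    cycle = step // (2 * step_size)            # index of the current cycle
    x = abs(step / step_size - 2 * cycle - 1)  # position in the cycle, in [0, 1]
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)
```

For example, `cyclical_lr(0)` returns the base rate, `cyclical_lr(2000)` returns the peak rate, and `cyclical_lr(4000)` is back at the base, completing one cycle.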
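The cutout augmentation examined in the data-augmentation project zeroes out a random square patch of each training image. A minimal numpy sketch, assuming HxWxC image arrays (the function name and defaults are illustrative, not taken from the repository):

```python
import numpy as np

def cutout(image, size=8, rng=None):
    """Zero out a random square patch of an HxWxC image.

    The patch centre may fall near the border, in which case the patch
    is clipped to the image bounds, as in DeVries & Taylor (2017).
    """
    rng = rng or np.random.default_rng()
    h, w = image.shape[:2]
    cy, cx = rng.integers(h), rng.integers(w)
    y0, y1 = max(0, cy - size // 2), min(h, cy + size // 2)
    x0, x1 = max(0, cx - size // 2), min(w, cx + size // 2)
    out = image.copy()
    out[y0:y1, x0:x1] = 0  # mask the patch with zeros
    return out
```

Because the mask is applied per image at training time, the model sees a different occluded region on every epoch, which encourages it to use the full image context rather than a single discriminative patch.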
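The vanishing gradient behaviour studied in the activation-functions project comes down to the local gradients of each activation. A small sketch of the relevant derivatives (my own helper names, not from the repository): the sigmoid's derivative never exceeds 0.25, so a product of such terms across many layers shrinks geometrically, while leaky ReLU passes a gradient of 1 for positive inputs and a small slope instead of 0 for negative ones.

```python
import numpy as np

def sigmoid_grad(x):
    """Derivative of the sigmoid; peaks at 0.25, so deep products vanish."""
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

def leaky_relu_grad(x, alpha=0.01):
    """Derivative of leaky ReLU: 1 for positive inputs, alpha otherwise."""
    return np.where(x > 0, 1.0, alpha)
```

With 20 stacked sigmoid layers the gradient magnitude is bounded by 0.25**20 (about 1e-12) before any weights are even applied, whereas active leaky ReLU units contribute a factor of exactly 1 per layer.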
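The dropout regularization covered in the neuralnet-regularization project is commonly implemented in its "inverted" form, where survivors are rescaled at training time so no adjustment is needed at inference. A minimal sketch under that assumption (the repository may implement it differently):

```python
import numpy as np

def dropout(x, p=0.5, train=True, rng=None):
    """Inverted dropout: zero each activation with probability p.

    Survivors are scaled by 1 / (1 - p) so the expected activation is
    unchanged, and the layer is the identity at inference time.
    """
    if not train or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)
```

With `p=0.5`, each activation is either zeroed or doubled during training, and passed through untouched when `train=False`.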
