| Project | Description | Link |
|---|---|---|
| Learning Rates | Compares gradient descent optimization methods such as Adam, Adagrad, and RMSProp, and examines their impact on training loss and training time. Also implements a cyclical learning rate policy (see the sketch below). | https://github.com/tamanna-a/learning-rates |
| Data Augmentation | Examines classic data augmentation and Cutout augmentation, which masks out random square sections of input images, on CIFAR-10 with a ResNet-44 model (see the Cutout sketch below). | https://github.com/tamanna-a/classic-data-augmentation |
| Activation Functions | Unpeels the layers and examines weights under various activation functions (sigmoid, tanh, ReLU, Leaky ReLU). Examines their gradients, tries different initializations, and uses Leaky ReLU to address the vanishing gradient problem (see the gradient sketch below). | https://github.com/tamanna-a/relu-activation |
| Neural Net Regularization | Applies batch normalization and dropout as regularizers (see the sketch below). | https://github.com/tamanna-a/neuralnet-regularization |
| Bias-Variance Tradeoff | Explores the bias-variance tradeoff (see the polynomial-fit sketch below). | https://github.com/tamanna-a/bias-variance |
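
A minimal sketch of a cyclical learning rate schedule, using PyTorch's built-in `CyclicLR` scheduler; the placeholder model, optimizer, and step sizes are illustrative assumptions, not the settings used in the repo:

```python
# Triangular cyclical learning rate (Smith, 2017): the LR oscillates
# linearly between base_lr and max_lr over each cycle.
import torch
from torch.optim.lr_scheduler import CyclicLR

model = torch.nn.Linear(10, 2)  # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
scheduler = CyclicLR(optimizer, base_lr=1e-4, max_lr=1e-2,
                     step_size_up=2000, mode="triangular")

for step in range(10):  # stand-in for a real training loop
    optimizer.zero_grad()
    loss = model(torch.randn(32, 10)).sum()
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the cyclical schedule each batch
```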
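
A minimal sketch of Cutout augmentation (DeVries & Taylor, 2017), which masks out a random square patch of each training image; the patch size and CHW tensor layout are assumptions, not the repo's exact configuration:

```python
import torch

def cutout(img: torch.Tensor, size: int = 8) -> torch.Tensor:
    """Mask a random size x size square of a CHW image with zeros."""
    _, h, w = img.shape
    cy = torch.randint(h, (1,)).item()  # random patch center
    cx = torch.randint(w, (1,)).item()
    y1, y2 = max(0, cy - size // 2), min(h, cy + size // 2)
    x1, x2 = max(0, cx - size // 2), min(w, cx + size // 2)
    img = img.clone()
    img[:, y1:y2, x1:x2] = 0.0  # occlude the patch
    return img
```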
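
A minimal sketch of the vanishing-gradient comparison, assuming PyTorch; the depth and width of the stack are illustrative, and the repo may use a different setup:

```python
import torch
import torch.nn as nn

def grad_norm(act: nn.Module, depth: int = 20) -> float:
    """Norm of the gradient that survives back to the input of a deep stack."""
    layers = []
    for _ in range(depth):
        layers += [nn.Linear(64, 64), act]
    net = nn.Sequential(*layers)
    x = torch.randn(8, 64, requires_grad=True)
    net(x).sum().backward()
    return x.grad.norm().item()

print("sigmoid:   ", grad_norm(nn.Sigmoid()))        # tends to shrink toward 0
print("leaky relu:", grad_norm(nn.LeakyReLU(0.01)))  # stays much larger
```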
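
A minimal sketch of a block combining batch normalization and dropout, assuming PyTorch; the layer sizes and dropout rate are illustrative:

```python
import torch.nn as nn

block = nn.Sequential(
    nn.Linear(128, 64),
    nn.BatchNorm1d(64),  # normalize activations per mini-batch
    nn.ReLU(),
    nn.Dropout(p=0.5),   # randomly zero units during training
    nn.Linear(64, 10),
)
# block.train() enables both regularizers; block.eval() switches to
# running BN statistics and disables dropout.
```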
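
A minimal sketch of the bias-variance tradeoff using polynomial regression on noisy data, assuming NumPy; the target function, noise level, and degrees are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, x.shape)  # noisy samples
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)                      # clean target

# Low degree underfits (high bias); very high degree overfits (high variance).
for degree in (1, 4, 15):
    coefs = np.polyfit(x, y, degree)
    err = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
    print(f"degree {degree:2d}: test MSE {err:.3f}")
```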