# List of projects examining Deep Learning Hyperparameters

| Project | Description | Link |
| --- | --- | --- |
| Learning rates | Compares gradient descent optimization methods such as Adam, Adagrad, and RMSProp, examining their impact on training loss and training time. Also implements a cyclical learning rate policy (see the sketch below the table). | https://github.com/tamanna-a/learning-rates |
| Data augmentation | Compares classic data augmentation with cutout augmentation, which masks out random square regions of input images, on CIFAR-10 with a ResNet-44 model (see the cutout sketch below). | https://github.com/tamanna-a/classic-data-augmentation |
| Activation functions | Peels back the layers to examine weights and gradients under various activation functions (sigmoid, tanh, ReLU, leaky ReLU), trying different initializations and using leaky ReLU to address the vanishing gradient problem (see the sketch below). | https://github.com/tamanna-a/relu-activation |
| Neural net regularization | Compares batch normalization and dropout as regularizers (see the sketch below). | https://github.com/tamanna-a/neuralnet-regularization |
| Bias-variance tradeoff | Explores the bias-variance tradeoff. | https://github.com/tamanna-a/bias-variance |
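
For the learning-rates project, here is a minimal sketch of a triangular cyclical learning rate policy, assuming PyTorch and its built-in `CyclicLR` scheduler. The model, learning rate bounds, and step size are illustrative placeholders, not values from the linked repository.

```python
import torch

model = torch.nn.Linear(10, 2)  # stand-in model for illustration
optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
scheduler = torch.optim.lr_scheduler.CyclicLR(
    optimizer,
    base_lr=0.001,       # lower bound of the learning rate cycle
    max_lr=0.006,        # upper bound of the cycle
    step_size_up=2000,   # iterations to climb from base_lr to max_lr
    mode="triangular",
)

for step in range(10):   # training loop stub with random data
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
    scheduler.step()     # learning rate is updated once per batch
```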
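For the data augmentation project, a minimal sketch of cutout: zero out a random square patch of the image. It assumes a CHW tensor, and the patch size of 16 matches a common CIFAR-10 setting; this is an illustrative re-implementation, not the repository's code.

```python
import torch

def cutout(img: torch.Tensor, size: int = 16) -> torch.Tensor:
    _, h, w = img.shape
    cy = torch.randint(h, (1,)).item()  # patch center; patch may overlap edges
    cx = torch.randint(w, (1,)).item()
    y1, y2 = max(cy - size // 2, 0), min(cy + size // 2, h)
    x1, x2 = max(cx - size // 2, 0), min(cx + size // 2, w)
    out = img.clone()
    out[:, y1:y2, x1:x2] = 0.0          # mask the patch with zeros
    return out

img = torch.rand(3, 32, 32)             # fake CIFAR-10-sized image
augmented = cutout(img)
```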
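For the activation functions project, a minimal sketch of why leaky ReLU helps with vanishing gradients: sigmoid saturates (gradient near zero) for large-magnitude inputs, while leaky ReLU keeps a small constant slope on the negative side. The inputs are illustrative.

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-8.0, -2.0, 0.5, 8.0], requires_grad=True)

y = torch.sigmoid(x).sum()
y.backward()
print(x.grad)   # near-zero at -8 and +8: the saturated regime

x.grad = None
y = F.leaky_relu(x, negative_slope=0.01).sum()
y.backward()
print(x.grad)   # 0.01 on negatives, 1.0 on positives: never zero
```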
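For the regularization project, a minimal sketch of the two techniques it compares, assuming PyTorch; the layer sizes are illustrative placeholders.

```python
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(784, 256),
    nn.BatchNorm1d(256),  # normalize each activation over the batch, then scale/shift
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly zero half the activations during training
    nn.Linear(256, 10),
)

net.train()               # dropout active, batch norm uses batch statistics
logits = net(torch.randn(32, 784))
net.eval()                # dropout off, batch norm uses running statistics
```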