---
description: Resources for teaching machine learning and deep learning
---
W&B is a great tool for learning and collaboration. We offer free academic accounts, and we've collected resources to help you and your students navigate complex machine learning projects.
## Academic Teams
Request an academic team to get started. Your students can submit results to a shared workspace, and grading projects is easy when you can see the real, tracked results and the code that students used to generate their models.
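Purely as an illustration, here's roughly how a student's training run lands in the shared team workspace. The entity and project names below are placeholders, not real workspaces, and the training loop is a stand-in for your own model code.

```python
import wandb

# Hypothetical team ("cs229-fall") and project ("homework-3") names;
# every student who runs this logs into the same shared workspace.
run = wandb.init(
    entity="cs229-fall",
    project="homework-3",
    config={"learning_rate": 1e-3, "epochs": 10},
)

# Stand-in training loop: log whatever metrics you want to grade.
for epoch in range(run.config.epochs):
    train_loss = 1.0 / (epoch + 1)  # replace with your model's real loss
    wandb.log({"epoch": epoch, "train_loss": train_loss})

run.finish()
```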
## Reports
Ask your students to submit reports so you can explore their results and compare new projects against previous baselines. Reports make it easy to describe intermediate results and show progress, and all the graphs are connected to real model results that you can reproduce. View an example report →
## Competitions
Create a project in your academic team and have your students compete to achieve the best accuracy on a shared task. In the example competition, each row is a different experiment, and users are competing for the highest accuracy. View the project →
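For that leaderboard-style view, each student only needs to log the shared metric. A minimal sketch, with placeholder team and project names and a stand-in accuracy value, might look like this:

```python
import wandb

# Placeholder team/project names; each student's run becomes one row
# in the shared project table, sortable by "accuracy".
run = wandb.init(entity="cs229-fall", project="accuracy-challenge")

val_accuracy = 0.9731  # stand-in for the accuracy your model actually achieves

# Log the headline metric; the run summary is what the project table displays.
wandb.log({"accuracy": val_accuracy})
run.summary["accuracy"] = val_accuracy

run.finish()
```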
We've built up a set of working examples of deep learning projects in different frameworks.
We also created hosted notebooks with models that your students can start training for free with the click of a button; a minimal sketch of that workflow appears after the list below.
- PyTorch introduction with screenshots
- Keras MNIST example
- TensorFlow 2 convolutional neural network
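The notebooks themselves live at the links above. As a rough sketch of the workflow they cover, here is a minimal Keras MNIST run with metrics logged to W&B; the team and project names are hypothetical, and the model is deliberately tiny.

```python
import wandb
from tensorflow import keras

# Hypothetical team/project names; the hosted notebooks set this up for you.
run = wandb.init(entity="cs229-fall", project="mnist-intro",
                 config={"epochs": 3, "batch_size": 128})

# Load and normalize MNIST.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# A small fully connected classifier, similar in spirit to the intro notebooks.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Train, then log per-epoch validation metrics and the final test accuracy.
history = model.fit(x_train, y_train,
                    epochs=run.config.epochs,
                    batch_size=run.config.batch_size,
                    validation_data=(x_test, y_test))
for epoch, (loss, acc) in enumerate(zip(history.history["val_loss"],
                                        history.history["val_accuracy"])):
    wandb.log({"epoch": epoch, "val_loss": loss, "val_accuracy": acc})

test_loss, test_accuracy = model.evaluate(x_test, y_test)
wandb.log({"test_accuracy": test_accuracy})
run.finish()
```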
Lukas and Chris have built a library of short tutorial projects with notes on each class section. Reach out to [email protected] with questions.
Here are some excellent videos, notes, and slides from around the web. These are great additions to any curriculum on machine learning.
- Introduction to Machine Learning for Coders: Great fastai course with videos, example code, and a vibrant forum for support
- Full Stack Deep Learning: Taught by our friend Josh Tobin, this awesome course will take you to the next level of expertise in building deep learning models
- Stanford CS230 Deep Learning: Lectures and slides are available online for this awesome Stanford course taught by Andrew Ng.
- MIT Intro to Deep Learning: Accessible introduction taught by Alexander Amini and Ava Soleimany
- Troubleshooting Deep Neural Networks: Josh Tobin's excellent slide deck on debugging models
- Sequence to sequence models: Slides from Stanford's computational linguistics class
- Transfer and multi-task learning: Sergey Levine from Berkeley
- Transformer models: Richard Socher from Stanford
- Keras intro to seq2seq: A fast intro from the Keras team
- Original seq2seq paper: Ilya Sutskever and colleagues from Google
- Berkeley slides: Encoder-decoder, seq2seq, and machine translation
- OpenAI GPT-2: a model for generating realistic text
- TalkToTransformer.com: try GPT-2 in your browser
- GLUE Benchmark: resources for training and analyzing natural language systems
- SuperGLUE: updated and improved v2 of the GLUE benchmark
- Livox: uses NLP in an alternative communication app
- Practical Twitter Content Mining: medical journal article about using NLP on tweets
- Applications of NLP: Medium article describing 10 interesting applications
- Zero-shot transfer learning + LSTMs: Enhancing translation technology for minimally verbal individuals
If you are teaching a class, we would love to support you. Please reach out to us at [email protected].