
Getting-started with deel-lip #78

Open · wants to merge 19 commits into master

Conversation

Sharing-Sam-Work (Contributor)

I have created two new tutorial notebooks tailored for professionals in industrial sectors who want to quickly grasp the practical applications of the package. These tutorials offer a concise introduction to using the package to build and train robust 1-Lipschitz deep learning models.

The focus is on practical implementation rather than a deep dive into the theory. The tutorials include practical suggestions to improve usability, making them ideal for readers aiming for a working knowledge of the library and its functionalities rather than an exhaustive theoretical exploration.
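To give a flavor of what the notebooks walk through, here is a minimal sketch of building and compiling a 1-Lipschitz classifier with deel-lip. It is illustrative only (not lifted from the notebooks), and the architecture and hyperparameter values are assumptions:

```python
import tensorflow as tf

# deel-lip building blocks: spectrally constrained layers, a 1-Lipschitz
# activation, and a loss exposing a robustness/accuracy hyperparameter.
from deel.lip.layers import SpectralConv2D, SpectralDense, ScaledL2NormPooling2D
from deel.lip.activations import GroupSort2
from deel.lip.losses import TauCategoricalCrossentropy
from deel.lip.model import Sequential  # tracks the network's Lipschitz constant

model = Sequential(
    [
        tf.keras.layers.Input(shape=(28, 28, 1)),
        SpectralConv2D(16, (3, 3)),   # convolution with spectral normalization
        GroupSort2(),                 # gradient-norm-preserving activation
        ScaledL2NormPooling2D(),      # norm-preserving pooling
        tf.keras.layers.Flatten(),
        SpectralDense(10),            # constrained output layer (10 classes)
    ],
    k_coef_lip=1.0,                   # target Lipschitz constant of the whole model
)

model.compile(
    optimizer="adam",
    loss=TauCategoricalCrossentropy(tau=10.0),  # tau tunes accuracy vs. robustness
    metrics=["accuracy"],
)
```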

I have tested the changes with `tox -e py310-lint`.
I have also visualized the changes with `mkdocs serve`.

If the changes are validated, the Google Colab links in docs/index.md will need to be changed so that they point to the deel-lip repository rather than mine (Sharing-Sam-Work), as in the row below:
| Getting started 1 - Creating a 1-Lipschitz neural network | Open In Colab |
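For reference, the final badge row in docs/index.md would look something like the following; the exact branch and notebook path are assumptions until the notebooks are merged upstream:

```markdown
| Getting started 1 - Creating a 1-Lipschitz neural network | [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deel-ai/deel-lip/blob/master/docs/notebooks/Getting_started_1.ipynb) |
```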

@cofri (Collaborator) commented Aug 9, 2023

Thanks @Sharing-Sam-Work for your contribution!
Some unit tests are failing, but they are fixed in the pending PR #76. Once #76 is merged, just rebase your branch to make the unit tests pass.


@cofri (Collaborator) left a comment


Very good introductory notebooks! I believe they make things clearer and smoother for inexperienced users. These tutorials showcase the main tools of deel-lip: the custom layers and the specific losses.
I suggested some improvements. Feel free to take them into account or not. We can discuss them later on.

Review threads (outdated, resolved) on:

  • docs/notebooks/Getting_started_1.ipynb (four threads)
  • deel/lip/layers/convolutional.py (one thread)
  • docs/notebooks/Getting_started_2.ipynb (two threads)
Comment on lines 162 to 168
"We show two cases. In the first case, we use `deel-lip`'s `TauCategoricalCrossentropy` from the `losses` submodule. In the second case, we use another loss function from `deel-lip`: `MulticlassHKR`.\n",
"\n",
"In particular, we will show how these functions can be parametrized to increase the robustness of our predictive models. We will also see that generally, there is a compromise between the robustness and the accuracy of our models (i.e. better robustness generally comes at the price of a decrease in performance).\n",
"\n",
@cofri (Collaborator) commented:

Before talking about the two cases, I think we can emphasize the importance of the loss when training 1-Lipschitz networks: especially the fact that there is a trade-off between accuracy and robustness, and that all our deel-lip losses provide hyper-parameters to tweak this trade-off. The user should understand here that training Lipschitz-constrained networks requires using our losses for better control of the trade-off.

@Sharing-Sam-Work (Contributor, Author) replied:

Done.
Changed the content to:

> 🎮 Control over the accuracy-robustness trade-off with deel-lip's loss functions.
>
> When training 1-Lipschitz networks, one will see that there is a compromise between the robustness and the accuracy of the models. In simple terms, achieving stronger robustness often involves sacrificing some performance.
>
> In this section, we will show the pivotal role of deel-lip's loss functions in training 1-Lipschitz networks. Each of these functions comes with its own set of hyperparameters, enabling you to precisely navigate and adjust the balance between accuracy and robustness.
>
> We show two cases. In the first case, we use deel-lip's TauCategoricalCrossentropy from the losses submodule. In the second case, we use another loss function from deel-lip: MulticlassHKR.
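For concreteness, here is a hedged sketch of how the two losses discussed above are parametrized in deel-lip; the `tau`, `alpha`, and `min_margin` values are illustrative, not the notebook's:

```python
from deel.lip.losses import TauCategoricalCrossentropy, MulticlassHKR

# Case 1: temperature-scaled categorical cross-entropy.
# A lower tau pushes toward robustness; a higher tau favors raw accuracy.
loss_cce = TauCategoricalCrossentropy(tau=5.0)

# Case 2: multiclass hinge-Kantorovich-Rubinstein (HKR) loss.
# alpha weighs the hinge (accuracy) term against the KR (robustness) term,
# and min_margin sets the margin the hinge term enforces.
loss_hkr = MulticlassHKR(alpha=50.0, min_margin=0.05)
```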

Two further review threads (outdated, resolved) on docs/notebooks/Getting_started_2.ipynb.
@thib-s (Member) commented Aug 21, 2023

Thank you @Sharing-Sam-Work for this great work 👍
In order to fully integrate this work, we need to:

  • integrate @cofri's comments
  • add the badges pointing to the final URL of the notebooks
  • rebase the branch and squash commits

My suggestion is to merge this PR into a temporary branch here in this repo, so we can take care of this without requiring @Sharing-Sam-Work to do it.

Kierszbaum Samuel added 10 commits August 22, 2023 09:55
…rofessionals in industrial sectors who want to swiftly grasp the practical applications of the package. These tutorials offer a concise introduction to utilizing the package for producing and training robust 1-Lipschitz deep learning models. The focus here is on practical implementation rather than delving deeply into theoretical aspects. We provide practical suggestions to enhance user-friendliness and usability, making these tutorials ideal for those aiming for a practical working knowledge of the library and its functionalities, rather than an exhaustive theoretical exploration.
…class' k parameter in Getting-Started 1 and changed the hyper-parameter of the HKR loss function for Getting-Started 2)
@Sharing-Sam-Work (Contributor, Author) commented:

I have taken the comments into account.
