
Can I compute the L2 local Lipschitz constant with the framework? #47

Open
kwmaeng91 opened this issue May 7, 2023 · 4 comments

@kwmaeng91

Hi, I am curious if the framework supports calculating the L2 local Lipschitz constant (bound of the L2-norm of the Jacobian).
The NeurIPS 2022 paper seems to suggest that only the L-inf norm is supported. However, under examples/vision/jacobian.py there are three examples, and I was not sure whether one of them calculates the L2 local Lipschitz constant.

Specifically, I wasn't sure what the first example is calculating without the norm=np.inf argument here:

# Example 1: Convert the model for Jacobian bound computation

Can you explain the difference between the first and the second example in the above link, and comment on whether there is a way for me to bound the L2-norm of the Jacobian with the framework in general?

Thank you for the help!

@shizhouxing
Member

Hi @kwmaeng91 ,

There is no example of computing the L2 local Lipschitz constant in the current code.
In the first example, we just bound the entries of the Jacobian matrix without taking any norm, so the result is not a Lipschitz constant. The second example computes the Linf local Lipschitz constant.

The framework is intended for the Linf norm only for now. For the L2 norm, which involves spectral norms, the current LiRPA framework seems to produce loose bounds.
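To illustrate the relationship described above (this is a plain NumPy sketch, not auto_LiRPA code; the Jacobian interval bounds below are made-up numbers): if you have elementwise lower/upper bounds on the Jacobian over a local region, the Linf local Lipschitz constant is bounded by the induced inf-norm (max absolute row sum) of the elementwise max-magnitude matrix. For L2, the spectral or Frobenius norm of that same matrix is a valid upper bound, but it is typically loose compared to the true spectral norm of the Jacobian, which is the point about spectral norms above.

```python
import numpy as np

# Hypothetical elementwise bounds on the Jacobian over a local region:
# J_lo[i, j] <= d f_i / d x_j <= J_hi[i, j]  (made-up numbers for illustration).
J_lo = np.array([[-0.5, 0.2], [0.1, -0.3]])
J_hi = np.array([[ 0.4, 0.8], [0.6,  0.2]])

# Elementwise maximum magnitude the Jacobian can take in the region.
M = np.maximum(np.abs(J_lo), np.abs(J_hi))

# Linf local Lipschitz bound: induced inf-norm = max absolute row sum.
lip_inf = M.sum(axis=1).max()

# Valid (but typically loose) L2 bounds, since |J_ij| <= M_ij elementwise
# implies ||J||_2 <= ||M||_2 <= ||M||_F.
lip_l2_spec = np.linalg.norm(M, 2)      # spectral norm of the magnitude matrix
lip_l2_frob = np.linalg.norm(M, 'fro')  # Frobenius norm, even looser

print(lip_inf, lip_l2_spec, lip_l2_frob)
```

The looseness comes from discarding sign/correlation information inside the region: the true L2 constant is sup over x of the spectral norm of J(x), which can be much smaller than the spectral norm of the elementwise magnitude bound M.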

@kwmaeng91
Author

Hi, @shizhouxing, thank you for the reply!

Is there any other framework you would recommend for the L2 norm? It seems your team has published a stream of works and codebases related to this problem, so if you have any other project you think would be more relevant, I would love to know (e.g., would RecurJac or Fast-Lip be helpful?).
If not, I would like to know whether it is possible to compute the L2 norm with this framework, even if the bound is loose. If the result is still reasonable, I would like to learn how it can be done.

Thank you.

@shizhouxing
Member

I previously tried computing the L2 norm as well, and the code could produce some L2 results at that time. I need to check the current code and get back to you.

RecurJac and Fast-Lip are special cases of this new framework, so I don't think they can produce tighter results. I know LipSDP ([13] cited in the NeurIPS 2022 paper) can compute L2 results, though it may be limited to small models due to its high cost.

@kwmaeng91
Author

Thanks @shizhouxing!! I will be awaiting your response.
