
Tutorial notebooks: PerSAM #21

Open
NielsRogge opened this issue May 24, 2023 · 2 comments
Labels: good first issue (Good for newcomers)

Comments

@NielsRogge

Hi PerSAM authors :)

As your method is really cool, I've contributed it to Hugging Face.

Here are 2 demo notebooks showcasing the PerSAM and PerSAM-f methods: https://github.com/NielsRogge/Transformers-Tutorials/tree/master/PerSAM.

Note that the Hugging Face implementation of SAM is used.
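For reference, a minimal sketch of point-prompted inference with the Hugging Face SAM classes (`SamModel`/`SamProcessor` from `transformers`). This is an assumption-laden illustration, not the notebooks' exact code: the `facebook/sam-vit-base` checkpoint and the dummy image below are placeholders.

```python
# Minimal sketch of SAM inference via Hugging Face `transformers`.
# Assumptions: the "facebook/sam-vit-base" checkpoint (the notebooks may
# use a larger variant) and a dummy black image with a single point prompt.
import numpy as np
import torch
from PIL import Image
from transformers import SamModel, SamProcessor

processor = SamProcessor.from_pretrained("facebook/sam-vit-base")
model = SamModel.from_pretrained("facebook/sam-vit-base")

# Dummy 256x256 image and one 2D point prompt at its center.
image = Image.fromarray(np.zeros((256, 256, 3), dtype=np.uint8))
inputs = processor(image, input_points=[[[128, 128]]], return_tensors="pt")

with torch.no_grad():
    outputs = model(
        pixel_values=inputs["pixel_values"],
        input_points=inputs["input_points"],
    )

# Upscale the low-resolution mask logits back to the original image size;
# SAM returns three candidate masks per prompt by default.
masks = processor.image_processor.post_process_masks(
    outputs.pred_masks,
    inputs["original_sizes"],
    inputs["reshaped_input_sizes"],
)
print(masks[0].shape)  # one boolean mask tensor per input image
```

The notebooks build PerSAM's feature-matching on top of this same processor/model pair.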

Cheers!

Btw there's already follow-up work: https://twitter.com/ducha_aiki/status/1660967979972960258

@ZrrSkywalker
Owner

@NielsRogge Hi, thanks very much for your clear and valuable tutorials! We really appreciate your interest and contribution.
We have linked to your tutorials in the README file.

@ZrrSkywalker added the good first issue (Good for newcomers) label on May 24, 2023
@thomfoster

Hey @ZrrSkywalker and @NielsRogge - thanks for your work in making this project more accessible.

I'm trying to understand how the target-guided attention and target-semantic prompting steps work. Am I right in thinking this isn't done in the notebooks Niels provided, and that to enable those features I would need to use the SAM predictor from the original repo?

Best,
Thom
