
Is it possible to run inference in batch mode #61

Open
pumetu opened this issue Sep 7, 2023 · 4 comments

Comments

pumetu commented Sep 7, 2023

Hello,
I was wondering whether LightGlue could be run in batch mode. I have to match a large number of images, and inference time is currently the bottleneck.
Thank you for your work!

sarlinpe (Member) commented
Yes, but you'd need to:

  • disable the adaptive depth and width mechanisms;
  • make sure that the detector finds a fixed number of keypoints in all images, e.g. with max_num_keypoints=1024, detection_threshold=0.0.
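The two requirements above can be sketched against the cvg/LightGlue API. This is a minimal sketch, assuming the README's interface: the parameter names depth_confidence and width_confidence (set to -1 to disable the adaptive mechanisms) and the dict-based forward calls are taken from the repo and should be verified against your installed version.

```python
# Hedged sketch of batched LightGlue matching, assuming the cvg/LightGlue
# API (SuperPoint extractor + LightGlue matcher). Parameter names are
# taken from the repo's README -- verify against your installed version.
import torch
from lightglue import LightGlue, SuperPoint

device = "cuda" if torch.cuda.is_available() else "cpu"

# Fixed keypoint count in every image: max_num_keypoints caps the count,
# and detection_threshold=0.0 makes the cap reachable for all images.
extractor = SuperPoint(max_num_keypoints=1024,
                       detection_threshold=0.0).eval().to(device)

# depth_confidence / width_confidence = -1 disable the adaptive early-exit
# and point-pruning mechanisms, so every batch item runs through the same
# computation graph.
matcher = LightGlue(features="superpoint",
                    depth_confidence=-1,
                    width_confidence=-1).eval().to(device)

@torch.no_grad()
def match_batch(images0, images1):
    """images0, images1: (B, 1, H, W) grayscale tensors in [0, 1]."""
    feats0 = extractor({"image": images0})
    feats1 = extractor({"image": images1})
    return matcher({"image0": feats0, "image1": feats1})
```

Because every image yields exactly 1024 keypoints, the feature tensors stack into regular batches without padding.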

mmmmmjaai commented

Hi, could you share your batch-mode code with me? I'm a beginner and ran into some trouble with it. Thank you!

kkamalrajk commented

@sarlinpe Could you please help me with the batch-processing code?

nguyenhoangvudtm23 commented Jul 12, 2024

@sarlinpe What do you think about the performance if I zero-pad the keypoints and disable point pruning? Would it be significantly reduced?
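The zero-padding idea above can be sketched generically: pad variable-length keypoint sets to a common size and track validity with a boolean mask. This is a hypothetical helper, not part of LightGlue's own API; whether the matcher actually respects such a mask depends on the implementation.

```python
import numpy as np

def pad_keypoints(kpts_list, max_kpts):
    """Zero-pad a list of (N_i, 2) keypoint arrays to a (B, max_kpts, 2)
    batch, returning a (B, max_kpts) boolean mask marking real keypoints."""
    batch = np.zeros((len(kpts_list), max_kpts, 2), dtype=np.float32)
    mask = np.zeros((len(kpts_list), max_kpts), dtype=bool)
    for i, kpts in enumerate(kpts_list):
        n = min(len(kpts), max_kpts)  # truncate if over the cap
        batch[i, :n] = kpts[:n]
        mask[i, :n] = True
    return batch, mask

# Two images with different detection counts, padded to 1024 each:
kpts = [np.random.rand(700, 2), np.random.rand(1024, 2)]
padded, mask = pad_keypoints(kpts, 1024)
print(padded.shape, mask.sum(axis=1))  # (2, 1024, 2) [ 700 1024]
```

The padded entries contribute zero features, so the main performance cost is that every item pays for max_kpts keypoints regardless of how many are real.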


5 participants