
Documentation for used metrics? #412

Open
Mohamed-Dhouib opened this issue Nov 16, 2024 · 1 comment
Comments

@Mohamed-Dhouib

Hello,

I was wondering if the metrics used are documented somewhere. For example, for VideoMME, the metric appears to be accuracy (which aligns with the official paper), but the name of the metric in the output (VIDEOMME_perception) is a bit confusing.

Thanks in advance!

@kcz358
Collaborator

kcz358 commented Nov 23, 2024

Some of the pre-defined (commonly used) metrics are registered in api/metrics.py. Other tasks, such as videomme, have their own logic for calculating scores. You can find that scoring logic in each task's own folder, mostly in its utils.py, for example tasks/videomme/utils.py.
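For readers unfamiliar with this pattern, here is a minimal sketch of what a decorator-based metric registry might look like. This is an illustration only, not the actual code in api/metrics.py; the names `METRIC_REGISTRY`, `register_metric`, and the `accuracy` function are all hypothetical.

```python
# Hypothetical sketch of a decorator-based metric registry.
# Names (METRIC_REGISTRY, register_metric) are assumptions for
# illustration, not the real identifiers in api/metrics.py.

METRIC_REGISTRY = {}

def register_metric(name):
    """Register a scoring function under a metric name."""
    def decorator(fn):
        METRIC_REGISTRY[name] = fn
        return fn
    return decorator

@register_metric("accuracy")
def accuracy(predictions, references):
    # Fraction of exact matches between predictions and gold answers.
    if not references:
        return 0.0
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)

# Task-specific scoring (e.g. tasks/videomme/utils.py) can live outside
# such a registry and report per-category results, which would explain
# output names like VIDEOMME_perception.
score = METRIC_REGISTRY["accuracy"](["A", "B", "C"], ["A", "B", "D"])
print(score)
```

Under this sketch, a common metric is looked up by name from the registry, while a task like videomme computes and names its own scores (e.g. per perception category) in its task folder.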
