Model Classifies all brown or dark people as spoof #26

Open
imr555 opened this issue Oct 18, 2021 · 1 comment
imr555 commented Oct 18, 2021

I tested your model in real life against real and spoofed photos and videos of me and some of my friends.

The number of false negatives for the real class (predicting spoof where it should predict real) was quite astonishing.

This might be because the CelebA-Spoof dataset is biased towards white people in general, or because the model is too weak to pick up on the actual spoofing cues. A pattern-based model might fare better on people of other ethnicities.
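For clarity, here is a minimal sketch of how that false-negative rate on the real class could be measured; the `predict_is_real` wrapper and the sample format are hypothetical stand-ins, not part of this repository:

```python
def false_negative_rate_for_real(samples, predict_is_real):
    """Hypothetical evaluation helper.

    samples: iterable of (image, true_label) pairs, true_label in {"real", "spoof"}.
    predict_is_real: callable returning True when the model classifies the image as real.
    """
    real_faces = [img for img, label in samples if label == "real"]
    if not real_faces:
        return 0.0
    # A false negative here = a genuine (real) face that the model calls spoof.
    misses = sum(1 for img in real_faces if not predict_is_real(img))
    return misses / len(real_faces)
```

Running something like this separately per skin-tone group would make the gap measurable rather than anecdotal.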

imr555 changed the title from "Model Classifies all brown or dark people as spoof." to "Model Classifies all brown or dark people as spoof" on Oct 18, 2021
KananVyas commented

Is there any spoofed dataset for Indian/brown faces?
