improve validation process #127
Comments
We have a very similar approach in NWB Inspector: BEST PRACTICE SUGGESTIONS: suggested improvements that would be nice. Since we are already so close, it might be nice to converge on a system that can be used for both.

Thoughts, @CodyCBakerPhD?
Indeed, this is very similar to our approach, which Ben highlights nicely. It's worked well for us so far in providing a global perspective of all the things that could be improved in any given set of NWBFiles, but in order of 'what needs to get fixed ASAP' as opposed to 'what would be nice to have'. I'd just like to point out two things for our side:
- CRITICAL: 4 checks. Even as we finish setting up the remaining checks on the inspector, this pattern is pretty consistent in that most new ones we add are not CRITICAL.
- Depending on the field in question, you might not be 100% sure that a certain field is relevant to that experiment, so having an intermediate 'ignorable' level is just as important as having an outright 'blockable' one.
I'm +1'ing this from the archive side. We've run into places where we're unable to distinguish between a user input error and genuine bugs. I'm indifferent to the approach, but a common pattern I've seen is creating a base exception class that more specific validation errors subclass.
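Sketching that pattern for concreteness (a hypothetical hierarchy; none of these class names exist in dandi-schema today), the idea is that a shared base class lets callers separate "fix your metadata" from "this is a bug in the validator":

```python
# Hypothetical sketch of the base-exception pattern described above;
# the class names are illustrative and not part of dandi-schema.

class MetadataValidationError(Exception):
    """Base class for every problem found while validating metadata."""

class UserInputError(MetadataValidationError):
    """The submitted metadata is wrong or incomplete; the user can fix it."""

class InternalValidationError(MetadataValidationError):
    """The validator itself misbehaved; a genuine bug rather than user error."""


def handle(err: Exception) -> None:
    # A single isinstance check replaces string-matching on exception messages.
    if isinstance(err, UserInputError):
        print(f"please correct your metadata: {err}")
    elif isinstance(err, MetadataValidationError):
        print(f"validator bug, please report it: {err}")
    else:
        raise err
```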
current json and pydantic validation only raises exceptions. we may also want to consider levels of validation, e.g., ALL, WARN, CRITICAL:
- ALL: "would be nice if all these metadata were provided"
- WARN: "we don't know if this is applicable to you, but it was not provided"
- CRITICAL: "missing metadata - dandiset cannot be published without these"

level names are suggestions at this point.
once this is implemented: this would also be tied to automigration of any draft dandiset/asset metadata. the validator should be able to indicate to the user whether automigration would solve any validation issues or whether human intervention (reupload, reconvert, edit metadata in the GUI, etc.) is required.
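One way that automigration hint could be surfaced, purely as a sketch (the `auto_migratable` flag and `plan_fixes` helper are made up for illustration):

```python
# Hypothetical sketch: tag each finding with whether automigration would fix it.
from dataclasses import dataclass


@dataclass
class Finding:
    field: str
    message: str
    auto_migratable: bool  # True if migrating the draft metadata resolves it


def plan_fixes(findings: list[Finding]) -> tuple[list[Finding], list[Finding]]:
    """Split findings into auto-fixable ones and those needing human intervention."""
    auto = [f for f in findings if f.auto_migratable]
    manual = [f for f in findings if not f.auto_migratable]
    return auto, manual


findings = [
    Finding("schemaVersion", "outdated schema version", auto_migratable=True),
    Finding("contributor", "no contributor with a ContactPerson role", auto_migratable=False),
]
auto, manual = plan_fixes(findings)
# The report could then say: the first issue disappears after automigration,
# while the second needs a reupload, reconversion, or an edit in the GUI.
```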