Support multi-session quality_control by tracking metric or evaluation provenance #1090

Open
dbirman opened this issue Oct 3, 2024 · 1 comment
dbirman commented Oct 3, 2024

Users have requested that we find a way to track multi-session quality control in the main schema, rather than in some kind of extension.

Short explanation for why this is needed:

  • A user has ten assets that each run their own raw/processing QC and pass; the user then needs to compare the assets somehow, e.g. by matching the ophys FOV.
  • They run a multi-session capsule that generates evaluations/metrics to check whether the FOV is the same. This is a blocking step because the annotation is manual.
  • Someone annotates the metrics and marks that one asset has a different FOV; that asset needs to be dropped from further multi-session analysis.
  • A second multi-session capsule is then run that needs to pull only the assets with the same FOV. To match metrics to assets, the metrics need to track which asset they were generated from.
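The workflow above could be sketched roughly as follows. This is a minimal illustration, not the actual aind-data-schema model: the `Metric` class and its `evaluated_asset` provenance field are hypothetical names chosen for this example.

```python
from dataclasses import dataclass

# Hypothetical sketch: class and field names are illustrative assumptions,
# not the real quality_control schema.
@dataclass
class Metric:
    name: str
    value: bool           # e.g. result of the manual FOV-match annotation
    evaluated_asset: str  # provenance: which asset this metric was generated from

# Metrics produced by the multi-session capsule, then manually annotated
metrics = [
    Metric("fov_match", True, "asset-01"),
    Metric("fov_match", False, "asset-02"),  # different FOV, should be dropped
    Metric("fov_match", True, "asset-03"),
]

# The second multi-session run pulls only assets whose FOV metric passed;
# without the evaluated_asset field this lookup would be impossible.
keep = [m.evaluated_asset for m in metrics if m.name == "fov_match" and m.value]
print(keep)  # ['asset-01', 'asset-03']
```

The point of the sketch is just that the metric → asset link has to live on the metric itself for the filtering step to work.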
@dbirman dbirman self-assigned this Oct 3, 2024

dbirman commented Oct 3, 2024

Notes from discussion on 10/3:

  • There's no easy way to generate isolated data assets the way I had hoped, so dumping the quality_control.json isn't that simple; people will have to use the data-access-api to do this, with limited permissions.
  • We need to stratify QC status by local vs. global scope, and record whether evaluations have multiple upstream dependencies.
  • We need to mark evaluations (or metrics?) as having external dependencies, and possibly list their dependency structure using the "name" fields.
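The scope stratification in the notes above might look something like this. The `Scope` enum, the `evaluated_assets` dependency list, and the rule "more than one upstream asset implies global scope" are all assumptions made for the sake of the sketch:

```python
from dataclasses import dataclass, field
from enum import Enum

# Hypothetical sketch: Scope values and the evaluated_assets list are
# assumptions, not fields of the real schema.
class Scope(Enum):
    LOCAL = "local"    # evaluation depends only on its own asset
    GLOBAL = "global"  # evaluation has external/multiple upstream dependencies

@dataclass
class Evaluation:
    name: str
    # Dependency structure tracked via the "name" fields of upstream assets
    evaluated_assets: list[str] = field(default_factory=list)

    @property
    def scope(self) -> Scope:
        # More than one upstream asset => multi-session (global) evaluation
        return Scope.GLOBAL if len(self.evaluated_assets) > 1 else Scope.LOCAL

single = Evaluation("raw-data-qc", ["asset-01"])
multi = Evaluation("fov-match", ["asset-01", "asset-02", "asset-03"])
print(single.scope, multi.scope)  # Scope.LOCAL Scope.GLOBAL
```

Deriving the scope from the dependency list, rather than storing it separately, avoids the two fields ever disagreeing, though the discussion leaves open whether the flag belongs on evaluations or on individual metrics.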
