
[BUG]: Calibrate testing - Process erroring out #4345
Open
ngraham76 opened this issue Aug 1, 2024 · 1 comment
Labels: bug (Something isn't working), Q&A (Quality Assurance)

Comments

@ngraham76 (Contributor)

Describe the bug
When going through the test, at step 3.4 (running Calibrate), error messages appeared instead of the loss chart during processing and the calibrated output results. See the screenshot below:
[Screenshot: error messages displayed by the Calibrate operator]

Link to the test:
https://app.staging.terarium.ai/projects/e97fac66-3bab-4e48-8f8f-715e32eff43e/workflow/a57405c5-e3ea-4366-985e-73c186d26925?operator=ab7b727c-b46c-4d13-893e-fb914ccbedd7

ngraham76 added the bug (Something isn't working) and Q&A (Quality Assurance) labels on Aug 1, 2024
@mwdchang (Member) commented Aug 1, 2024

The data in the workflow is inconsistent and doesn't make sense:

  • The model configurations have inferredParameterList filled in, which should not be possible if they are "default configurations".
  • The model in question has three model configs, and all of them have inferredParameterList. This is also impossible, because at least one config without an inferredParameterList is needed in order to derive another config that has one (see the sketch below).
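
A minimal sketch of the consistency rule described above, assuming a hypothetical ModelConfiguration shape (the field names isDefault and inferredParameterList here are illustrative, not Terarium's actual data model or API):

```typescript
// Illustrative only: hypothetical field names, not Terarium's real API.
interface ModelConfiguration {
	id: string;
	isDefault: boolean;
	inferredParameterList?: unknown[];
}

function findConfigInconsistencies(configs: ModelConfiguration[]): string[] {
	const problems: string[] = [];

	for (const config of configs) {
		// A default configuration should not carry calibrated (inferred) parameters.
		if (config.isDefault && config.inferredParameterList?.length) {
			problems.push(`Default config ${config.id} has an inferredParameterList`);
		}
	}

	// A calibrated config is derived from an uncalibrated one, so at least one
	// config must lack an inferredParameterList.
	if (configs.length > 0 && configs.every((c) => c.inferredParameterList?.length)) {
		problems.push('Every config has an inferredParameterList; no base config to calibrate from');
	}

	return problems;
}
```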
