Inference #18
Comments
Hi, thank you for your interest in our work. I provide an inference script in the dev branch, named `prediction.py`. The usage is simple:

- Specify the path of the testing image folder.
- Provide the path of the model weights; you can use multiple model weights from different training folds for a model ensemble.
- Set the `target_spacing` to be the same as the training spacing.
- There is one step that you need to do manually: modify the normalization (clip, mean, std, etc.) in the preprocessing of `prediction.py` so it matches what was used in training (see the sketch after this list).

I'm developing some new features and support for more datasets at the moment. The inference code will be merged into the main branch after I finish the new code. Feel free to let me know if you have any questions about the inference code.
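For reference, here is a minimal sketch of what that manual normalization step might look like. The clip range, mean, and std below are placeholders, not values from this repository; substitute the statistics from your own training configuration.

```python
import numpy as np

# Placeholder values for illustration only; replace with the clip window
# and intensity statistics used when training your model.
CLIP_MIN, CLIP_MAX = -175.0, 250.0
MEAN, STD = 50.0, 75.0

def normalize(image: np.ndarray) -> np.ndarray:
    """Clip and z-score an image with the same statistics as training."""
    image = np.clip(image, CLIP_MIN, CLIP_MAX)
    return (image - MEAN) / STD
```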
Hello @yhygao! I'm trying to run `prediction.py`. I have copied the preprocessing part from training into the preprocessing of `prediction.py` as you suggest, but I get an error saying "RuntimeError: Sizes of tensors must match except in dimension 1. Expected 170 but got size 169 for tensor number 2 in the list." Do you have any idea where it could come from? Thank you in advance!
Could you please provide more context so I can identify the problem? For example, which line of the script raises this error? This is typically because the size of a tensor is not what is expected, possibly because of interpolation. You can also debug to check whether the tensor sizes are what you expect.
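Not the repository's code, but as a general illustration: errors like this usually come from `torch.cat` on feature maps whose spatial sizes drifted by one voxel (e.g. 169 vs. 170) after resampling or upsampling. A common generic workaround is to resize one tensor to the other's spatial size before concatenating (the sketch below assumes 3D volumes, i.e. 5D tensors):

```python
import torch
import torch.nn.functional as F

def cat_with_matching_size(up: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
    # If the upsampled feature map is off by a voxel or two, resize it to
    # the skip connection's spatial size before concatenating along channels.
    if up.shape[2:] != skip.shape[2:]:
        up = F.interpolate(up, size=skip.shape[2:], mode='trilinear',
                           align_corners=False)
    return torch.cat([up, skip], dim=1)
```

Whether this is the right fix here depends on where the mismatch originates; checking the shapes right before the failing concatenation is the first step.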
I've pushed a new update. The new prediction.py supports both 2D and 3D inference. |
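As a rough illustration of the fold-ensemble idea mentioned above (a generic sketch, not the actual `prediction.py`), averaging softmax probabilities over the models from different folds can look like this; the same code works for 2D or 3D inputs since it only touches the channel dimension:

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def ensemble_predict(models, image):
    # `models` is a list of loaded networks (one per training fold);
    # `image` is a preprocessed tensor of shape (B, C, [D,] H, W).
    probs = None
    for model in models:
        model.eval()
        p = F.softmax(model(image), dim=1)
        probs = p if probs is None else probs + p
    probs = probs / len(models)
    return probs.argmax(dim=1)  # predicted label map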
Hello @yhygao,
Thank you for your work!
I have a question about inference, though.
I would like to know if there will be code to run inference in the near future.
In addition, it would be really helpful to have documentation on how to run inference.