Model initialization step within colab using tpu and default configuration exits with error. #24
I got it to work by changing the line to:
Could you maybe share your version of the Colab notebook? I am still experiencing issues after changing the line to your suggestion... Thanks a lot for the help! :)
Hey @corlangerak, did you remove the pip install hypernerf line in the Colab notebook? Changing the line locally wouldn't take effect otherwise, because you'd still be loading the pip-installed hypernerf instead of your local copy.
@saunair Hi Nair, I faced the same issue. I tried changing the line as you suggested above, though. Could you give me slightly more specific instructions?
@hsauod Check these two changes: ae29d1d#diff-433be35a4beb7eeee9224dcbe28ec97d53330cd175060905cd5217863674003cR114, and see the second cell in my notebook here: https://github.com/saunair/hypernerf/blob/main/notebooks/HyperNeRF_Training.ipynb. @corlangerak here you go, sorry about the delay.
@saunair Hi Nair, thank you very much for your kind explanation.
Model initialization step within colab using tpu and default configuration exits with error.
The errors are nested through jax and hypernerf, but it appears that the root is hypernerf/hypernerf/model_utils.py, line 119 at d433ebe, within the volumetric_rendering function: jnp.broadcast_to([last_sample_z], z_vals[..., :1].shape).
The relevant error is
A quick search brought up things like https://jax.readthedocs.io/en/latest/notebooks/Common_Gotchas_in_JAX.html#non-array-inputs-numpy-vs-jax, which suggests all inputs should be converted to jnp arrays first. I haven't gotten it working yet, though.
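For anyone hitting the same thing, here is a minimal sketch of that workaround: convert the Python-list argument to a jnp array before calling jnp.broadcast_to. The toy shapes and values below are assumptions for illustration, not the model's real inputs.

```python
import jax.numpy as jnp

# Toy stand-ins for the tensors inside volumetric_rendering; in hypernerf
# z_vals has shape [batch, num_samples], so we fake a (2, 64) batch here.
z_vals = jnp.tile(jnp.linspace(2.0, 6.0, 64), (2, 1))
last_sample_z = 1e10  # scalar "far" distance appended after the last sample

# Original pattern: a plain Python list is passed to jnp.broadcast_to.
# Depending on the JAX version, this is exactly the "non-array inputs"
# gotcha linked above and can raise an error:
# last = jnp.broadcast_to([last_sample_z], z_vals[..., :1].shape)

# Workaround: wrap the scalar in a jnp array first, then broadcast.
last = jnp.broadcast_to(jnp.array([last_sample_z]), z_vals[..., :1].shape)
print(last.shape)  # (2, 1), matching z_vals[..., :1]
```

The same idea applies anywhere a Python list or scalar is handed directly to a jnp function that expects an array.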