Hello,

I have created a custom dataset using the `DynamicGraphTemporalSignal` class, with edge indices and edge weights that vary between snapshots.
My goal is to perform binary classification on nodes. I constructed several models, trying out different PyTorch Geometric Temporal (PGT) layers such as GConvGRU, GConvLSTM, and TGCN. My problem is that these layers give me outputs in which every element is NaN.
Please note that my edge weights are positive, ranging from 1 up to several million. I tried log-transforming the edge weights so the values wouldn't be too large, but I still get NaNs at the very first forward pass. I also tried setting the edge weights to None, or to random tensors, and still get NaN outputs.
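For reference, the log transformation of the edge weights can be sketched like this (a minimal illustration with made-up weight values, using `torch.log1p`):

```python
import torch

# Made-up edge weights spanning 1 to the millions; raw magnitudes this
# large can easily destabilize a spectral graph convolution.
edge_weight = torch.tensor([1.0, 250.0, 40_000.0, 3_000_000.0])

# log1p(w) = log(1 + w): keeps the weights positive and finite while
# compressing the dynamic range to roughly 0..15.
scaled = torch.log1p(edge_weight)

print(scaled)
```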
Below I have provided one of my model architectures; the first layer (recurrent) is the one outputting NaNs.

I would really appreciate any comments or help with this issue!