When I run PPLM with both bow and discrim enabled ('technology' and 'sentiment', respectively), new_accumulated_hidden.shape[1] is 768 but the emb_size in the MLP is 1024. The dimensions are inconsistent in the matmul inside pplm_classification_head, so I am getting
RuntimeError: size mismatch, m1: [1 x 768], m2: [1024 x 5] when computing the loss for the perturbed text.
Please correct me if I have missed something; thank you very much for your help.
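The mismatch can be reproduced in isolation: a hidden state of width 768 (the embedding size of GPT-2 small) is multiplied against a classification-head weight matrix shaped for a 1024-dim model. A minimal sketch of just the shape conflict, with NumPy standing in for torch (the `1024 x 5` weight shape is taken from the traceback; the variable names mirror the ones above but are otherwise illustrative):

```python
import numpy as np

# Hidden state accumulated from the generator: GPT-2 with embed size 768.
new_accumulated_hidden = np.zeros((1, 768))

# Classification-head weights trained against a 1024-dim model
# (5 output classes, per "m2: [1024 x 5]" in the traceback).
mlp_weight = np.zeros((1024, 5))

try:
    logits = new_accumulated_hidden @ mlp_weight
except ValueError as e:
    # Inner dimensions 768 vs 1024 do not agree, so matmul fails,
    # matching the RuntimeError raised by torch in the issue.
    print("shape mismatch:", e)
```

The fix is to make the two sides agree: either the discriminator head must be trained against the same base model (same embedding size) that generates the hidden states, or the generator must be the 1024-dim model the head was trained on.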