[Pytorch] Fix StringSupplier and TensorIdGetter returning pointers to local variables #1391
Conversation
Maybe we'd better take and return
@saudet Any comment, or could you merge?
Looks OK I guess, but could you please fix these things in larger pull requests before merging them, instead of opening a lot of mini pull requests like this?
That's the inverse of what you prefer for the javacpp repo. Do you want me to append the commit of PR #1392 to this PR?
No, I mean why did you not add that in pull #1360? If you just want to improve things further, yes, just start another mega pull request and modify everything you want in there, and I guess we won't merge it for a few months until you stop "fixing" things? I mean, if you want to add new features and such, let's make multiple pull requests; if you just want to keep fixing things, fix them before merging the relevant pull request.
Well, you did merge PR #1360. And the problems solved by this PR and PR #1392 were found afterward.
No, I just mean try not to merge buggy code that you might want to fix later. If you don't mind that the code is buggy, I guess that's fine, but if you intend to fix it later, instead of opening many mini pull requests to fix everything afterward, try not to merge buggy code in the first place, that's all.
OK, but that was not the case. I didn't push a PR containing problems I was aware of and wanted to fix later. Did I?
I'm not saying you're doing it intentionally, I'm just saying you're creating a lot of unnecessary commits, so if you could stop doing that it would be nicer.
I want to know how to make pytorch-javacpp train distributed across more than one machine (data parallel?), and whether we need to bring OpenMPI/MPI to javacpp. Thanks @HGuillemet @saudet
The JNI code of function pointers returning a String ends with a cast. Without this `@Cast`, the function returns a `std::basic_string<char>&`, and the cast from `StringAdapter` returns a reference to a variable on the stack. The `@Cast` makes the function return a `char *`, and the `StringAdapter` casting operator now allocates a new string on the heap. Is there something more definitive to change in Generator about this?