(do not merge!) attempt to debug fastldf benchmarks #1144
Conversation
Ah, thwarted by the Julia registry. I think that's my signal to not do work for today.
Benchmark Report (collapsed sections: Computer Information, Benchmark Results)
force-pushed the branch from a348802 to 1f53fe4
I mean, those results look way more sensible, so I'm not sure what I did wrong on the other PRs.
force-pushed the branch from d634729 to 10d647e
Benchmark results (two tabs: Chairmarks, BenchmarkTools)
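For context, the two tabs come from timing the same calls with both benchmarking packages. A minimal sketch of what such a side-by-side measurement looks like (illustrative only, not this PR's actual harness; `f` and `x` are made-up stand-ins for a model evaluation):

```julia
using Chairmarks, BenchmarkTools

f(x) = sum(abs2, x)   # toy workload standing in for a model evaluation
x = rand(1000)

# Chairmarks: the `@b setup benchmark` pipeline form keeps the setup
# (here just reading `x`) out of the timed call
@b x f

# BenchmarkTools: interpolate globals with $ for the same reason
@benchmark f($x)
```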
force-pushed the branch from 10d647e to 01634c5
But looking at the numbers, it seems that benchmarking on CI is just a lottery to begin with, no matter which package you use. This is especially so given that on this PR, the same model with the same link status should have the same evaluation time in every configuration: fastLDF discards the VarInfo, so it makes no difference which VarInfo it was constructed with. Yet both packages show odd run-to-run variation in evaluation times that doesn't follow any particular pattern.
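To make that claim concrete, here is a minimal sketch of the check implied above. `FastLDF` is a placeholder name for this PR's fast log-density type, and the `untyped_varinfo`/`typed_varinfo` helpers are assumed to be available in DynamicPPL; if the VarInfo really is discarded at construction, the two benchmarks below should be indistinguishable:

```julia
using DynamicPPL, Distributions, LogDensityProblems, BenchmarkTools

@model function demo()
    x ~ Normal()
    y ~ Normal(x)
end
model = demo()

# Build the log-density function from two different VarInfo flavours.
ldf_untyped = FastLDF(model, DynamicPPL.untyped_varinfo(model))
ldf_typed   = FastLDF(model, DynamicPPL.typed_varinfo(model))

θ = randn(2)  # one value per model parameter (x, y)

# If the VarInfo is dropped after construction, these should match.
@benchmark LogDensityProblems.logdensity($ldf_untyped, $θ)
@benchmark LogDensityProblems.logdensity($ldf_typed, $θ)
```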
force-pushed the branch from 252099b to 6ec2bbd
force-pushed the branch from c56ceda to 76e5f9f
NGL, the fact that I have to wait four minutes for the whole thing to precompile again, just so that it can read in JSON files and pretty-print them, is not exactly the best advertisement for Julia.
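For scale, the task that triggers all that precompilation is roughly this (a sketch using JSON.jl; the actual report script and file name are different):

```julia
using JSON  # parsing + pretty-printing is the whole job

results = JSON.parsefile("benchmark_results.json")  # hypothetical file name
JSON.print(stdout, results, 2)  # pretty-print with 2-space indentation
```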
force-pushed the branch from 76e5f9f to 14997d5
No description provided.