feat: implement_ansi_eval_mode_arithmetic #2136
Conversation
Added test cases for the addition, subtraction, and multiplication use cases. Given that Spark inherently converts decimal operands to doubles (and to decimals in the case of integral division in Comet), I have yet to make the code changes needed to test those use cases.
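As a rough sketch of what the new integer-type test cases assert (hypothetical code, not the actual Comet tests), ANSI-mode add, subtract, and multiply either produce a value or signal overflow instead of silently wrapping, which maps naturally onto Rust's checked arithmetic:

```rust
// Hypothetical sketch: ANSI-style integer ops that error on overflow
// rather than wrapping around.
fn ansi_add(a: i32, b: i32) -> Result<i32, String> {
    a.checked_add(b).ok_or_else(|| "ARITHMETIC_OVERFLOW".to_string())
}

fn ansi_sub(a: i32, b: i32) -> Result<i32, String> {
    a.checked_sub(b).ok_or_else(|| "ARITHMETIC_OVERFLOW".to_string())
}

fn ansi_mul(a: i32, b: i32) -> Result<i32, String> {
    a.checked_mul(b).ok_or_else(|| "ARITHMETIC_OVERFLOW".to_string())
}

fn main() {
    assert_eq!(ansi_add(1, 2), Ok(3));
    assert!(ansi_add(i32::MAX, 1).is_err());
    assert!(ansi_sub(i32::MIN, 1).is_err());
    assert!(ansi_mul(i32::MAX, 2).is_err());
    println!("ok");
}
```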
Codecov Report
❌ Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #2136 +/- ##
============================================
+ Coverage 56.12% 57.65% +1.52%
- Complexity 976 1297 +321
============================================
Files 119 147 +28
Lines 11743 13487 +1744
Branches 2251 2386 +135
============================================
+ Hits 6591 7776 +1185
- Misses 4012 4446 +434
- Partials 1140 1265 +125
@andygrove, @kazuyukitanimura
I am inclined towards #1 given the effort-to-reward ratio.
@coderfender I agree. Do you plan to change
After some debugging, early exception raising (from the DataFusion custom ANSI kernel) seems to be the issue. Below is the exact test that causes a failure in Spark 4.0:
@kazuyukitanimura, @andygrove I updated the exception / message to exactly match Spark. However, it seems the issue is with
Would love to know your thoughts, and to catch up in case I am missing something. Code to reproduce the above error:
Investigating the new test failures.
@mbutrovich I seem to recall that you had a solution for this kind of test failure.
@parthchandra, thank you for the review. Failing test logs: https://github.com/apache/datafusion-comet/actions/runs/17112443091/job/48536768203?pr=2136
Force-pushed from 922d7fc to 1022595
I was able to comb through the failed tests and fixed the match statement in the failing int-check case.
Summary of changes
Native side:
Scala side:
cc: @andygrove, @kazuyukitanimura
To clarify this a bit more, Spark performs lazy evaluation on the inputs to |
I filed #2231 to track this on the Comet side.
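The lazy-evaluation point above can be illustrated with a small, hypothetical Rust sketch (not Comet code): Spark only evaluates a later Coalesce branch when the earlier ones are null, while an eager kernel forces every input up front, so an error in an unused branch still aborts the query.

```rust
// Hypothetical sketch contrasting eager and lazy Coalesce evaluation.
fn checked_div(a: i32, b: i32) -> Result<i32, String> {
    if b == 0 {
        Err("DIVIDE_BY_ZERO".to_string())
    } else {
        Ok(a / b)
    }
}

// Eager: all inputs are forced before coalescing, so any Err wins,
// even if that branch's value would never be needed.
fn coalesce_eager(inputs: Vec<Result<Option<i32>, String>>) -> Result<Option<i32>, String> {
    let forced: Vec<Option<i32>> = inputs.into_iter().collect::<Result<_, _>>()?;
    Ok(forced.into_iter().flatten().next())
}

// Lazy (Spark-like): later branches run only if earlier ones are null.
fn coalesce_lazy(
    inputs: Vec<Box<dyn Fn() -> Result<Option<i32>, String>>>,
) -> Result<Option<i32>, String> {
    for f in inputs {
        if let Some(v) = f()? {
            return Ok(Some(v));
        }
    }
    Ok(None)
}

fn main() {
    // coalesce(1, 1/0): eager evaluation fails, lazy evaluation returns 1.
    let eager = coalesce_eager(vec![Ok(Some(1)), checked_div(1, 0).map(Some)]);
    assert!(eager.is_err());

    let lazy = coalesce_lazy(vec![
        Box::new(|| Ok(Some(1))),
        Box::new(|| checked_div(1, 0).map(Some)),
    ]);
    assert_eq!(lazy, Ok(Some(1)));
    println!("eager errs, lazy returns 1");
}
```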
Force-pushed from d3ee468 to a740ee3
Rebased onto the main branch after the lazy-evaluation (Coalesce) changes were merged.
Force-pushed from 4d825d8 to 58a1bee
Resolved issues with failing tests caused by incorrect diff file generation.
Force-pushed from da41f9d to b64eb6d
Which issue does this PR close?
Support ANSI mode for arithmetic operations in Spark #2137
Closes #2137.
Rationale for this change
Introduce ANSI support for arithmetic operations (for integer/float types).
Now that try eval mode is supported in Comet, this PR takes it a step further to support ANSI mode.
In order to achieve that, the following changes are made:
- QueryPlanSerde is updated to no longer fall back to Spark when the eval mode is ANSI (for arithmetic operations only)
- planner.rs now respects the eval mode when creating a binary arithmetic function
- checked_arithmetic.rs now raises a Spark-equivalent exception on overflow / division by zero when ANSI mode is enabled

What changes are included in this PR?
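As a rough sketch of the kind of check described above (hypothetical code, not the actual checked_arithmetic.rs, and with paraphrased rather than exact Spark error text): an eval-mode-aware divide kernel raises on division by zero or overflow in ANSI mode, and yields NULL (modeled here as None) otherwise.

```rust
#[derive(Debug, PartialEq)]
enum EvalMode {
    Legacy,
    Ansi,
    Try,
}

// Hypothetical ANSI-aware divide: in ANSI mode, division by zero or
// overflow raises a Spark-style error; other modes return NULL in this sketch.
fn div(a: i64, b: i64, mode: &EvalMode) -> Result<Option<i64>, String> {
    if b == 0 {
        return match mode {
            EvalMode::Ansi => Err("[DIVIDE_BY_ZERO] Division by zero.".to_string()),
            _ => Ok(None),
        };
    }
    match a.checked_div(b) {
        Some(v) => Ok(Some(v)),
        // i64::MIN / -1 overflows a 64-bit result.
        None => match mode {
            EvalMode::Ansi => Err("[ARITHMETIC_OVERFLOW] long overflow.".to_string()),
            _ => Ok(None),
        },
    }
}

fn main() {
    assert_eq!(div(10, 2, &EvalMode::Ansi), Ok(Some(5)));
    assert!(div(1, 0, &EvalMode::Ansi).is_err());
    assert_eq!(div(1, 0, &EvalMode::Legacy), Ok(None));
    assert!(div(i64::MIN, -1, &EvalMode::Ansi).is_err());
    println!("ok");
}
```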
How are these changes tested?