
Integer literal in imagine block inferred as decimal type #28

Open
adam-c-anderson opened this issue Oct 19, 2017 · 2 comments

adam-c-anderson commented Oct 19, 2017

This vendor-specific code

    vendor tsql {
        select *, checksum(*) as csum from person
    } imagine {
        select *, 0 as csum from person
    }

infers the type of the `csum` column as `decimal`. Changing the column expression to `cast(0 as int)` changes the inferred type to `int`.

From the documentation, I expected digits with no decimal point to be interpreted as an integer literal.
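For reference, the explicit cast mentioned above looks like this in the same block (a minimal sketch assuming the same `person` schema):

```sql
vendor tsql {
    select *, checksum(*) as csum from person
} imagine {
    -- cast forces the inferred type of csum to int instead of decimal
    select *, cast(0 as int) as csum from person
}
```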

rspeele (Collaborator) commented Oct 20, 2017

This is working as designed, but I can see how it would be good for the design to change.

All numeric literals, regardless of format (even hex), are inferred as `<numeric>`.
This placeholder within the type hierarchy allows them to be unified with any other numeric type. The way it currently works, if we inferred `0` as `<integral>`, then it wouldn't unify with any types under the `<fractional>` branch of the hierarchy, so e.g. `select SomeFloatingPointColumn + 1` wouldn't work.
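The unification behavior described here can be illustrated with a toy model. This is a hypothetical Python sketch, not Rezoom.SQL's actual implementation; the hierarchy contents and type names are assumptions for illustration:

```python
# Toy model: placeholders map to the concrete types they may collapse into.
HIERARCHY = {
    "<numeric>":    {"int32", "int64", "float64", "decimal"},
    "<integral>":   {"int32", "int64"},
    "<fractional>": {"float64", "decimal"},
}

def unify(a, b):
    """Unify two types: a placeholder unifies with any type in its branch."""
    if a == b:
        return a
    for placeholder, other in ((a, b), (b, a)):
        if placeholder in HIERARCHY and other in HIERARCHY[placeholder]:
            return other  # the placeholder collapses to the concrete type
    raise TypeError(f"cannot unify {a} with {b}")

# SomeFloatingPointColumn + 1 typechecks because <numeric> unifies with float64:
print(unify("<numeric>", "float64"))   # float64
# ...whereas an <integral> literal would be rejected:
# unify("<integral>", "float64")       # raises TypeError
```

This shows why keeping literals at `<numeric>` is the permissive choice: narrowing them to `<integral>` up front would rule out unification with the `<fractional>` branch.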

When a type comes out of inference as one of those placeholders like `<numeric>` or `<fractional>`, instead of a concrete type like `int64`, we just default to the most general type. In the case of `<numeric>`, `decimal` is considered to be the most general.

Maybe we could track a little extra information in the type so that literals like `0` can still be treated as "any ol' number" for unification purposes, but if they get through all of typechecking without further info, they could get a default type assigned based on the characteristics of the literal (fits in `int32`? has a decimal point? hex?).
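The proposed defaulting might look roughly like this. Again a hypothetical Python sketch, not Rezoom.SQL code; the cutoffs and result type names are assumptions:

```python
def default_type(literal: str) -> str:
    """Pick a concrete default for a literal that is still a placeholder
    after typechecking, based on how the literal was written."""
    text = literal.lower()
    if text.startswith("0x"):
        return "int64"      # hex literals default to an integer type
    if "." in text or "e" in text:
        return "decimal"    # has a decimal point or exponent
    value = int(text)
    if -2**31 <= value < 2**31:
        return "int32"      # fits in int32
    return "int64"

print(default_type("0"))      # int32
print(default_type("3.14"))   # decimal
print(default_type("0xff"))   # int64
```

Under this scheme the `0` in the original issue would default to `int32` rather than `decimal`, while still unifying freely during inference.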

adam-c-anderson (Author) commented

If it's working as designed, then maybe all that's needed is to clarify the linked documentation, adding a bullet point to the bottom of the page indicating that all numeric literals are inferred as `decimal` unless explicitly cast.
