Is it possible to support the between operator for SparkDateTime UDT #4

@hbutani

Description

A predicate like 'dateTime(o_orderdate) between dateTime("1995-01-01") and dateTime("1996-12-31")' gives the following error:

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 18.0 failed 1 times, most recent failure: Lost task 0.0 in stage 18.0 (TID 1961, localhost): java.lang.RuntimeException: Type org.sparklinedata.spark.dateTime.SparkDateTimeUDT@52d6ab8c does not support ordered operations
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.catalyst.expressions.GreaterThanOrEqual.ordering$lzycompute(predicates.scala:309)

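Until the UDT supports ordered operations directly, one possible workaround (a sketch under stated assumptions, not a confirmed fix) is to push the comparison down to a native ordered type such as epoch milliseconds, so Catalyst can resolve an ordering for GreaterThanOrEqual/LessThanOrEqual instead of failing on the UDT. The helper name `millis` below is an assumption; substitute whatever millisecond-extraction function the library actually registers.

```scala
// Sketch only: assumes a spark-shell style session where `sqlContext` is in scope
// and the Sparkline date-time functions have already been registered.
// `millis(...)` is a hypothetical helper that returns the epoch milliseconds
// (a LongType) of a SparkDateTime value; comparing Longs avoids asking Catalyst
// for an ordering on the UDT, which is what raises the RuntimeException above.
val between = sqlContext.sql("""
  select o_orderkey, o_orderdate
  from orders
  where millis(dateTime(o_orderdate)) between millis(dateTime("1995-01-01"))
                                          and millis(dateTime("1996-12-31"))
""")
between.show()
```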