ClickHouseDialect, wrong definition of timestamp type #463

@EbiousVi

Description

Environment

  • OS version: macOS 14
  • JDK version: 17
  • ClickHouse Server version: 23.10.6.60
  • Spark version: 3.5.0
  • Project dependencies
    • com.clickhouse:clickhouse-jdbc:jar:0.6.3:compile
    • com.github.housepower:clickhouse-integration-spark_2.12:jar:2.7.1:compile
    • com.github.housepower:clickhouse-spark-runtime-3.4_2.12:jar:0.7.3:compile

Steps to reproduce

  1. Create a table with columns of type DateTime('Europe/Moscow'):
    CREATE TABLE IF NOT EXISTS default.foo
    (
        application_id         String,
        event_datetime         DateTime('Europe/Moscow'),
        event_receive_datetime DateTime('Europe/Moscow')
    )
    ENGINE = Log

  2. Try to read the table through the Spark DataSource V2 API, for example as sketched below.
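
    A minimal sketch of what such a read could look like, assuming the
    connector's catalog is registered as "clickhouse" via
    xenon.clickhouse.ClickHouseCatalog with default HTTP settings (host,
    port, user, and password here are placeholders; the attached
    java code.txt is the authoritative reproduction):

    import org.apache.spark.sql.SparkSession;

    public class ReadFooRepro {
        public static void main(String[] args) {
            // Register the ClickHouse catalog for the DataSource V2 API.
            // The catalog class name and option keys are assumed from the
            // connector documentation and may differ in the actual setup.
            SparkSession spark = SparkSession.builder()
                    .master("local[*]")
                    .appName("clickhouse-datetime-repro")
                    .config("spark.sql.catalog.clickhouse", "xenon.clickhouse.ClickHouseCatalog")
                    .config("spark.sql.catalog.clickhouse.host", "127.0.0.1")
                    .config("spark.sql.catalog.clickhouse.protocol", "http")
                    .config("spark.sql.catalog.clickhouse.http_port", "8123")
                    .config("spark.sql.catalog.clickhouse.user", "default")
                    .config("spark.sql.catalog.clickhouse.password", "")
                    .config("spark.sql.catalog.clickhouse.database", "default")
                    .getOrCreate();

            // Reading default.foo through the catalog fails once the table
            // contains DateTime('Europe/Moscow') columns.
            spark.table("clickhouse.default.foo").show();
        }
    }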

When this code is executed, Spark throws a TABLE OR VIEW NOT FOUND exception. While debugging, I realized that the case condition is not satisfied. Link to the source code where it happens.

The type does appear to match the regular expression, though, so I don't understand why that branch is not taken. I'm not an expert in Scala; please help me understand why this happens. Sample code is attached.
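
To illustrate the suspicion (this is only a hypothetical sketch, not the connector's actual source), a type-mapping pattern that expects only a bare DateTime or DateTime64(p) would reject the zoned form, while a pattern that also allows an optional timezone argument accepts it:

    import java.util.regex.Pattern;

    public class DateTimeTypePatternDemo {
        public static void main(String[] args) {
            String clickHouseType = "DateTime('Europe/Moscow')";

            // Hypothetical pattern that only accepts DateTime or DateTime64(p):
            Pattern bare = Pattern.compile("^DateTime(64)?(\\(\\d+\\))?$");
            System.out.println(bare.matcher(clickHouseType).matches());  // false -> type falls through the match

            // Hypothetical pattern that also allows a quoted timezone argument:
            Pattern zoned = Pattern.compile("^DateTime(64)?(\\((\\d+\\s*,\\s*)?'[^']*'\\))?$");
            System.out.println(zoned.matcher(clickHouseType).matches()); // true
        }
    }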

java code.txt
pom.txt
