Update quickbooks with realistic data type error #95
New file (+10 lines): the migration script.

```bash
#!/bin/bash

# Update primary schema in dbt_project.yml file
yq -i '.vars.quickbooks_schema = "public"' dbt_project.yml

# Copy Snowflake-specific solution models that handle epoch-to-timestamp conversion
MIGRATION_DIR="$(dirname "$(readlink -f "${BASH_SOURCE[0]}")")"
cp "$MIGRATION_DIR"/solutions/stg_quickbooks__refund_receipt.sql solutions/
cp "$MIGRATION_DIR"/solutions/stg_quickbooks__sales_receipt.sql solutions/
cp "$MIGRATION_DIR"/solutions/stg_quickbooks__estimate.sql solutions/
```
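For reference, the `yq -i` invocation above edits `dbt_project.yml` in place, setting (or overwriting) a single var and leaving the rest of the file untouched. A sketch of the resulting excerpt — the surrounding keys are illustrative, not taken from the actual project file:

```yaml
# dbt_project.yml (excerpt, illustrative) after the yq edit
vars:
  quickbooks_schema: "public"   # written by: yq -i '.vars.quickbooks_schema = "public"'
```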
New file (+61 lines): the `stg_quickbooks__estimate` solution model.

```sql
--To disable this model, set the using_estimate variable within your dbt_project.yml file to False.
{{ config(enabled=var('using_estimate', True)) }}

with base as (

    select *
    from {{ ref('stg_quickbooks__estimate_tmp') }}

),

fields as (

    select
        /*
        The below macro is used to generate the correct SQL for package staging models. It takes a list of columns
        that are expected/needed (staging_columns from dbt_quickbooks_source/models/tmp/) and compares it with columns
        in the source (source_columns from dbt_quickbooks_source/macros/).
        For more information refer to our dbt_fivetran_utils documentation (https://github.com/fivetran/dbt_fivetran_utils.git).
        */
        {{
            fivetran_utils.fill_staging_columns(
                source_columns=adapter.get_columns_in_relation(ref('stg_quickbooks__estimate_tmp')),
                staging_columns=quickbooks_source.get_estimate_columns()
            )
        }}

        {{
            fivetran_utils.source_relation(
                union_schema_variable='quickbooks_union_schemas',
                union_database_variable='quickbooks_union_databases'
            )
        }}

    from base
),

final as (

    select
        cast(id as {{ dbt.type_string() }}) as estimate_id,
        cast(class_id as {{ dbt.type_string() }}) as class_id,
        created_at,
        currency_id,
        customer_id,
        cast(department_id as {{ dbt.type_string() }}) as department_id,
        -- Convert unix epoch to timestamp, then truncate to date
        cast( {{ dbt.date_trunc('day', 'TO_TIMESTAMP_NTZ(due_date)') }} as date) as due_date,
        exchange_rate,
        total_amount,
        cast( {{ dbt.date_trunc('day', 'transaction_date') }} as date) as transaction_date,
        transaction_status,
        _fivetran_deleted,
        source_relation
    from fields
)

select *
from final
where not coalesce(_fivetran_deleted, false)
```
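The epoch-to-date conversion these Snowflake models perform with `TO_TIMESTAMP_NTZ` can be sanity-checked from the command line. A minimal sketch, assuming the source column stores epoch *seconds* and GNU `date` is available:

```shell
#!/bin/bash
# Emulate TO_TIMESTAMP_NTZ(due_date) followed by date_trunc('day', ...):
# interpret the raw value as epoch seconds, then keep only the UTC date.
epoch=1700000000
date -u -d "@${epoch}" +%Y-%m-%d   # prints 2023-11-14
```

Note that Snowflake infers the epoch scale (seconds vs. milliseconds) from the magnitude of the value, so the assumption of seconds should be checked against the actual connector output.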
New file (+62 lines): the `stg_quickbooks__refund_receipt` solution model.

```sql
--To disable this model, set the using_refund_receipt variable within your dbt_project.yml file to False.
{{ config(enabled=var('using_refund_receipt', True)) }}

with base as (

    select *
    from {{ ref('stg_quickbooks__refund_receipt_tmp') }}

),

fields as (

    select
        /*
        The below macro is used to generate the correct SQL for package staging models. It takes a list of columns
        that are expected/needed (staging_columns from dbt_quickbooks_source/models/tmp/) and compares it with columns
        in the source (source_columns from dbt_quickbooks_source/macros/).
        For more information refer to our dbt_fivetran_utils documentation (https://github.com/fivetran/dbt_fivetran_utils.git).
        */
        {{
            fivetran_utils.fill_staging_columns(
                source_columns=adapter.get_columns_in_relation(ref('stg_quickbooks__refund_receipt_tmp')),
                staging_columns=quickbooks_source.get_refund_receipt_columns()
            )
        }}

        {{
            fivetran_utils.source_relation(
                union_schema_variable='quickbooks_union_schemas',
                union_database_variable='quickbooks_union_databases'
            )
        }}

    from base
),

final as (

    select
        cast(id as {{ dbt.type_string() }}) as refund_id,
        balance,
        cast(doc_number as {{ dbt.type_string() }}) as doc_number,
        total_amount,
        cast(class_id as {{ dbt.type_string() }}) as class_id,
        cast(deposit_to_account_id as {{ dbt.type_string() }}) as deposit_to_account_id,
        created_at,
        cast(department_id as {{ dbt.type_string() }}) as department_id,
        cast(customer_id as {{ dbt.type_string() }}) as customer_id,
        currency_id,
        exchange_rate,
        -- Convert unix epoch to timestamp, then truncate to date
        cast( {{ dbt.date_trunc('day', 'TO_TIMESTAMP_NTZ(transaction_date)') }} as date) as transaction_date,
        _fivetran_deleted,
        source_relation
    from fields
)

select *
from final
where not coalesce(_fivetran_deleted, false)
```
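The final `where not coalesce(_fivetran_deleted, false)` filter keeps rows whose deletion flag is false *or* NULL, and drops only rows explicitly marked deleted. A small shell sketch of that semantics, with the literal string `NULL` standing in for a SQL NULL:

```shell
#!/bin/bash
# Rows: id + _fivetran_deleted flag. Only flag == true is filtered out;
# NULL coalesces to false, so those rows survive alongside false.
printf 'r1 true\nr2 false\nr3 NULL\n' |
  awk '$2 != "true" { print $1 }'   # prints r2 and r3
```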
New file (+60 lines): the `stg_quickbooks__sales_receipt` solution model.

```sql
{{ config(enabled=var('using_sales_receipt', True)) }}

with base as (

    select *
    from {{ ref('stg_quickbooks__sales_receipt_tmp') }}

),

fields as (

    select
        /*
        The below macro is used to generate the correct SQL for package staging models. It takes a list of columns
        that are expected/needed (staging_columns from dbt_quickbooks_source/models/tmp/) and compares it with columns
        in the source (source_columns from dbt_quickbooks_source/macros/).
        For more information refer to our dbt_fivetran_utils documentation (https://github.com/fivetran/dbt_fivetran_utils.git).
        */
        {{
            fivetran_utils.fill_staging_columns(
                source_columns=adapter.get_columns_in_relation(ref('stg_quickbooks__sales_receipt_tmp')),
                staging_columns=quickbooks_source.get_sales_receipt_columns()
            )
        }}

        {{
            fivetran_utils.source_relation(
                union_schema_variable='quickbooks_union_schemas',
                union_database_variable='quickbooks_union_databases'
            )
        }}
    from base
),

final as (

    select
        cast(id as {{ dbt.type_string() }}) as sales_receipt_id,
        balance,
        cast(doc_number as {{ dbt.type_string() }}) as doc_number,
        total_amount,
        cast(deposit_to_account_id as {{ dbt.type_string() }}) as deposit_to_account_id,
        created_at,
        cast(customer_id as {{ dbt.type_string() }}) as customer_id,
        cast(department_id as {{ dbt.type_string() }}) as department_id,
        cast(class_id as {{ dbt.type_string() }}) as class_id,
        currency_id,
        exchange_rate,
        -- Convert unix epoch to timestamp, then truncate to date
        cast( {{ dbt.date_trunc('day', 'TO_TIMESTAMP_NTZ(transaction_date)') }} as date) as transaction_date,
        _fivetran_deleted,
        source_relation
    from fields
)

select *
from final
where not coalesce(_fivetran_deleted, false)
```
Collaborator:
As a broader note on these migrations, if the needed change is fairly small (e.g., replacing one function with another), I try to do it inline in the script, because it's easier to see what's changing. Fully replacing a file can be hard to follow, since you don't know what's different between the two versions. That's a judgement call, though. How much do we need to update here?

Author:
These files aren't in the original project, because they're initially installed from the package hub. I guess they could be copied from inside the Docker container's project's packages directory and then have the one-line change applied, but that feels kinda janky too.
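The inline approach suggested here could look something like the sketch below. The file contents and paths are hypothetical stand-ins; only the `sed` substitution reflects the actual one-expression difference between the stock model and the Snowflake solution:

```shell
#!/bin/bash
# Hypothetical inline edit: instead of copying a full replacement model,
# patch the one expression that differs, so the change is visible in the script.
mkdir -p solutions
# Stand-in for the relevant line of the stock model (hypothetical content):
echo "cast( {{ dbt.date_trunc('day', 'transaction_date') }} as date) as transaction_date," \
  > solutions/stg_quickbooks__sales_receipt.sql
# Swap in the epoch-aware Snowflake expression:
sed -i "s/'transaction_date'/'TO_TIMESTAMP_NTZ(transaction_date)'/" \
  solutions/stg_quickbooks__sales_receipt.sql
cat solutions/stg_quickbooks__sales_receipt.sql
```

As the author notes, this still requires the stock file to exist locally first, which is why the copy-a-solution-file approach was used instead.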
Collaborator:
Technically this isn't necessary, because the duckdb dockerfile isn't used for any of the migrations you're working on here, but it's probably fine to leave it in to stay consistent with the others.

Author:
The solution includes modifying dbt_project.yml to disable the overridden models, which is why this is now needed.