[Feature] Support for new microbatching incremental strategy #715
Comments
We are planning to ship this out, so I'm going to close out this issue for now.
@amychen1776 the reason this issue was closed is not clear. Even if dbt Labs is planning to ship such a feature, a public open issue is still useful for tracking the work planned around the adapter. This holds for any open-source codebase: issues are a way to track planned work, or simply to understand which features the maintainers consider relevant to prioritize.
Looks like this issue is tracked here: dbt-labs/dbt-core#10624
@nicor88 that's correct. I closed this because this work is being done and tracked in the other issue (and I want to limit duplication).
@amychen1776 could we aim to cross-reference issues next time? This microbatch issue is linked in a few conversations with the community, and it's good to keep track of what is going on.
Is this your first time submitting a feature request?
Describe the feature
Previously, for large time-series models we needed a custom insert-by-period macro. In this post dbt announced support for the "microbatch" strategy coming in dbt-core 1.9. This issue is a placeholder to start preparing for the dbt 1.9 release with this new incremental strategy. Work can be tracked in this epic: dbt-labs/dbt-core#10624
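For context, a minimal sketch of what a model using this strategy might look like once it lands, based on the configuration keys described for the dbt-core 1.9 microbatch strategy (`event_time`, `batch_size`, `begin`); the column `event_ts`, the upstream model `stg_events`, and the `begin` date are illustrative placeholders, not values from this issue:

```sql
-- models/events_daily.sql
-- Sketch only: assumes the microbatch config keys proposed for dbt-core 1.9.
-- 'event_ts', 'stg_events', and the begin date are hypothetical placeholders.
{{
    config(
        materialized='incremental',
        incremental_strategy='microbatch',
        event_time='event_ts',
        batch_size='day',
        begin='2020-01-01'
    )
}}

select
    event_id,
    event_ts,
    payload
from {{ ref('stg_events') }}
```

With a configuration like this, dbt would split the initial build and each incremental run into one insert per day, filtered on `event_ts`, rather than one large query, which is exactly the Athena timeout scenario described under "Who will this benefit?" below.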
Describe alternatives you've considered
No response
Who will this benefit?
Datasets that are too large to build in a single initial full load without batching (the Athena query times out).
Are you interested in contributing this feature?
No response
Anything else?
Closes #190 and probably #697