Do you have plans to support streaming in the near future? Interested in the readStream use case: spark.readStream.format("bigquery") #259
Comments
Streaming is on our roadmap; can you please elaborate more on your use case? Please feel free to contact us directly.
Hi, we have data flowing directly into BigQuery (via fluentd) in real time. Note: the reads will be from a view.
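For reference, reading from a view with the current connector requires enabling view support and a dataset where the connector can materialize the view's results. A minimal sketch, where the project, dataset, view, and materialization-dataset names are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bq-view-read").getOrCreate()

# Reading a BigQuery view needs viewsEnabled plus a dataset the connector can
# use to materialize the view's results; all names here are placeholders.
df = (
    spark.read.format("bigquery")
    .option("viewsEnabled", "true")
    .option("materializationDataset", "my_temp_dataset")
    .load("my_project.my_dataset.my_view")
)
df.printSchema()
```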
@davidrabinowitz any thoughts? Is it possible to use a timestamp or an offset?
@nmusku Yes, for the time being you can implement it with a query like you've suggested. BTW, you can also merge it.
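A minimal sketch of that query-based approach with the current batch connector, assuming a `create_time` timestamp column and a watermark you track yourself (the table name, column name, and watermark value are illustrative):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Illustrative only: table, column, and watermark value are assumptions.
last_watermark = "2021-01-01 00:00:00"  # e.g. loaded from your own state store

incremental_df = (
    spark.read.format("bigquery")
    .option("filter", f"create_time > '{last_watermark}'")  # pushed down to BigQuery
    .load("my_project.my_dataset.events")
)
```

After processing the batch, you would persist its maximum `create_time` as the next watermark.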
OK, one more question: are the events in BigQuery ordered?
Is there any news on this? Now with GA4 it would be cool to have streaming integration in Spark.
Hi @davidrabinowitz, I am also interested in a readStream feature. We have an ETL pipeline that extracts campaign data from BigQuery and loads it into our Delta Lake. The struggle we face is doing incremental ETL without loading duplicated data into our Delta Lake. With readStream and checkpointing, hopefully this would be solved. Could you maybe share more information on the timeline for the readStream feature?
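Until a readStream source exists, one way to avoid duplicates is to merge each incremental batch into the Delta table instead of appending. A hedged sketch, assuming a `campaign_id` key, an `updated_at` timestamp column, and placeholder table names and paths:

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# All names and values below are placeholders for illustration.
last_watermark = "2021-01-01 00:00:00"

# Pull only rows newer than the last successful load.
new_rows = (
    spark.read.format("bigquery")
    .option("filter", f"updated_at > '{last_watermark}'")
    .load("my_project.my_dataset.campaigns")
)

# Merge into the Delta table so re-read rows update in place instead of duplicating.
target = DeltaTable.forPath(spark, "/mnt/deltalake/campaigns")
(
    target.alias("t")
    .merge(new_rows.alias("s"), "t.campaign_id = s.campaign_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```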
We are also interested in this use case.
@davidrabinowitz any update on this topic? We're also interested in this.
Can you please elaborate on the use case, especially how you want to read?
@davidrabinowitz our use case is streaming reads of incremental data from BigQuery tables, something like spark.readStream.format("bigquery").option("inc_col", "create_time"), where we can configure the incremental column so that each run only reads the newly added data. Do we support this now? Any suggestions?
If not, how can I do it with the current connector? Any thoughts?
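One possible workaround with the current batch connector is to poll with a self-managed watermark. This is not a connector feature, just a hedged sketch with placeholder table, path, and column names:

```python
import time

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Placeholder starting value; in practice persist the watermark externally
# so a restarted job resumes where it left off.
watermark = "1970-01-01 00:00:00"

while True:
    batch = (
        spark.read.format("bigquery")
        .option("filter", f"create_time > '{watermark}'")
        .load("my_project.my_dataset.events")
    )
    if batch.head(1):  # only act when new rows have arrived
        batch.write.format("delta").mode("append").save("/mnt/deltalake/events")
        watermark = str(batch.agg(F.max("create_time")).first()[0])
    time.sleep(300)  # poll every 5 minutes
```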