Commit
feat(sync-service): Streaming the response to the client without waiting for the snapshot to finish (#1517)

Previously, shapes with one million rows or more (170MB+) would time out waiting for the snapshot to be created. Now the snapshot is streamed from the database into storage while simultaneously being streamed from storage to the client, so the first packets of the response can be sent before the snapshot has finished. As a result we now support tables with over a million rows. Ilia is currently benchmarking this branch to see what the new limits are.

This PR addresses #1438 and #1444.

I think there are quite a few simplifications that could happen off the back of this change, but I have kept the refactoring to a minimum in this PR and will instead address the simplifications in separate PRs.
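Conceptually, the change swaps "write the whole snapshot, then respond" for "start the snapshot writer, then immediately stream whatever has already landed in storage". Below is a minimal sketch of that control flow using Plug for the HTTP side; `Storage.write_snapshot!/1` and `Storage.stream_snapshot/1` are hypothetical stand-ins, not the actual Electric storage API:

defmodule SnapshotStreamingSketch do
  import Plug.Conn

  def serve_snapshot(conn, shape) do
    # Hypothetical: kick off the snapshot write in the background...
    Task.start(fn -> Storage.write_snapshot!(shape) end)

    # ...and start responding straight away, chunking rows to the client as
    # they appear in storage (the "follow the writer" logic is what
    # Electric.ConcurrentStream, shown further down in this diff, provides).
    conn = send_chunked(conn, 200)

    Storage.stream_snapshot(shape)
    |> Enum.reduce_while(conn, fn batch, conn ->
      case chunk(conn, batch) do
        {:ok, conn} -> {:cont, conn}
        {:error, :closed} -> {:halt, conn}
      end
    end)
  end
end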
Showing 13 changed files with 458 additions and 164 deletions.
@@ -0,0 +1,5 @@
---
"@core/sync-service": patch
---

Support larger shapes (1 million rows, 170MB+) and faster time to first byte
@@ -0,0 +1,51 @@
defmodule Electric.ConcurrentStream do
  @default_poll_time 10

  @doc """
  Allows concurrent reading of a stream while it is being written.

  There can be multiple reading processes, but there must be only one writing process.
  The writing process must append an end marker to the end of the stream when it has
  finished, to signal to the reading processes that the stream has ended.

  If a reading process runs out of data to read before the end marker has been written,
  it waits `poll_time_in_ms` for more data to be written, then resumes the stream
  with the `stream_fun`.
  """

  def stream_to_end(opts) do
    excluded_start_key = Keyword.fetch!(opts, :excluded_start_key)
    end_marker_key = Keyword.fetch!(opts, :end_marker_key)
    stream_fun = Keyword.fetch!(opts, :stream_fun)

    stream_fun.(excluded_start_key, end_marker_key)
    |> continue_if_not_ended(excluded_start_key, opts)
  end

  defp continue_if_not_ended(stream, latest_key, opts) do
    end_marker_key = Keyword.fetch!(opts, :end_marker_key)
    stream_fun = Keyword.fetch!(opts, :stream_fun)
    poll_time_in_ms = Keyword.get(opts, :poll_time_in_ms, @default_poll_time)

    [stream, [:premature_end]]
    |> Stream.concat()
    |> Stream.transform(latest_key, fn
      :premature_end, latest_key ->
        # The end marker has not been seen yet: wait for more items to be
        # written, then resume reading from the latest key already emitted.
        Process.sleep(poll_time_in_ms)

        stream =
          stream_fun.(latest_key, end_marker_key)
          |> continue_if_not_ended(latest_key, opts)

        {stream, latest_key}

      {^end_marker_key, _}, _latest_key ->
        # The writer has finished: stop the stream.
        {:halt, :end_marker_seen}

      {key, _value} = item, _latest_key ->
        # Emit the item and remember its key so reading can resume from it.
        {[item], key}
    end)
  end
end
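For a sense of how `stream_to_end/1` behaves, here is a self-contained usage sketch with an ETS ordered set standing in for the snapshot storage. The keys, the `:snapshot_end` marker and the `stream_fun` below are illustrative assumptions only, not how Electric's real storage backend works:

# Illustrative only: an ETS ordered set plays the role of snapshot storage.
table = :ets.new(:snapshot_sketch, [:ordered_set, :public])

# Single writer: inserts rows one by one, then appends the end marker.
Task.start(fn ->
  for key <- 1..1_000 do
    Process.sleep(1)
    :ets.insert(table, {key, "row #{key}"})
  end

  :ets.insert(table, {:snapshot_end, nil})
end)

# stream_fun reads everything stored after `excluded_start_key`, in key order.
# Integer keys sort before the :snapshot_end atom in Erlang term order, so the
# end marker is always read last.
stream_fun = fn excluded_start_key, _end_marker_key ->
  Stream.unfold(excluded_start_key, fn key ->
    case :ets.next(table, key) do
      :"$end_of_table" ->
        nil

      next_key ->
        [{_, value}] = :ets.lookup(table, next_key)
        {{next_key, value}, next_key}
    end
  end)
end

# The reader starts immediately and keeps polling until it sees the end
# marker, even though the writer is still inserting rows.
rows =
  Electric.ConcurrentStream.stream_to_end(
    excluded_start_key: 0,
    end_marker_key: :snapshot_end,
    stream_fun: stream_fun
  )
  |> Enum.to_list()

length(rows)
#=> 1000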