How to best query for paginated infinite scroll #106
Replies: 2 comments 4 replies
-
Fetch only the IDs in a single query, then retrieve the single element (or a batch of elements if you prefer) using the ID at a given index path. This approach is similar to how Core Data operates under the hood. Even with thousands of rows, memory usage remains minimal, especially if your IDs are integers. Additionally, this method provides a better user experience, as users can immediately get a sense of the size of the list from the scroll bar.
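The ID-first strategy above can be sketched in plain Swift. Everything here is hypothetical scaffolding, not part of any library: `IDPager`, its `fetchBatch` closure (which stands in for a query like `SELECT … WHERE id IN (…)`), and the batching policy are all illustrative.

```swift
// Hypothetical sketch: all row IDs are loaded up front (cheap even for
// thousands of integer IDs), and full records are fetched lazily in
// batches as the scroll position reaches them.
struct IDPager<Element> {
    let ids: [Int64]                        // e.g. result of SELECT id FROM item ORDER BY …
    let batchSize: Int
    let fetchBatch: ([Int64]) -> [Element]  // e.g. SELECT … WHERE id IN (…)
    var cache: [Int64: Element] = [:]

    // Known up front, so the scroll bar can reflect the full list size.
    var count: Int { ids.count }

    mutating func element(at index: Int) -> Element? {
        let id = ids[index]
        if let cached = cache[id] { return cached }
        // Cache miss: fetch the whole batch containing this index.
        let start = (index / batchSize) * batchSize
        let end = min(start + batchSize, ids.count)
        let batchIDs = Array(ids[start..<end])
        for (batchID, element) in zip(batchIDs, fetchBatch(batchIDs)) {
            cache[batchID] = element
        }
        return cache[id]
    }
}
```

Because the full ID array is resident, random access (e.g. fast scrubbing of the scroll bar) only ever costs one batch query per missed batch.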
-
Hi @cameronmcefee, we don't have any tools built into the library for handling pagination. I personally think you should start by just loading everything into memory and seeing how it goes. If it turns out there is no performance problem (make sure to run in release, not debug), then you've saved yourself a lot of work.

Further, our tools make it very easy to load just the small bits of data the rows in your list need. For example, if your table has many columns and some columns hold lots of data, but all you need is the title to display in the row, then you can simply do:

```swift
@FetchAll(Item.select(\.title)) var itemTitles
```

And if you need two or more columns from the table, then you can define a custom `@Selection`:

```swift
@Selection
struct Row {
  let title: String
  let isFlagged: Bool
}

@FetchAll(
  Item.select {
    Row.Columns(title: $0.title, isFlagged: $0.isFlagged)
  }
) var rows
```

This greatly reduces the amount of decoding that has to be done from SQLite and reduces memory usage.
-
Hello! I'm curious if someone can guide me as to the best approach to querying data for infinite scrolling content. In theory the schema specifics aren't critical info here, and I'm not asking for code. I'm interested in the strategy, not the nuance. Let's just assume there's an ordered list of fairly simple records (but possibly thousands of them), and as you get to the bottom of the list you want to load N more items.
The most straightforward approach for me to reason about is that you're always fetching from 0 to a limit, where the limit is a variable you increase for each new query. Is SQLite fast enough that this is viable to a reasonable extent, or do the queries begin to slow down noticeably as the quantity increases?
Alternative 1: Have an offset and a limit, and continually move the offset, but that gets into virtualizing the list, which I would love to avoid.
Alternative 2: Do the queries in limited chunks and append the results to an ever-growing array somewhere, but that introduces a new thing to keep in sync.
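For what it's worth, Alternative 2 can be sketched in a few lines of Swift. The names here (`ChunkedLoader`, `loadChunk`) are made up for illustration; `loadChunk` stands in for a query like `SELECT … ORDER BY position LIMIT :limit OFFSET :offset`:

```swift
// Hypothetical sketch of chunked loading: fixed-size pages are appended
// to a growing in-memory array as the user nears the bottom of the list.
struct ChunkedLoader<Element> {
    let chunkSize: Int
    let loadChunk: (_ limit: Int, _ offset: Int) -> [Element]
    var items: [Element] = []
    var isExhausted = false

    mutating func loadMore() {
        guard !isExhausted else { return }
        let chunk = loadChunk(chunkSize, items.count)
        items.append(contentsOf: chunk)
        // A short chunk means the query ran past the end of the table.
        if chunk.count < chunkSize { isExhausted = true }
    }
}
```

The "new thing to keep in sync" concern is real: if rows are inserted or deleted underneath this array, the offsets drift, which is one reason the ID-first approach in the other reply can be easier to reason about.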
And then there's the question of making it bi-directional which I assume might change the reasoning.
I don't have a CS background but I feel like this is the sort of thing that someone with a CS degree would be able to say "well obviously you start by trying X". Can anyone set me in the right direction?
Thanks for your guidance!