
Improve Persistence.Query Memory Usage #45

Open
to11mtm opened this issue Nov 30, 2021 · 0 comments

Comments


to11mtm commented Nov 30, 2021

The current implementation of Persistence.Query may not scale well on certain queries.

The biggest problem is that we are using .ToList() in places where large systems may return large result sets.

We have two options. The first is a 'universal' fix that can easily be implemented as an opt-in feature; the second will require a little more work but will serve users better in the long term.

  1. Change Persistence.Query methods that call .ToListAsync() to instead stream results as an IAsyncEnumerable, via https://gist.github.com/to11mtm/dc9a350080fcbcb14098c14509d70e7f
  2. Refactor methods that can use the 'batched chunk' read style used by MessagesWithBatch
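To make Option 1 concrete, here is a minimal sketch (illustrative only, not the linked gist's exact code). It assumes a LINQ provider that exposes an `AsAsyncEnumerable()` extension, as linq2db and EF Core do; `T` stands in for the journal row type:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Threading;

public static class StreamingQuery
{
    // Instead of buffering the whole result set with .ToListAsync(),
    // yield rows one at a time from the open data reader, keeping
    // memory usage bounded regardless of result size.
    public static async IAsyncEnumerable<T> StreamRows<T>(
        IQueryable<T> query,
        [EnumeratorCancellation] CancellationToken token = default)
    {
        await foreach (var row in query.AsAsyncEnumerable()
                                       .WithCancellation(token))
            yield return row;
    }
}
```

Callers would then `await foreach` over the result rather than awaiting a fully materialized list.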

Option 1 is the 'safest' and should work with all DBs, with the possible exception of SQLite (because of its tendency for readers and writers to block each other). Option 2 will perform better.
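For comparison, a self-contained sketch of the 'batched chunk' style from Option 2, assuming keyset paging on the journal's `Ordering` column (the `JournalRow` record and helper names here are hypothetical, not the MessagesWithBatch implementation):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public sealed record JournalRow(long Ordering, byte[] Payload);

public static class BatchedQuery
{
    // Keyset-paged read: each round-trip materializes at most batchSize
    // rows, so memory per query is bounded by the batch size rather
    // than the total result size.
    public static IEnumerable<JournalRow> ReadBatched(
        IQueryable<JournalRow> query, int batchSize)
    {
        long lastOrdering = long.MinValue;
        while (true)
        {
            var batch = query
                .Where(r => r.Ordering > lastOrdering)
                .OrderBy(r => r.Ordering)
                .Take(batchSize)
                .ToList();                 // bounded: at most batchSize rows
            if (batch.Count == 0) yield break;
            foreach (var row in batch) yield return row;
            lastOrdering = batch[^1].Ordering;
        }
    }
}
```

Against a real database each loop iteration becomes one round-trip; the trade-off versus Option 1 is more queries in exchange for never holding a long-lived open reader.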

I'm expecting the end state to be that most queries perform better under 'batched' reads, but one or two (mainly CurrentPersistenceIds()) will still benefit from the option to switch.
