
Slow #9

Open
kronic opened this issue Feb 8, 2021 · 6 comments

Comments

@kronic

kronic commented Feb 8, 2021

There is a problem in WPF: with fast scrolling, many pages are requested, and the data is not displayed until all of them have been received.

  1. I suggest displaying data based on a stack rather than a queue.
  2. Add a custom pause before requesting a new page. If the page has changed by the time the pause is over, we do not request it.
@Yeah69
Owner

Yeah69 commented Feb 13, 2021

More information would be helpful, but if I interpret it right, then I like your idea and would like to put it into this project.
So correct me if I am wrong; here comes my interpretation:
At first, just to get the concept across, we'll ignore preloading for simplicity (no preloading). The same goes for page removal (hoarding; no LRU or custom page removal logic). The chosen fetcher kinds shouldn't matter much. But your suggestion is only relevant for async index access.
From the user's perspective, this is what should happen:

  • The user scrolls to approximately the desired position (for example by dragging the scroll bar with the mouse)
  • Placeholders (progress bars/rings/whatever) are shown instantly everywhere
  • Where the user stops, the placeholders of those items in the ItemsControl are replaced by the "right" thing first

That is why I think that your suggestion will never be relevant for sync index access: without placeholders the UI would block until the first page is loaded anyway.

So technically speaking:

  • A lot of index requests arrive in a relatively small time frame
  • We generate (async) pages appropriately
    • As always, the placeholders would be created right away
    • However, instead of triggering the fetch of the right items from the backend in parallel right away (which leads to queue-like behavior/First In, First Out, FIFO)
    • We'll delay it (the second point of your suggestion)
    • This delay gives us time to collect the pending backend requests, and we'll prioritize the most recent first (which leads to stack-like behavior/Last In, First Out, LIFO) => the first point of your suggestion (see the sketch below)
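
To illustrate the idea, here is a minimal conceptual sketch of collecting the page requests that arrive during the pause and then dispatching them most-recent-first. None of the names below are BFF.DVC API; they are made up just for this comment.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Conceptual sketch: requests arriving during a short delay window are collected
// and then dispatched most-recent-first (LIFO), so the pages the user is currently
// looking at are fetched before the ones scrolled past.
public sealed class ThrottledLifoPageRequester
{
    private readonly TimeSpan _delay;
    private readonly Func<int, Task> _fetchPage;   // illustrative backend fetch delegate
    private readonly Stack<int> _pending = new();
    private readonly object _lock = new();
    private bool _dispatchScheduled;

    public ThrottledLifoPageRequester(TimeSpan delay, Func<int, Task> fetchPage)
    {
        _delay = delay;
        _fetchPage = fetchPage;
    }

    // Called whenever an index access hits a page that is not loaded yet.
    public void Request(int pageKey)
    {
        lock (_lock)
        {
            _pending.Push(pageKey);
            if (_dispatchScheduled) return;
            _dispatchScheduled = true;
        }
        _ = DispatchAfterDelayAsync();
    }

    private async Task DispatchAfterDelayAsync()
    {
        // The pause from suggestion 2: fast scrolling piles up requests here.
        await Task.Delay(_delay).ConfigureAwait(false);

        List<int> batch;
        lock (_lock)
        {
            batch = new List<int>(_pending); // a Stack enumerates most recent first
            _pending.Clear();
            _dispatchScheduled = false;
        }

        var seen = new HashSet<int>();
        foreach (var pageKey in batch)          // LIFO order (suggestion 1)
        {
            if (!seen.Add(pageKey)) continue;   // skip duplicate requests for the same page
            await _fetchPage(pageKey).ConfigureAwait(false);
        }
    }
}
```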

Like I mentioned before, I like that idea (thank you) and I would also like to integrate it into BFF.DVC. But I guess it will take some effort, so I cannot guarantee that it will get done quickly. On top of that, unlike the naive configuration I chose for this comment, in the end I cannot just ignore preloading and page removal. I'll have to decide what to do about them: either I'll figure out how to integrate these configurations, or I'll exclude them for the stack/LIFO mode. Another decision is whether to transition from FIFO to LIFO completely or to offer both. That's all interesting and I am excited. What I know for sure is that this feature will mark a new major release!

Until then, if you cannot wait that long, I think you could, to some extent, accomplish the desired behavior with the current version of BFF.DVC. The page fetchers give control from BFF.DVC back to the using developer/project (in this case you). Instead of fetching the page right away, you could delay and prioritize it yourself (a rough sketch follows below). Of course it would have some drawbacks; for example, it would become complicated with LRU page removal and so on. But if you cannot wait, I would suggest such an experiment in the meantime. However, if I am successful with this issue, then the effort of that experiment will go to waste, so you'll have to decide whether it is worth your time.
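
For illustration, such an experiment could look roughly like the following. This is only a sketch; the class and delegate shapes are made up and are not the BFF.DVC fetcher signatures.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Sketch of the user-side workaround: the delegate the app hands to its page
// fetcher waits a short pause first; if a newer page was requested in the
// meantime, this request is considered stale and steps back in line once more
// before hitting the backend. Every page still gets fetched eventually.
public sealed class DelayingPageFetcher<T>
{
    private readonly TimeSpan _pause;
    private readonly Func<int, int, Task<T[]>> _backendFetch; // (offset, count) -> items
    private int _latestRequestedOffset = -1;

    public DelayingPageFetcher(TimeSpan pause, Func<int, int, Task<T[]>> backendFetch)
    {
        _pause = pause;
        _backendFetch = backendFetch;
    }

    public async Task<T[]> FetchPageAsync(int offset, int count)
    {
        Interlocked.Exchange(ref _latestRequestedOffset, offset);

        // The custom pause: give fast scrolling a chance to overtake this request.
        await Task.Delay(_pause).ConfigureAwait(false);

        // If another page was requested during the pause, the user has scrolled on;
        // deprioritize this (now stale) request by waiting once more, so the page
        // under the cursor reaches the backend first.
        if (Volatile.Read(ref _latestRequestedOffset) != offset)
            await Task.Delay(_pause).ConfigureAwait(false);

        return await _backendFetch(offset, count).ConfigureAwait(false);
    }
}
```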

By the way, I like the irony of the title of the issue being "Slow" and then suggesting pauses to make it faster. 👍

@Yeah69
Owner

Yeah69 commented Feb 14, 2021

Intermediate results:
I think that I've come pretty far this weekend, but nothing pushable so far. I'll need at least the next weekend, but I am on it.
And I cannot promise anything, but it seems that it won't be too complicated.

@kronic
Author

kronic commented Feb 15, 2021

@Yeah69 Yes, you got it right. You can take your time; the main thing is a high-quality implementation.

@kronic
Author

kronic commented Feb 15, 2021

In general, is it worth relying on fixed-width pages? For example, I drag the slider to a certain place with the mouse and land on a page break.
As I understand it, two requests will go to the backend. I would do one of the following:
Option 1 (simple): combine the two queries into one.
Option 2 (more difficult): set the page boundaries by a "from" and "to" index.
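
To make the page-break case concrete, here is a small illustrative sketch (plain arithmetic with made-up numbers, not BFF.DVC code) of how a visible window that straddles a break touches two fixed-size pages, and what the combined query of option 1 would cover:

```csharp
using System;

// Illustrative arithmetic: a visible index window that straddles a page break
// touches two fixed-size pages, so two backend requests would go out;
// option 1 would merge them into one range query.
const int pageSize = 100;

// Example: dragging the slider makes indices 950..1049 visible.
int firstIndex = 950, lastIndex = 1049;

int firstPage = firstIndex / pageSize;   // 9
int lastPage  = lastIndex  / pageSize;   // 10 -> the window crosses a page break

// Without merging: one request per touched page (two requests here).
for (int page = firstPage; page <= lastPage; page++)
    Console.WriteLine($"fetch page {page}: offset {page * pageSize}, count {pageSize}");

// Option 1: a single combined request covering both pages.
int offset = firstPage * pageSize;                   // 900
int count  = (lastPage - firstPage + 1) * pageSize;  // 200
Console.WriteLine($"combined fetch: offset {offset}, count {count}");
```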

Yeah69 added commits that referenced this issue Feb 27, 2021
@Yeah69
Owner

Yeah69 commented Mar 1, 2021

@kronic I've released the NuGet package 3.3.3, which contains the proposed changes.
You can configure the functionality after configuring the fetchers and before the index access. The function is called ThrottledLifoPageRequests. It is the first optional configuration (the default is immediate page requests), but I want to restructure the API soon and make configurations like preloading and page removal behavior optional as well. That will mean breaking changes, so I'll wait until then for the major version increment.

I want to answer your last comment thoroughly later. Short answer for now: I don't want to assume too much about how BFF.DVC is used (I want to support as many use cases as possible), so I am going for high configurability. Fixed-size pages help me a lot to keep the complexity low and at the same time the configurability high.

@kronic
Author

kronic commented Mar 1, 2021

@Yeah69 Thanks, I will test
