Replies: 1 comment
-
I don't think the prefetcher would get in the way of that. The prefetcher takes over the last few steps of the transform chain, so enabling it changes the transforms a bit. It gives a small speed boost, but if you can't figure out how to use it with your changes it's fine to disable it; see the relevant cases in the train script and in create_loader/transforms.
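A framework-free sketch of the point above, using hypothetical names: when a prefetch-style wrapper is enabled, it absorbs the tail of the transform chain (in timm that tail runs on GPU, overlapped with compute), so the per-sample transforms must be built differently. Both configurations should still produce the same values.

```python
MEAN, STD = 0.5, 0.25  # illustrative normalization constants

def decode(x):
    # Stand-in for the decode/augment steps at the front of the chain.
    return float(x)

def normalize(x):
    # The "tail" transform that the prefetcher absorbs.
    return (x - MEAN) / STD

def make_samples(raw, use_prefetcher):
    # Without a prefetcher, normalization stays in the per-sample chain;
    # with one, the chain stops early and the wrapper finishes the job.
    chain = [decode] if use_prefetcher else [decode, normalize]
    for x in raw:
        for t in chain:
            x = t(x)
        yield x

class PrefetchWrapper:
    """Wraps a loader and applies the absorbed tail transform itself."""
    def __init__(self, loader):
        self.loader = loader
    def __iter__(self):
        for x in self.loader:
            yield normalize(x)

raw = [0, 1, 2]
plain = list(make_samples(raw, use_prefetcher=False))
wrapped = list(PrefetchWrapper(make_samples(raw, use_prefetcher=True)))
assert plain == wrapped  # same end-to-end pipeline either way
```

This is why the transforms built in create_loader differ depending on the prefetcher flag: the same overall pipeline is just split at a different point.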
-
Hello,
I'd like to know what the prefetcher is. My understanding is that it preloads the images so training can run faster.
I tried to modify the train script to do progressive learning (a different input size at different epochs), so I moved the loader creation inside the epoch loop, but then training stopped working. If I disable the prefetcher it works fine. I'd like to know why.
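The per-epoch loader pattern described above can be sketched like this. All names here are hypothetical stand-ins: `make_loader` plays the role of timm's create_loader (where, in real code, you would pass use_prefetcher=False if the prefetcher conflicts with your modified transform chain), and the schedule values are illustrative.

```python
def size_for_epoch(epoch, schedule=((0, 128), (5, 160), (10, 192))):
    # Pick the largest input size whose start epoch has been reached.
    size = schedule[0][1]
    for start, s in schedule:
        if epoch >= start:
            size = s
    return size

def make_loader(input_size):
    # Stand-in loader: yields (sample, size) pairs. Real code would
    # build a DataLoader whose transforms match `input_size`.
    return [(i, input_size) for i in range(2)]

sizes_seen = []
for epoch in range(12):
    # Rebuild the loader every epoch so its transforms use this
    # epoch's input size, instead of hoisting one loader out of the loop.
    loader = make_loader(size_for_epoch(epoch))
    for _, size in loader:
        sizes_seen.append(size)

assert sizes_seen[0] == 128 and sizes_seen[-1] == 192
```

Rebuilding the loader each epoch is cheap relative to an epoch of training; the failure you saw is likely the prefetcher wrapper being set up for a transform chain that no longer matches the one built inside the loop.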