How long is it supposed to take to train on ImageNet21k for 90 epochs with 8 V100 GPUs? #1556
Phuoc-Hoan-Le asked this question in Q&A (unanswered)
Question:
I cannot reproduce the finetuning results on ImageNet-1k after pretraining on ImageNet-21k. How long does it usually take to pretrain on ImageNet-21k with 8 V100 GPUs to match the ImageNet-1k finetuning performance you currently report for ViT-B/S/T?

Following DeiT III: Revenge of the ViT (https://arxiv.org/pdf/2204.07118.pdf), exactly how many hours does it take to pretrain for 90 epochs on ImageNet-21k with 8 V100 GPUs?
Reply:
@CharlesLeeeee it depends entirely on the specific model and its throughput on the V100s. ImageNet-22k/21k (or its filtered subsets) is in the 12-14M sample range, roughly 10x ImageNet-1k, so you can get a rough total by multiplying your ImageNet-1k epoch times by 10-11x.
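The scaling in the reply converts directly into a back-of-envelope estimate. Here is a minimal Python sketch, assuming ~13M images for the 21k set (a midpoint of the 12-14M range above) and a hypothetical 15-minute ImageNet-1k epoch time that you would replace with your own measured number:

```python
# Back-of-envelope pretraining time estimate. The image counts and the
# example epoch time below are assumptions, not measured values.

IN1K_IMAGES = 1_281_167    # ImageNet-1k train set size
IN21K_IMAGES = 13_000_000  # assumed midpoint of the 12-14M range

def estimate_pretrain_hours(in1k_epoch_minutes: float, epochs: int = 90) -> float:
    """Scale a measured ImageNet-1k epoch time to `epochs` of 21k pretraining."""
    scale = IN21K_IMAGES / IN1K_IMAGES  # ~10.1x, matching the 10-11x rule of thumb
    return in1k_epoch_minutes * scale * epochs / 60.0

# Example: if one ImageNet-1k epoch takes 15 min on your 8xV100 setup
# (a placeholder figure -- measure your own), 90 epochs of 21k pretraining:
print(f"{estimate_pretrain_hours(15.0):.0f} hours")  # ~228 hours
```

The answer is dominated by your actual per-GPU throughput for the chosen model, so measure one ImageNet-1k epoch first and scale from there.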