
K fold cross validation? #1152

The topk parameter controls how accuracy is measured: a prediction counts as correct if the true label is among the model's k most likely predictions. For example, if the label is "boat" and the model predicts that the two most likely labels are, in order, ["bird", "boat"], then the prediction is wrong in the sense of ordinary (top-1) accuracy -- "bird" is not the same as "boat" -- but correct in the top-2 sense.

For ImageNet, it is quite common to report top-5 accuracy alongside "normal" (top-1) accuracy.
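
As a rough illustration (not the exact code used in timm's validation script), here is a minimal sketch of how top-k accuracy can be computed with PyTorch; the function name `topk_accuracy` is just for this example:

```python
import torch

def topk_accuracy(logits: torch.Tensor, labels: torch.Tensor, k: int = 5) -> float:
    """Fraction of samples whose true label is among the k highest-scoring predictions."""
    # indices of the k largest logits per sample, shape (batch, k)
    topk_preds = logits.topk(k, dim=1).indices
    # a sample is "correct" if any of its k candidates matches the true label
    correct = (topk_preds == labels.unsqueeze(1)).any(dim=1)
    return correct.float().mean().item()

# toy example: the true class "boat" (index 1) is only the second most likely,
# so top-1 accuracy is 0.0 but top-2 accuracy is 1.0
logits = torch.tensor([[2.0, 1.5, 0.1]])   # scores for ["bird", "boat", "car"]
labels = torch.tensor([1])                  # true class: "boat"
print(topk_accuracy(logits, labels, k=1))   # 0.0
print(topk_accuracy(logits, labels, k=2))   # 1.0
```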

Answer selected by rwightman
This discussion was converted from issue #1151 on February 21, 2022 23:28.