Code commit id: 1d310938cd705b467293b230bb04eb44d77d78f5.
I tried to use the perplexity tool to evaluate the ppl of my model, but the tool returns a ppl value of nan, which is unexpected.
My command is as follows:
./llama-perplexity -m model.gguf -f wikitext-2-raw/wiki.test.raw
My gguf works fine and gives reasonable results most of the time when I test it with customized prompts.
So I wonder whether the perplexity tool works correctly in this project.
Can you help me look into this issue?
@fefang Sorry for the late reply. Which model are you testing? We have seen abnormal ppl when testing some EfficientQAT models, mostly g128 models, but in those cases the value was very large rather than NaN. The cause was a vocab size mismatch; see #70 for more details.
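In the meantime, a quick way to check for a vocab size mismatch is to compare the tokenizer's token count against the vocab dimension of the embedding/output tensors in the GGUF file. Here is a minimal sketch, assuming the `gguf` Python package (shipped with llama.cpp's gguf-py) and the usual llama.cpp tensor names:

```python
from gguf import GGUFReader

# Load the GGUF file (path is an example placeholder)
reader = GGUFReader("model.gguf")

# tokenizer.ggml.tokens is an array field; its element count is the
# tokenizer's vocab size
tokens_field = reader.fields["tokenizer.ggml.tokens"]
print(f"tokenizer vocab size: {len(tokens_field.data)}")

# Print the shapes of the embedding and output tensors; one of the
# dimensions should equal the tokenizer vocab size. Tensor names follow
# common llama.cpp conventions and may differ for other architectures.
for t in reader.tensors:
    if t.name in ("token_embd.weight", "output.weight"):
        print(f"{t.name}: shape = {list(t.shape)}")
```

If the two numbers disagree, that mismatch is a likely source of the nan or degenerate ppl.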