The perplexity tool returns unexpected ppl results #65

Open
fefang opened this issue Oct 18, 2024 · 1 comment

fefang commented Oct 18, 2024

Commit id: 1d310938cd705b467293b230bb04eb44d77d78f5.
I tried to use the perplexity tool to evaluate the ppl of my model, but the tool returns a ppl value of NaN, which is unexpected.
My command is as follows:
/llama-perplexity -m model.gguf -f wikitext-2-raw/wiki.test.raw

My GGUF model works fine and gives reasonable results most of the time when I test it with some customized prompts.
So I wonder whether the perplexity tool works correctly in this project?

Can you help me look into this issue?

QingtaoLi1 (Contributor) commented

@fefang Sorry for the late reply. Which model are you testing? We have found the ppl to be abnormal when testing some EfficientQAT models, mostly g128 models, but in those cases the value is not NaN, just very large. The cause is a mismatched vocab size. See #70 for more details.
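To illustrate why a vocab-size mismatch inflates perplexity, here is a minimal sketch assuming the standard definition ppl = exp(-mean log p(token)). This is not the llama.cpp implementation, and the vocab sizes below are hypothetical; it only shows that spreading probability mass over a larger vocabulary drives the value up.

```python
import math

def perplexity(token_logprobs):
    """Perplexity from per-token natural log-probabilities."""
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# A model that is uniformly uncertain over a vocab of size V has ppl == V,
# so if logits are spread over a larger (mismatched) vocab, ppl explodes.
small_vocab = [math.log(1 / 32000)] * 10    # hypothetical 32k vocab
large_vocab = [math.log(1 / 128000)] * 10   # hypothetical larger vocab

print(round(perplexity(small_vocab)))   # 32000
print(round(perplexity(large_vocab)))   # 128000
```

A NaN rather than a merely huge value usually points at a zero or negative probability somewhere (log of 0 is -inf, and further arithmetic can turn that into NaN), which is also consistent with logits and tokenizer vocab not lining up.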
