Difference between training with one token and several tokens #692
Renaldas111 started this conversation in General · Replies: 0 comments
From what I have read, the common advice is to find a rare token and train your subject/style on it. I have tried this approach, and I have also tried training on a long word consisting of many tokens (I do this by writing long Russian words in Latin script; most of these split into 4-7 tokens and produce no consistent result in txt2img on their own). I noticed that training the concept on a long multi-token word usually gives me a better result than training on a single token like "sks". A quick way to check how many tokens a candidate word splits into is shown in the sketch below.
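Here is a minimal sketch of that check, assuming the Stable Diffusion v1 text encoder, whose tokenizer is the CLIP ViT-L/14 one from Hugging Face `transformers` (the long transliterated word is just a hypothetical example, not one from this thread):

```python
from transformers import CLIPTokenizer

# Tokenizer used by the Stable Diffusion v1 text encoder
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

# "dostoprimechatelnost" is a made-up example of a transliterated Russian word
for word in ["sks", "dostoprimechatelnost"]:
    # add_special_tokens=False drops BOS/EOS so only the word's own tokens count
    ids = tokenizer.encode(word, add_special_tokens=False)
    print(f"{word!r} -> {len(ids)} token(s): {tokenizer.convert_ids_to_tokens(ids)}")
```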
My intuition tells me this should happen: with more tokens there is more room to "encode" the concept, since each token carries its own embedding.
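As a rough illustration of that intuition, and only under the assumption of textual inversion on the Stable Diffusion v1 text encoder (where each placeholder token embedding is 768-dimensional and is itself trained), the trainable capacity scales linearly with the token count:

```python
# Rough capacity comparison for textual inversion (assumption:
# SD v1 text encoder with 768-dimensional token embeddings).
EMBEDDING_DIM = 768

for n_tokens in (1, 5, 10):
    print(f"{n_tokens:>2} placeholder token(s) -> "
          f"{n_tokens * EMBEDDING_DIM} trainable embedding values")
```

For Dreambooth-style training, where the UNet weights are updated rather than the token embeddings, this count does not apply directly, though a multi-token identifier may still give the model more distinct positions to attach the concept to.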
Am I right? Has anyone tried training a concept on a "word" of 5-10 tokens and compared the result with training on a single token?