What do the colors on each token mean?
danielos123ssa opened this issue · 1 comment
I wish you'd clarify this, because I'm interested. Is there a legend of sorts?
I don't have the expertise to confirm exactly how the neural network processes the text, but I can explain how the text is broken down into numbers for the network.
For example, the word `emotionless` is actually 3 tokens to the network (`emo`, `tion`, `less`). By default, SD takes 75 tokens at a time, as explained in a1111's wiki here.
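If you want to check a word's split yourself, here is a minimal sketch using the Hugging Face `transformers` library (an assumption on my part: SD 1.x uses OpenAI's CLIP ViT-L/14 tokenizer, and the exact subword pieces depend on its BPE vocabulary):

```python
from transformers import CLIPTokenizer

# Assumption: Stable Diffusion 1.x uses OpenAI's CLIP ViT-L/14 tokenizer.
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

# Show the subword pieces a prompt word is broken into. The exact split
# depends on the BPE vocabulary, so treat the output as illustrative.
pieces = tokenizer.tokenize("emotionless")
print(pieces)

# The pieces are then mapped to the integer IDs the network actually sees.
print(tokenizer.convert_tokens_to_ids(pieces))
```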
If you put `emotionless` at the very end of your first chunk, it will be split like this (see the sketch after the list):
- Chunk 1, Token 74: emo
- Chunk 1, Token 75: tion
- Chunk 2, Token 1: less
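To make that boundary concrete, here is a minimal sketch of the naive chunking described above (assumptions: `filler` is a hypothetical stand-in for 73 arbitrary prompt tokens, and the 75-token chunk size comes from the wiki):

```python
CHUNK_SIZE = 75  # a1111's default chunk length, per its wiki

def chunk(tokens, size=CHUNK_SIZE):
    """Naive chunking: cut the token stream every `size` tokens,
    ignoring word boundaries -- which is how a word gets split."""
    return [tokens[i:i + size] for i in range(0, len(tokens), size)]

# Hypothetical token stream: 73 filler tokens followed by the three
# subword tokens of "emotionless", straddling the 75-token boundary.
tokens = ["filler"] * 73 + ["emo", "tion", "less"]
chunks = chunk(tokens)
print("chunk 1 ends with: ", chunks[0][-2:])  # ['emo', 'tion']
print("chunk 2 starts with:", chunks[1][:1])  # ['less']
```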
This might not yield the desired result: whether the network still understands the term "emotionless" as intended is uncertain. However, if the wiki is correct, splitting `emotionless` across two chunks will definitely not produce the desired outcome.
Here is a screenshot from my PR: #9