
It would be a good idea to add a slightly longer LLM max length (tokens) #9

Open
LadyFlames opened this issue Aug 28, 2024 · 3 comments

Comments

@LadyFlames

I think it might be a good idea to make the LLM max length a bit longer, or rather to let the LLM finish before it stops, because half of the LLM's answer is missing or incomplete most of the time, and it sometimes takes several dozen calls to the LLM before it gives a complete answer. It should be changed to something like 750-1000 max length (tokens), maybe even higher.

[screenshot: incomplete]

@xlinx
Owner

xlinx commented Aug 29, 2024

Okay, got it. The limit is now 5k, and top_k and top_p will also come in the next version.
When you click the generate button, the result is sent into SD. Do you really want to send more than 1000 tokens into the SD CLIP encoder?

Update...
After trying it with FLUX: hmm, so many tokens bring so many details.
Yep, we need it.
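For context on the CLIP encoder concern above: SD's CLIP text encoder has a fixed context of 77 positions (75 content tokens plus start/end markers), which is why UIs in the A1111/Forge family typically split a longer prompt into 75-token chunks and encode each chunk separately. A minimal, hypothetical sketch of that chunking (not this extension's actual code):

```python
def chunk_tokens(token_ids, chunk_size=75):
    """Split a long list of token IDs into CLIP-sized chunks.

    CLIP's text encoder accepts 77 positions: 75 content tokens plus
    the start/end special tokens, so each chunk holds up to 75 IDs.
    """
    return [token_ids[i:i + chunk_size]
            for i in range(0, len(token_ids), chunk_size)]

# A 1000-token LLM answer becomes 14 chunks: 13 full chunks of 75
# tokens and one final partial chunk of 25.
chunks = chunk_tokens(list(range(1000)))
print(len(chunks))      # 14
print(len(chunks[-1]))  # 25
```

So a long LLM answer is not simply truncated at 75 tokens, but each extra chunk costs another encoder pass, which is the trade-off behind the "do you really want more than 1000 tokens" question.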

@LadyFlames
Author

LadyFlames commented Sep 2, 2024

I did some testing myself in SD Forge using Flux. I got this result with 500 max length tokens, top_k 20, top_p 0.9, and LLM temperature 0.7:

[image: 00011-42929162]

A higher max length (tokens) might work better with Flux.
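For anyone else tuning these knobs, here is a rough, hypothetical sketch of what top_k and top_p do to the next-token distribution (illustrative only, not this extension's code): top_k keeps only the k most likely tokens, then top_p trims that set to the smallest prefix whose cumulative probability reaches p, and the result is renormalized before sampling.

```python
def top_k_top_p_filter(probs, top_k=20, top_p=0.9):
    """Apply top-k then nucleus (top-p) filtering to a distribution.

    `probs` maps token -> probability. Keep the top_k most likely
    tokens, cut that list at the smallest prefix whose cumulative
    probability >= top_p, and renormalize so the result sums to 1.
    """
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    kept, cum = [], 0.0
    for tok, p in ranked:
        kept.append((tok, p))
        cum += p
        if cum >= top_p:
            break
    total = sum(p for _, p in kept)
    return {tok: p / total for tok, p in kept}

# "a" + "b" only reach 0.8 < 0.9, so "c" is also kept; "d" is dropped.
filtered = top_k_top_p_filter(
    {"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}, top_k=3, top_p=0.9)
print(sorted(filtered))  # ['a', 'b', 'c']
```

Temperature acts earlier in the pipeline, flattening (above 1.0) or sharpening (below 1.0) the distribution before this filtering, which is why 0.7 gives more focused output.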

@xlinx
Owner

xlinx commented Sep 2, 2024

It's 5000 now. The max token limit was raised to 5000 a few days ago.
