Is there any way to increase the context window?
ZackBradshaw opened this issue · 4 comments
ZackBradshaw commented
Token indices sequence length is longer than the specified maximum sequence length for this model (356885 > 4096). Running this sequence through the model will result in indexing errors
ZackBradshaw commented
No. I just cleaned my data better, but I'm still having loads of other issues finetuning this. I've since switched models and am reworking the dataset.
…On Friday, August 9, 2024, Takumi_Oshita ***@***.***> wrote:
Did you find any solution about this problem?
osttkm commented
Thank you for your reply.
It seems that the only option is to delete the relevant data from the JSON file. Thank you for sharing.
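For reference, a minimal sketch of that workaround: drop entries from the JSON dataset whose token count exceeds the model's 4096-token window. The field name `"text"` is a placeholder, and the whitespace split is only a rough stand-in for the model's real tokenizer, which you should use for exact counts.

```python
import json

# Maximum sequence length reported in the error above.
MAX_LEN = 4096

def approx_token_count(text: str) -> int:
    """Rough token estimate via whitespace split; a stand-in for the
    model's actual tokenizer (use that for exact counts)."""
    return len(text.split())

def filter_dataset(entries: list[dict], key: str = "text") -> list[dict]:
    """Keep only entries whose (approximate) token count fits the window."""
    return [e for e in entries if approx_token_count(e.get(key, "")) <= MAX_LEN]

if __name__ == "__main__":
    # Placeholder file name; in practice: load, filter, and rewrite your dataset.
    entries = [{"text": "a short training example"}, {"text": "word " * 5000}]
    kept = filter_dataset(entries)
    print(json.dumps(kept[:1]))  # only the short entry survives
```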
Lyken17 commented
We will release a long-context version soon. Please stay tuned.