jeremy ardley wrote:

> the 2048 is tokens which is approximately the number of
> words in a prompt, so not character count.

Ah, right.
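For what it's worth, a rough sketch of the words-to-tokens relation; the ~4-characters-per-token rule of thumb below is an assumption, not llamafile's actual tokenizer:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.

    This is only a heuristic; the real count from the model's
    tokenizer (llama.cpp / llamafile) can differ noticeably.
    """
    return max(1, len(text) // 4)

print(estimate_tokens("the 2048 is tokens, approximately the number of words"))
```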

> The context explains how you want it to respond and the
> prompt is the actual question.

See the other mail; I don't know if the labels are supposed
to look a certain way, or anything like that?
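If it helps, in an OpenAI-style chat request (llamafile's server exposes a compatible endpoint) the context usually goes in a "system" message and the question in a "user" message. A minimal sketch; the model name and the wording are placeholders:

```python
# Hypothetical request body: "context" = system message, "prompt" = user message.
request = {
    "model": "local-model",  # placeholder, depends on your setup
    "messages": [
        {"role": "system", "content": "Answer tersely, in plain text."},  # the context
        {"role": "user", "content": "What is a token, roughly?"},         # the prompt
    ],
    "max_tokens": 2048,
}
print(request["messages"][1]["content"])
```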

> You can massively increase the size of strings to match
> using localdocs. I know how to do this in GPT4all. I assume
> there is a localdocs equivalent in llamafile?

I don't know, let's check it out!

-- 
underground experts united
https://dataswamp.org/~incal