jeremy ardley wrote:

>> Here, in this thread, the context thing with respect to AI,
>> anyone having any luck knowing what to do with that? It is
>> mentioned 14 times in llamafile(1) but not how to actually
>> set it up with your own data?
>
> One way to set context is via the HTTP API, which is OpenAI
> compatible.
>
> You create the queries using Python and include your context
> with the query. It may be able to remember context so you
> only need to send it once, but running locally you can just
> resend the context with each query.

You can, but how much data can you resend that way?

So this is the context? You mean include it in the prompt?
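
I suppose that would look something like this, assuming
llamafile is serving its OpenAI-compatible API on localhost
port 8080 (the default):

  #!/usr/bin/env python3
  # Sketch: send a query plus context to a local llamafile
  # server over its OpenAI-compatible HTTP API. Assumes the
  # server runs on the default port 8080.
  import requests

  CONTEXT = "Your own data goes here; resent with every query."

  def ask(question):
      resp = requests.post(
          "http://localhost:8080/v1/chat/completions",
          json={
              "model": "LLaMA_CPP",  # placeholder; a single-model
                                     # local server mostly ignores it
              "messages": [
                  {"role": "system", "content": CONTEXT},
                  {"role": "user", "content": question},
              ],
          },
      )
      resp.raise_for_status()
      return resp.json()["choices"][0]["message"]["content"]

  print(ask("What does the context say?"))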

Then it is easier to find in the llamafile(1) man page; it is
probably this:

     -c N, --ctx-size N
             Set the size of the prompt context. A larger
             context size helps the model to better comprehend
             and generate responses for longer input or
             conversations. The LLaMA models were built with
             a context of 2048, which yields the best results
             on longer input / inference.

The unit is tokens, not bytes: 2048 means 2048 tokens, where a
token is a word or piece of a word, so roughly 1500 English
words.

Okay, I can try that right now just by inserting all the data
from a file into the query (the prompt) and asking. And everyone
can try that, actually.
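
For example, a sketch along these lines (the file name and
question are made up, and the server is assumed to run on the
default port 8080 as above):

  #!/usr/bin/env python3
  # Sketch: read a whole file and insert it into the prompt as
  # context. The context window is counted in tokens, so start
  # llamafile with e.g. -c 4096 if the file is long.
  import requests

  with open("mydata.txt") as f:  # hypothetical data file
      data = f.read()

  prompt = ("Use the following data to answer the question.\n\n"
            + data
            + "\n\nQuestion: What does the data say?")

  resp = requests.post(
      "http://localhost:8080/v1/chat/completions",
      json={"model": "LLaMA_CPP",
            "messages": [{"role": "user", "content": prompt}]},
  )
  resp.raise_for_status()
  print(resp.json()["choices"][0]["message"]["content"])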

-- 
underground experts united
https://dataswamp.org/~incal
