On 2023-12-31 05:22, Mo Zhou wrote:
I am not
able to develop DebGPT myself, and I confess I am not going to invest
my time in learning to do so. But can we attract people who want to
tinker in this direction?
Debian funds should be able to cover the hardware and training
expenses, even if they are somewhat costly. The more expensive
resource is the time of domain experts. I could train such a model,
but I clearly do not have the bandwidth for that.
No. I changed my mind.
I can actually quickly wrap some Debian-specific prompts around an
existing chat LLM. This is easy and needs neither expensive hardware
(although inference may still require 1-2 GPUs with 24GB of memory)
nor any training procedure.
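To make the "wrap Debian-specific prompts" idea concrete, here is a minimal sketch. The prompt text, function names, and message format are illustrative assumptions, not DebGPT's actual code; the backend chat call is left out since any chat LLM accepting the common system/user message convention would do.

```python
# Sketch: front-load Debian-specific context onto a generic chat LLM.
# DEBIAN_SYSTEM_PROMPT and build_messages are hypothetical names.

DEBIAN_SYSTEM_PROMPT = (
    "You are an assistant for Debian development. Answer questions about "
    "Debian packaging, policy, and tooling, citing manpages where relevant."
)

def build_messages(user_query: str, extra_context: str = "") -> list[dict]:
    """Assemble a chat transcript that front-loads Debian-specific context."""
    messages = [{"role": "system", "content": DEBIAN_SYSTEM_PROMPT}]
    if extra_context:
        # e.g. a policy excerpt, a debian/control file, or a bug report
        messages.append({"role": "user",
                         "content": f"Context:\n{extra_context}"})
    messages.append({"role": "user", "content": user_query})
    return messages

msgs = build_messages("How do I declare a build dependency?",
                      extra_context="Excerpt from debian/control ...")
```

The point is that all Debian knowledge enters through the prompt, so no training is involved and the underlying model stays unmodified.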
The project repo is created here:
https://salsa.debian.org/deeplearning-team/debgpt
An alternative to fine-tuning would be to use RAG, i.e.
retrieval-augmented generation (with LangChain, for example).
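To illustrate what RAG means here, the following is a hand-rolled sketch: retrieve the most relevant snippets from a small Debian document corpus and stuff them into the prompt before the question. LangChain would replace the crude word-overlap scoring with embeddings and a vector store; the corpus, function names, and scoring below are purely illustrative.

```python
# Sketch of the RAG idea: retrieve, then augment the prompt.
# score/retrieve/build_prompt are hypothetical helper names.

def score(query: str, doc: str) -> int:
    """Count query words that also appear in the document (crude relevance)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Prepend retrieved snippets so the LLM answers from Debian sources."""
    snippets = "\n---\n".join(retrieve(query, corpus))
    return f"Use these Debian references:\n{snippets}\n\nQuestion: {query}"

corpus = [
    "Build-Depends lists packages needed to build the source package.",
    "The Maintainer field names the person responsible for the package.",
    "lintian checks Debian packages for policy violations.",
]
prompt = build_prompt("what does Build-Depends mean", corpus)
```

Because the relevant text is fetched at query time, the corpus can track the Debian archive and BTS without retraining anything, which is the main appeal over fine-tuning.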