Hi!
I've noticed that the most recent LLMs are already very good at
finding information, summarizing, and giving code examples about Debian
without extra training. One just needs to double-check that the answer
is correct, so as not to fall for cases where they are simply making up
plausible-sounding answers.
On Wed, Jan 03, 2024 at 01:06:25PM +0100, Andrey Rakhmatullin wrote:
> On Wed, Jan 03, 2024 at 11:33:06AM +0200, Andrius Merkys wrote:
> > On 2024-01-03 11:12, Andrey Rakhmatullin wrote:
> > > On Wed, Jan 03, 2024 at 09:58:33AM +0200, Andrius Merkys wrote:
> > > > To me the most time consuming
debgpt v0.4.90 has been uploaded to NEW, targeting unstable.
This tool is still under development; new features will be added later.
Usage examples can be found in debgpt(1) or README.md
(they are the same file).
During my (limited number of) experiments while developing this tool,
LLMs are
I have implemented the OpenAI API frontend, with streaming to the terminal
enabled. Just export your OPENAI_API_KEY to the environment if you have one,
and specify `-F openai` on the debgpt command line. It works without
the self-hosted LLM inference backend.
That means the command `debgpt none -i
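An OpenAI-backed frontend like this boils down to reading the key from the environment and printing completion chunks to the terminal as they arrive. A minimal sketch, assuming the official `openai` Python package (v1 API); `build_request`, `ask`, and the model name are illustrative, not debgpt's actual code:

```python
import os

def build_request(prompt: str, model: str = "gpt-3.5-turbo") -> dict:
    """Assemble parameters for a streaming chat-completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # yield tokens as they arrive instead of one blob
    }

def ask(prompt: str) -> None:
    """Stream a completion to the terminal, if an API key is available."""
    if not os.environ.get("OPENAI_API_KEY"):
        print("OPENAI_API_KEY not set; skipping the API call")
        return
    from openai import OpenAI  # pip install openai
    client = OpenAI()  # picks up OPENAI_API_KEY from the environment
    for chunk in client.chat.completions.create(**build_request(prompt)):
        # Each chunk carries a small delta of the reply text.
        print(chunk.choices[0].delta.content or "", end="", flush=True)
    print()
```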
On Wed, Jan 03, 2024 at 09:58:33AM +0200, Andrius Merkys wrote:
> To me the most time consuming task in Debian recently is the Python
> transitions. I wonder whether DebGPT could help with them. Maybe there are
> other, non-Debian-specific GPTs for this task, but I would prefer a Debian
> one.
As
> Installation and setup guide can be found in docs/.
Is it planned to package transformers in Debian instead of using a
conda/mamba venv for this installation?
* It would be great to help with the Debian patch workflow.
- upstream status
- find upstream bug equivalent to a Debian bug
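For the last item, even before an LLM is involved, a client could shortlist candidate upstream bugs by fuzzy-matching titles and only hand the top hits to the model for a closer comparison. A hypothetical sketch using Python's difflib (the helper name, cutoff, and data shapes are made up for illustration):

```python
from difflib import SequenceMatcher

def shortlist_upstream(debian_title: str, upstream_titles: list[str],
                       cutoff: float = 0.5) -> list[str]:
    """Rank upstream bug titles by similarity to a Debian bug title.

    A cheap pre-filter: the top matches could then be passed to an LLM
    for a semantic comparison.
    """
    scored = [
        (SequenceMatcher(None, debian_title.lower(), t.lower()).ratio(), t)
        for t in upstream_titles
    ]
    # Highest similarity first; drop anything below the cutoff.
    return [t for score, t in sorted(scored, reverse=True) if score >= cutoff]
```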
Hi,
On 2024-01-03 00:07, M. Zhou wrote:
> Following what has been discussed in d-project in an offtopic
> subthread, I prepared some demo on imagined use cases to
> leverage LLMs to help debian development.
> https://salsa.debian.org/deeplearning-team/debgpt
I find this pretty impressive. Thanks a lot!
On 2024-01-02 17:07:57 -0500 (-0500), M. Zhou wrote:
[...]
> You can also tell me more ideas on how we can interact with LLM
> for debian-specific tasks. It is generally not difficult to
> implement. The difficulty stems from the hardware capacity, and
> hence the context length. Thus, the client
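The context-length constraint mentioned here is the main thing a client has to manage: a bug log or manpage fetched for a prompt may simply not fit. One crude client-side remedy is trimming the input to a token budget before sending it. This is only a sketch with an assumed ~4-chars-per-token heuristic, not debgpt's implementation:

```python
def fit_context(text: str, max_tokens: int, chars_per_token: int = 4) -> str:
    """Trim text to roughly fit a model's context window.

    Uses a crude chars-per-token heuristic; a real client would count
    tokens with the model's own tokenizer.
    """
    budget = max_tokens * chars_per_token
    if len(text) <= budget:
        return text
    # Keep the head of the document, where metadata usually lives,
    # and flag the truncation so the model is not misled.
    return text[:budget] + "\n[...truncated to fit context...]"
```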
Hi folks,
Following what has been discussed in d-project in an offtopic
subthread, I prepared some demo on imagined use cases to
leverage LLMs to help debian development.
https://salsa.debian.org/deeplearning-team/debgpt
To run the demo, the minimum requirement is a CUDA GPU with >6GB memory. You