Hello!
 
I know of two such efforts:
1. Petals (by the BigScience workshop) hosts large language models for inference and fine-tuning on shared infrastructure.
You can run inference either in the browser (https://chat.petals.dev/) or in a Colab notebook (https://colab.research.google.com/drive/1uCphNY7gfAUkdDrTx21dZZwCOUDCMPw8?usp=sharing — an example for a 70B model)
 
2. Together Computer (together.ai) builds shared infrastructure for training models;
they recently trained several 1-7B-parameter models that way (see https://together.ai/models)
 
This is (likely) not an exhaustive list. If you want more details, both of these projects have Discord communities, so you can ask around there to learn about similar efforts.
 
Best,
Lena Wolf
 
24.10.2023, 22:39, "Amanda Stent via Corpora" <corpora@list.elra.info>:
So, everyone wants to host their own (copy of a) large language model (LLM), but many academic institutions can't spin up multiple LLMs simultaneously, in perpetuity, nor do I believe the scientific funding agencies in each country would want to pay for everyone to get a GPU cluster just to host 500+ copies of tomorrow's version of LLaMA-2(ish).
 
Are you aware of any effort proposing or planning to host LLMs for use by researchers on some shared infrastructure? After all, hosting an LLM costs the same per hour whether 1, 3, or 20 people are calling it, and at most academic institutions usage would be somewhat bursty.
 
Best,
Amanda Stent
 
--
(they/she)
Director, Davis Institute for AI
Professor, Computer Science
Colby College
 
Follow the Davis Institute for AI here
_______________________________________________
Corpora mailing list -- corpora@list.elra.info
https://list.elra.info/mailman3/postorius/lists/corpora.list.elra.info/
To unsubscribe send an email to corpora-le...@list.elra.info
