Serge - LLaMa Made Easy 🦙: A Self-Hosted Chat Interface for Alpaca Models
Serge is a chat interface that makes running Alpaca models easy. Built on llama.cpp, it is entirely self-hosted, so no API keys are needed. The best part? It runs in as little as 4GB of RAM, entirely on the CPU.
Under the hood, Serge pairs a SvelteKit frontend with MongoDB for storing chat history and parameters, and a FastAPI + beanie layer that wraps calls to llama.cpp, making it a complete package for anyone who wants to run Alpaca models easily.
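To make the "chat history and parameters in MongoDB" idea concrete, here is a minimal sketch of what a stored chat session might look like. It uses stdlib dataclasses rather than Serge's actual beanie document models, and every field name here is an illustrative assumption, not Serge's real schema:

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class ChatParameters:
    # Illustrative generation settings persisted per chat;
    # names and defaults are assumptions, not Serge's schema.
    model: str = "7B"
    temperature: float = 0.1
    max_length: int = 256

@dataclass
class Chat:
    # One chat session: its parameters plus the running message history.
    params: ChatParameters = field(default_factory=ChatParameters)
    history: list[str] = field(default_factory=list)
    created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

chat = Chat()
chat.history.append("Hello, llama!")
# asdict() flattens the nested dataclasses into the kind of plain
# dict/JSON structure a document database like MongoDB stores.
print(asdict(chat)["params"]["model"])
```

In the real app, beanie's `Document` classes play this role and handle persistence to MongoDB asynchronously; the sketch only shows the shape of the data.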
Getting Started with Serge
Setting up Serge is a breeze, and running it with Alpaca 7B is a simple four-step process:
- Clone the 'serge' repository and navigate to the directory:
git clone https://github.com/nsarrazin/serge.git && cd serge
- Copy the '.env.sample' file to '.env':
cp .env.sample .env
- Start the Docker containers:
docker compose up -d
- Download the tokenizer and the Alpaca 7B model:
docker compose exec api python3 /usr/src/app/utils/download.py tokenizer 7B
And that's it! Head over to http://localhost:8008/ to start using Serge.
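Since `docker compose up -d` returns immediately, the containers may still be starting when you first open the browser. A small helper like the one below (not part of Serge; the URL and port come from the steps above) can confirm the UI is actually answering:

```python
import urllib.error
import urllib.request

def serge_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if a web server answers with a success/redirect status at `url`.

    A convenience check for verifying the containers came up;
    any connection failure or timeout simply yields False.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

# After `docker compose up -d`, Serge's UI should respond here:
# serge_up("http://localhost:8008/")
```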
Supported Models
Currently, Serge supports only the Alpaca 7B, 13B, and 30B models. A download script inside the container fetches these weights for you; if you already have compatible weights from another project, you can copy them into the 'serge_weights' volume using 'docker cp'.
Support and What's Next
If you need help setting up Serge, feel free to join the Discord community at https://discord.gg/62Hc6FEYQH.
The developers of Serge have exciting plans for the future, including user profiles and authentication, additional prompt options, LangChain integration with a custom LLM, and support for other LLaMA models and quantization levels.
Serge is a fantastic tool for running Alpaca models. Its simple setup and self-hosted design give you complete control over your models, and with a friendly community on hand for support, it is an excellent starting point for anyone curious about Alpaca.