Prerequisites
Install WSL
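If WSL is not set up yet, it can usually be installed from an administrator PowerShell prompt on Windows 10/11 (this assumes a recent Windows build that supports the one-line installer):
wsl --install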
Install curl
sudo apt install curl
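If apt cannot find the package, refreshing the package index first usually helps:
sudo apt update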
Install Ollama via curl
curl https://ollama.ai/install.sh | sh
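After the script finishes, you can check that the binary is on your PATH (the --version flag should print the installed version):
ollama --version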
Run Ollama
In this example, we will run Mistral 7B.
If you want to try another model, you can pick one from the following site.
https://ollama.ai/library
ollama serve
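By default the server listens on localhost:11434, so a quick sanity check from another terminal should print "Ollama is running":
curl http://localhost:11434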
Open another terminal tab and run the following command. It will pull the Mistral model on the first run and then start an interactive prompt.
ollama run mistral
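If you chose a different model from the library, substitute its name. For example (llama2 is only an illustration; check the library page for the exact tag):
ollama run llama2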
If everything works properly, you will see output like the screenshot below.
My machine has an RTX 3070 GPU, so Ollama is using the GPU.
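If you want to confirm GPU usage yourself, nvidia-smi also works inside WSL, assuming the NVIDIA driver for WSL is installed on the Windows side:
nvidia-smi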
Terminate Ollama
If you want to exit the interactive prompt, type the following.
/exit
Then press Ctrl + C in the terminal where you ran ollama serve.
Top comments (3)
No need for WSL; Ollama runs natively on Windows beginning with v0.1.27.
jasonchuang.substack.com/p/ollama-...
Yeah, but I think using WSL makes everything easier for devs.
Good job! Thanks for the article. But I just can't control myself. Sorry...