This content originally appeared on DEV Community and was authored by Aniket Vaishnav
Here's a quick guide on how to set up Ollama 3 on your local machine.
But before moving forward, please make sure you have at least 8 to 12 GB of RAM.
TL;DR
Get the summary in seconds :: spin up a ChatGPT-like LLM, Ollama, on your laptop within 5 minutes.
Important links:
| Official Homepage | https://ollama.com/ |
| --- | --- |
| Ollama GitHub page | https://github.com/ollama/ |
| Ollama Docker Hub | https://hub.docker.com/r/ollama/ollama |
| Ollama Model Library | https://ollama.com/library |
| Ollama C++ Inference | https://github.com/ggerganov/llama.cpp |
| Ollama Python Lib | https://github.com/ollama/ollama-python |
| Ollama JS Lib | https://github.com/ollama/ollama-js |
Let's begin.
Go to the official Docker Hub repository of Ollama:
Official Docker Hub :: https://hub.docker.com/r/ollama/ollama
Pull the image with:
docker pull ollama/ollama
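Once the pull finishes, you can confirm the image is available locally (a quick sanity check, not part of the original steps):

# List the locally available Ollama image to confirm the pull succeeded
docker images ollama/ollama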
If you want to run it in CPU-only mode, then run:
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
After this, you are done!
The container will now serve Ollama in CPU-only mode.
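As a quick test (assuming the container is named ollama as in the command above), you can open an interactive chat with a model inside the container; llama3 here is just one model name from the library:

# Pull and chat with Llama 3 inside the running container
docker exec -it ollama ollama run llama3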
Leverage GPU?
But if you want to leverage your GPU as well, stick around for these extra steps. First, add the NVIDIA Container Toolkit apt repository and refresh apt:
curl -fsSL https://nvidia.github.io/libnvidia-container/gpgkey \
| sudo gpg --dearmor -o /usr/share/keyrings/nvidia-container-toolkit-keyring.gpg
curl -s -L https://nvidia.github.io/libnvidia-container/stable/deb/nvidia-container-toolkit.list \
| sed 's#deb https://#deb [signed-by=/usr/share/keyrings/nvidia-container-toolkit-keyring.gpg] https://#g' \
| sudo tee /etc/apt/sources.list.d/nvidia-container-toolkit.list
sudo apt-get update
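Optionally, you can confirm that apt now sees the toolkit package from the newly added repository (just a sanity check):

# Show the candidate version of the NVIDIA Container Toolkit package
apt-cache policy nvidia-container-toolkit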
Install the NVIDIA Container Toolkit
sudo apt-get install -y nvidia-container-toolkit
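You can verify the installation; the nvidia-ctk CLI ships with the toolkit:

# Print the installed NVIDIA Container Toolkit CLI version
nvidia-ctk --version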
Configure Docker to use the NVIDIA runtime, then restart Docker
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
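Before launching Ollama, you can optionally verify that containers can now see your GPU. The ubuntu image here is just an example base image; nvidia-smi is injected into the container by the toolkit:

# Run nvidia-smi inside a throwaway container; it should list your GPU(s)
docker run --rm --gpus=all ubuntu nvidia-smi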
Done. Now start the container with GPU access:
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
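With the container up (CPU or GPU mode), Ollama listens on port 11434, so you can also talk to it over its REST API. For example, assuming the container is named ollama and using llama3 as the model:

# Pull the model first, then ask it a question over the REST API
docker exec -it ollama ollama pull llama3
curl http://localhost:11434/api/generate -d '{ "model": "llama3", "prompt": "Why is the sky blue?" }'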
Also, do not forget to try more models from the Ollama library.
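Swapping in another model is just a different model name; phi3 and mistral below are examples, so check the library page for the current list:

# Any model from https://ollama.com/library works the same way
docker exec -it ollama ollama run phi3
docker exec -it ollama ollama run mistral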