
Using Ollama

Learn how to set up and configure Ollama as a local AI backend for Arcania

You can find the full Ollama documentation on the Ollama website.

Installation Steps

Linux and WSL2

curl https://ollama.ai/install.sh | sh
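
Once the script finishes, you can confirm the ollama binary is on your PATH by checking its version:

ollama --version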

macOS

Download the macOS app from the Ollama website.

Windows

Not yet supported natively; Windows users can run Ollama under WSL2 by following the Linux instructions above.

Setup Process

  1. Start the server (you can confirm it is reachable with the check shown after this list):

    ollama serve
  2. Download a model: for this example we will use Mistral 7B. Many other models are listed in the Ollama library. Note that ollama run downloads the model on first use and then opens an interactive prompt; a quick API test is shown after this list.

    ollama run mistral
  3. Enable the server in the client:

    settings -> ChatBot -> ChatBot Backend -> Ollama
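
To verify the server from step 1 is running, send a plain GET request to it; by default Ollama listens on port 11434 and should reply with a short status message such as "Ollama is running":

curl http://localhost:11434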
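Once the model from step 2 has downloaded, you can exercise it outside the client through Ollama's REST API. This minimal sketch sends a single non-streaming request to the /api/generate endpoint; the prompt text is just an example:

curl http://localhost:11434/api/generate -d '{"model": "mistral", "prompt": "Why is the sky blue?", "stream": false}'

If this returns a JSON response containing the model's answer, the backend is ready for the client settings in step 3.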