Using KoboldCpp
Learn how to set up and configure KoboldCpp as a local AI backend for Arcania
You can find the full KoboldCpp documentation in the KoboldCpp GitHub repository: https://github.com/LostRuins/koboldcpp.
Installation Steps
Example shell commands for each step are sketched after this list.
- Clone the repo.
- Download the model: For example, we will use the OpenChat 3.5 model, which is what is used on the demo instance. There are many models to choose from. Navigate to TheBloke/openchat_3.5-GGUF and download one of the models, such as `openchat_3.5.Q5_K_M.gguf`. Place this file inside the `./models` directory.
- Build KoboldCpp.
- Run the server.
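To clone the repo, the commands below assume the upstream KoboldCpp repository on GitHub:

```bash
# Clone the KoboldCpp repository and enter it
git clone https://github.com/LostRuins/koboldcpp.git
cd koboldcpp
```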
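To download the model, you can fetch the GGUF file directly instead of using the browser; the URL below follows the standard Hugging Face "resolve" download pattern for TheBloke/openchat_3.5-GGUF and is given as an illustration:

```bash
# Download the quantized OpenChat 3.5 model into ./models
# (direct download URL assumed from the Hugging Face resolve pattern)
mkdir -p models
wget -O models/openchat_3.5.Q5_K_M.gguf \
  https://huggingface.co/TheBloke/openchat_3.5-GGUF/resolve/main/openchat_3.5.Q5_K_M.gguf
```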
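To build KoboldCpp, a plain `make` produces the CPU-only build on Linux and macOS; GPU and BLAS acceleration use additional make flags that vary between KoboldCpp versions, so check the KoboldCpp README for the flags that match your release and hardware:

```bash
# CPU-only build; see the KoboldCpp README for GPU/BLAS build flags for your version
make
```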
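To run the server, point KoboldCpp at the downloaded model. It listens on port 5001 by default; the flags below are a minimal sketch, and `python koboldcpp.py --help` lists the full set:

```bash
# Start KoboldCpp with the downloaded model on the default port (5001)
python koboldcpp.py --model ./models/openchat_3.5.Q5_K_M.gguf --port 5001
```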
Configuration
- Select KoboldCpp as the backend.
- Configure KoboldCpp to point at your running server (a quick way to verify the endpoint is sketched after this list).
- Enable extra features: Inside "Use KoboldCpp", ensure that "Use Extra" is enabled. This will allow you to use the extra features of KoboldCpp, such as streaming.
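Once KoboldCpp is running, you can check that the endpoint Arcania will connect to is reachable and that the extra API used for streaming responds. The paths below are KoboldCpp's standard API routes, shown as an assumed sketch against the default port:

```bash
# Confirm the KoboldCpp API is reachable (default endpoint http://localhost:5001)
curl http://localhost:5001/api/v1/model

# "Use Extra" relies on KoboldCpp's extra API; the version endpoint is a quick check
curl http://localhost:5001/api/extra/version
```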