Using LM Studio
Learn how to set up and configure LM Studio as a local AI backend for Arcania
You can find the full LM Studio documentation here.
Installation Steps
- Install LM Studio: Navigate to the LM Studio website and follow the instructions to install the GUI.
- Download a model: Using the GUI, download a model from the LM Studio library. If you don't know which to pick, try TheBloke/openchat_3.5.gguf, version openchat_3.5.Q5_K_M.gguf.
- Start the server: On the left side of the GUI, click the "Local Server" button. Then, in the dropdown at the top of the screen, select the model you downloaded. Next, in the Server Options pane, ensure that Cross-Origin Resource Sharing (CORS) is enabled. Finally, click "Start Server". You can verify the server is reachable with the snippet after these steps.
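Once the server is running, it exposes an OpenAI-compatible HTTP API. As a quick sanity check, the sketch below asks the server which model is loaded. It assumes the server is listening on port 8080, matching the configuration below; adjust the port if yours differs.

```python
import json
import urllib.request

# Port 8080 matches the URL used in the Configuration section below;
# change it if you picked a different port in the Server Options pane.
MODELS_URL = "http://localhost:8080/v1/models"

# LM Studio's local server exposes an OpenAI-compatible /v1/models endpoint,
# which lists the model(s) currently loaded.
with urllib.request.urlopen(MODELS_URL) as response:
    models = json.load(response)

for model in models.get("data", []):
    print(model["id"])
```

If this prints the name of the model you downloaded, the server is up and ready for Arcania to connect to.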
Configuration
- Select ChatGPT as the backend: settings -> ChatBot -> ChatBot Backend -> ChatGPT
- Configure ChatGPT settings: settings -> ChatBot -> ChatGPT. Set OpenAI URL to http://localhost:8080 and OpenAI Key to default. If you changed the port in the LM Studio GUI, use that port instead of 8080. The sketch after this list shows what these two settings correspond to on the wire.
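For reference, here is a minimal sketch of the kind of OpenAI-style chat completion request these settings point at. It is not Arcania's own code; the URL and key mirror the values above, and the "model" field is a placeholder, since the LM Studio server answers with whichever model is loaded in the GUI.

```python
import json
import urllib.request

# These mirror the settings above: a local base URL and the placeholder
# key "default" (the local server does not check API keys).
BASE_URL = "http://localhost:8080"
API_KEY = "default"

payload = {
    # Placeholder name; LM Studio serves the model currently loaded in the GUI.
    "model": "local-model",
    "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    "temperature": 0.7,
}

request = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
)

with urllib.request.urlopen(request) as response:
    reply = json.load(response)

print(reply["choices"][0]["message"]["content"])
```

If this returns a reply from your local model, Arcania's ChatGPT backend should work with the same URL and key.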