You can use local LLMs with MindMac via LM Studio, GPT4All, or Ollama (optionally through LiteLLM). Please see the videos below for more details.
How to use local LLMs in MindMac by using LM Studio?
How to use local LLMs with GPT4All & MindMac?
How to use Ollama with MindMac using LiteLLM?
Ollama can now be used with MindMac directly, without LiteLLM. Please visit Add Ollama endpoint.
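As a minimal sketch of what "using a local LLM" means under the hood: these tools expose an OpenAI-compatible HTTP server on your machine, and a client such as MindMac sends standard chat-completion requests to it. The ports below are the apps' usual defaults (LM Studio at `http://localhost:1234/v1`, Ollama at `http://localhost:11434`) but should be verified in each app's settings, and `local-model` is a hypothetical placeholder for whatever model you have loaded.

```python
import json
import urllib.request

# Assumed default endpoints (verify in each app's settings):
#   LM Studio's OpenAI-compatible server: http://localhost:1234/v1
#   Ollama's API server:                  http://localhost:11434
ENDPOINT = "http://localhost:1234/v1/chat/completions"

# OpenAI-style chat payload; "local-model" is a hypothetical
# placeholder for the model loaded in the local server.
payload = {
    "model": "local-model",
    "messages": [{"role": "user", "content": "Hello from a local LLM!"}],
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request) would send the request; it is left
# out here because it requires the local server to be running.
```

Because the request shape is the same OpenAI chat format, the only thing that changes between LM Studio, a LiteLLM proxy, or Ollama's compatible endpoint is the base URL you point the client at.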
Last updated 2 years ago