Last updated 1 year ago
You can use local LLMs with MindMac via LM Studio, GPT4All, or Ollama+LiteLLM. Please check the videos below for more details.
- How to use local LLMs in MindMac with LM Studio
- How to use local LLMs in MindMac with GPT4All
- How to use Ollama with MindMac via LiteLLM
Ollama can now be used directly, without LiteLLM. Please visit Add Ollama endpoint.
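Under the hood, these tools expose an OpenAI-compatible HTTP server on localhost that MindMac (or any client) can call. A minimal sketch of such a request, assuming the tools' documented default ports (LM Studio on 1234, Ollama on 11434) and a placeholder model name:

```python
import json
import urllib.request

# Default local endpoints (assumptions based on each tool's documented defaults):
#   LM Studio server:  http://localhost:1234/v1
#   Ollama:            http://localhost:11434/v1
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request aimed at a local server.

    The model name is a placeholder: LM Studio serves whichever model is
    loaded, while Ollama expects the name of a pulled model (e.g. "llama3").
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# To actually send the request, a local server must be running:
# with urllib.request.urlopen(build_chat_request("Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI API shape, the same request works against LM Studio or Ollama just by changing `BASE_URL`.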