🖥️ Use local LLM in MindMac
You can use local LLMs with MindMac via LM Studio, GPT4All, or Ollama+LiteLLM. Please check the videos below for more details.
You can now use Ollama without LiteLLM. Please see the Add Ollama endpoint page for details.
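These tools generally work with MindMac by serving models behind an OpenAI-compatible API on localhost. As a quick sanity check before adding the endpoint in MindMac, you can query the local server directly. Below is a minimal sketch, assuming LM Studio's default local server address (http://localhost:1234/v1) and a placeholder model name; adjust the base URL and model for GPT4All, Ollama, or a LiteLLM proxy.

```python
# Minimal sketch: verify a local OpenAI-compatible server before adding it to MindMac.
# Assumptions: LM Studio's local server is running at its default address
# (http://localhost:1234/v1) with a model loaded; the model name below is a placeholder.
import requests

BASE_URL = "http://localhost:1234/v1"   # e.g. http://localhost:11434/v1 for Ollama
MODEL = "local-model"                   # replace with the model identifier your server reports

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "temperature": 0.7,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If this request succeeds, the same base URL and model name are what you would enter when configuring the corresponding endpoint in MindMac.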