
Use local LLM in MindMac

You can use local LLMs with MindMac via LMStudio, GPT4All, or Ollama + LiteLLM. Please check the videos below for more details.

LMStudio

Video: How to use local LLMs in MindMac by using LMStudio?
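
The video walks through the UI setup. If you want to confirm that LMStudio's local server is reachable before adding it to MindMac, here is a minimal sketch, assuming LMStudio's Local Server is running on its default port 1234 and that the openai Python package is installed (the server is OpenAI-compatible, so any placeholder API key works):

```python
# Minimal check against LMStudio's OpenAI-compatible local server.
# Assumes the default port 1234; change base_url if you configured another port.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="local-model",  # LMStudio serves whichever model is currently loaded
    messages=[{"role": "user", "content": "Say hello from a local LLM."}],
)
print(response.choices[0].message.content)
```

If this prints a reply, MindMac can use the same base URL; the video shows where to enter it in the app.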

GPT4All

Video: How to use local LLMs with GPT4All & MindMac?
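
GPT4All can also expose an OpenAI-compatible local API once you enable its API server in the application settings. A minimal sketch for confirming the server responds, assuming GPT4All's commonly used default port 4891 (verify the port in your GPT4All settings):

```python
# List the models GPT4All's local API server is exposing.
# Assumes the "Enable API server" option is on and the default port 4891.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4891/v1", api_key="gpt4all")

for model in client.models.list().data:
    print(model.id)  # model names you can pick once the endpoint is added in MindMac
```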

Ollama+LiteLLM

Video: How to use Ollama with MindMac using LiteLLM?
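
LiteLLM acts as an OpenAI-compatible proxy in front of Ollama, so MindMac can talk to it like any other OpenAI-style endpoint. Below is a minimal sketch of a request through the proxy, assuming you started it with a command like litellm --model ollama/llama2 and that it listens on port 4000 (older LiteLLM releases defaulted to 8000; use whatever port LiteLLM prints at startup, and the same address as the endpoint in MindMac):

```python
# Send a chat request through a LiteLLM proxy that forwards to a local Ollama model.
# Assumes the proxy was started with `litellm --model ollama/llama2` on port 4000.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:4000", api_key="anything")

response = client.chat.completions.create(
    model="ollama/llama2",  # the model alias the proxy was started with
    messages=[{"role": "user", "content": "Hello from Ollama via LiteLLM"}],
)
print(response.choices[0].message.content)
```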

Now you can use Ollama directly, without LiteLLM. Please visit Add Ollama endpoint for the setup steps.
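
If you go the LiteLLM-free route, it can help to confirm the Ollama server itself is running before adding the endpoint in MindMac. A minimal sketch, assuming Ollama's default port 11434; the /api/tags route lists the models you have pulled locally:

```python
# Quick check that a local Ollama server is up, and see which models it has.
# Assumes Ollama's default address http://localhost:11434.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])  # e.g. llama2:latest, mistral:latest
```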

👩‍đŸĢ
đŸ–Ĩī¸
LMStudio
GPT4All
Ollama
LiteLLM