
Add Ollama endpoint


Last updated 1 year ago

Prerequisite

  • Ollama version: 0.1.19 or later

  • Running at least one model under Ollama
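You can verify both prerequisites before configuring MindMac. Below is a minimal sketch that queries Ollama's `/api/tags` model-listing endpoint at its default address; the helper name is my own, and it assumes Ollama is listening on the default port 11434:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default address

def list_ollama_models(base_url: str = OLLAMA_URL):
    """Return the names of locally available Ollama models,
    or None if Ollama is not reachable (not installed or not running)."""
    try:
        with urllib.request.urlopen(base_url + "/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except OSError:  # covers connection refused, timeouts, URL errors
        return None

if __name__ == "__main__":
    models = list_ollama_models()
    if models is None:
        print("Ollama is not running at", OLLAMA_URL)
    elif not models:
        print("Ollama is running, but no models are pulled yet")
    else:
        print("Available models:", ", ".join(models))
```

An empty list means Ollama is up but no model has been pulled yet, so the second prerequisite is not met.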

Add endpoint

Go to MindMac -> Settings... -> Account or press ⌘ + , to open Account Settings.

  • Select Ollama as provider

  • Enter a name

  • Enter any random text in the API Key field (Ollama does not use an API key, but the field cannot be left empty)

  • Click Save to finish
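Once saved, you can sanity-check the endpoint independently of MindMac. The sketch below builds a non-streaming request in the shape Ollama's `/api/chat` API expects, assuming the default port; the function names are my own:

```python
import json
import urllib.request

OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"  # default Ollama chat endpoint

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming chat request for Ollama's /api/chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single JSON response instead of a stream
    }
    return urllib.request.Request(
        OLLAMA_CHAT_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def chat_once(model: str, prompt: str):
    """Send one chat message; returns the reply text, or None if unreachable."""
    try:
        with urllib.request.urlopen(build_chat_request(model, prompt), timeout=60) as resp:
            return json.load(resp)["message"]["content"]
    except OSError:
        return None
```

If `chat_once("<your-model>", "Hello")` returns text, MindMac should be able to reach the same endpoint with the settings above.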


Customize Context Length (Optional)

Currently, Ollama does not report an exact context length for each model, so if you want to control the max tokens (context length) parameter, you need to enter the value manually for every Ollama model. Follow the steps below to do so.

Go to MindMac -> Settings... -> Account or press ⌘ + , to open Account Settings.

Click on the brain icon to show Ollama model list.

Click on the edit button.

Enter the max tokens value for that model and click Save to finish.

If you don't change any Ollama configuration, just use http://localhost:11434/api/chat as the URL.

To get the max tokens value for a model, you can look in that model's configuration file (config.json, for example) on Hugging Face. For instance, with the model Mixtral-8x7B-Instruct-v0.1, you can go to https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/blob/main/config.json and grab the value of max_position_embeddings as its context length. Kindly be aware that there is no standard specifying which key determines the context length of a model in the model configuration file.
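Since there is no standard key, a practical approach is to probe a handful of names commonly seen in config.json files. The helper below is hypothetical and its key list is a best-effort assumption, not exhaustive:

```python
# Key names commonly used for context length in Hugging Face config.json files.
CONTEXT_KEYS = (
    "max_position_embeddings",  # e.g. Mixtral, Llama-family configs
    "n_positions",              # e.g. GPT-2-style configs
    "n_ctx",
    "max_sequence_length",
)

def context_length_from_config(config: dict):
    """Return the first context-length-like integer found in a parsed
    config.json dict, or None if no known key is present."""
    for key in CONTEXT_KEYS:
        value = config.get(key)
        if isinstance(value, int):
            return value
    return None

# Example: Mixtral-8x7B-Instruct-v0.1's config.json contains
# "max_position_embeddings": 32768, so this returns 32768.
print(context_length_from_config({"max_position_embeddings": 32768}))
```

If the helper returns None, open the config.json by hand and look for any field whose name suggests a maximum sequence or position count.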

👩‍đŸĢ
đŸĻ™
http://localhost:11434/api/chat
Mixtral-8x7B-Instruct-v0.1
https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/blob/main/config.json