# Use local LLM in MindMac

You can use local LLMs with MindMac via [LMStudio](https://lmstudio.ai/), [GPT4All](https://gpt4all.io/index.html) or [Ollama](https://ollama.ai/)+[~~LiteLLM~~](https://github.com/BerriAI/litellm). Please check the videos below for more details.

* [How to use local LLMs in MindMac by using LMStudio?](https://www.youtube.com/watch?v=3KcVp5QQ1Ak)
* [How to use local LLMs with GPT4All & MindMac?](https://www.youtube.com/watch?v=4a6cIzDjh30)
* [~~How to use Ollama with MindMac using LiteLLM?~~](https://www.youtube.com/watch?v=bZfV70YMuH0&t=1s)

### LMStudio

{% embed url="https://www.youtube.com/watch?v=3KcVp5QQ1Ak" %}

### GPT4All

{% embed url="https://www.youtube.com/watch?v=4a6cIzDjh30" %}

### Ollama+~~LiteLLM~~

{% embed url="https://www.youtube.com/watch?t=1s&v=bZfV70YMuH0" %}

Ollama is now supported directly, without LiteLLM. Please visit [add-ollama-endpoint](https://docs.mindmac.app/how-to.../add-ollama-endpoint "mention") for setup instructions.
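As a quick orientation, each of these tools exposes a local HTTP server that MindMac can be pointed at. The base URLs below are the tools' default ports on a standard install; they are assumptions (your port may differ if you changed the server settings), and the exact endpoint field in MindMac may vary by version:

```
# Default local server base URLs (assumed defaults, verify in each app)
Ollama:    http://localhost:11434
LM Studio: http://localhost:1234/v1   # start the server from LM Studio's "Local Server" tab
GPT4All:   http://localhost:4891/v1   # enable the API server in GPT4All settings
```

If a connection fails, confirm the corresponding app's local server is actually running before adjusting anything in MindMac.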
