Troubleshooting Custom LLM Connection with Obsidian’s AI Plugin (LM Studio + MythoMax)
Hi everyone,
I’m reaching out after several days of trying to get a custom local LLM integration working with Obsidian’s AI Plugin. My end goal is to use a local model (MythoMax 13B, running in LM Studio) as a conversational assistant within Obsidian, using the plugin’s custom endpoint support.
My Setup:
- Model: mythomax-13b (running locally in LM Studio, latest version)
- Endpoint: http://127.0.0.1:1234/v1/chat/completions
- Obsidian Plugin: Obsidian AI Plugin (installed and enabled)
- Plugin Config:
- Endpoint set to Custom
- Model name copied exactly
- No API key required (left blank)
- Tried multiple JSON body/header formats
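For reference, here is the kind of minimal OpenAI-style chat completions body I'd expect LM Studio's local server to accept (a sketch based on the OpenAI API format LM Studio emulates; the `model` value should match whatever identifier LM Studio shows for the loaded model):

```json
{
  "model": "mythomax-13b",
  "messages": [
    { "role": "user", "content": "Hello" }
  ],
  "temperature": 0.7
}
```

The request should be a POST with a `Content-Type: application/json` header; since LM Studio doesn't require an API key, no `Authorization` header should be needed.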
Symptoms:
- Occasionally I get cryptic errors or JSON5 formatting errors, even when the request body looks like valid JSON
- Other times, no error is thrown, but I receive no response from the model (just spinning dots)
- I’ve tried:
- Resetting plugin config
- Restarting Obsidian and LM Studio
- Tweaking body/header values in different combinations
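To rule the plugin in or out, one diagnostic that seems worth doing is hitting the endpoint directly from a short script outside Obsidian, to confirm LM Studio itself responds. A minimal sketch using only the Python standard library (the URL and model name are from my setup above; this is for diagnosis, not a verified plugin config):

```python
import json
import urllib.request

URL = "http://127.0.0.1:1234/v1/chat/completions"

# Minimal OpenAI-style chat completions payload; the model id must
# match what LM Studio reports for the loaded model.
payload = {
    "model": "mythomax-13b",
    "messages": [{"role": "user", "content": "Say hello."}],
    "temperature": 0.7,
}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

try:
    with urllib.request.urlopen(req, timeout=60) as resp:
        reply = json.loads(resp.read())
        # OpenAI-compatible servers return choices[0].message.content
        print(reply["choices"][0]["message"]["content"])
except OSError as exc:
    # Covers connection refused, timeouts, and HTTP errors
    print(f"request failed: {exc}")
```

If this script gets a reply but Obsidian still shows spinning dots, the problem is on the plugin side (body/header format or streaming handling); if it fails too, the issue is with LM Studio's server settings.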
My Questions:
- Has anyone successfully connected LM Studio (or a similar local LLM) to the Obsidian AI Plugin?
- Is there a known working body format or plugin configuration that works reliably?
- Is there something I’m missing—perhaps in Advanced Mode, headers, or model registration?
Any insights, working examples, screenshots, or help would be greatly appreciated. This plugin is a big part of my creative workflow, so I’d love to get it working with my local model.
Thanks in advance!
– Glenn