Custom LLM connection with Text Generator plugin

:white_check_mark: Troubleshooting Custom LLM Connection with Obsidian’s AI Plugin (LM Studio + MythoMax)

Hi everyone,

I’m reaching out after several days of trying to get a custom local LLM integration working with Obsidian’s AI Plugin. My end goal is to use a local model (MythoMax 13B, running in LM Studio) as a conversational assistant within Obsidian, using the plugin’s custom endpoint support.


:wrench: My Setup:

  • Model: mythomax-13b (running locally in LM Studio, latest version)
  • Endpoint: http://127.0.0.1:1234/v1/chat/completions
  • Obsidian Plugin: Obsidian AI Plugin (installed and toggled on)
  • Plugin Config:
    • Endpoint set to Custom
    • Model name copied exactly
    • No API key required (left blank)
    • Tried multiple JSON body/header formats
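
For reference, here's the kind of sanity check I've been running outside Obsidian to confirm LM Studio itself responds. This is just a minimal sketch: the endpoint and model name come from my setup above, and the payload shape is the standard OpenAI-style chat-completions format that LM Studio's local server emulates (the helper names are mine, nothing plugin-specific):

```python
import json
from urllib import request

def build_payload(model: str, prompt: str) -> dict:
    # Standard OpenAI-style chat-completions body; LM Studio's
    # local server accepts this shape directly.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
        "stream": False,
    }

def ask(prompt: str,
        endpoint: str = "http://127.0.0.1:1234/v1/chat/completions") -> str:
    body = json.dumps(build_payload("mythomax-13b", prompt)).encode("utf-8")
    req = request.Request(
        endpoint,
        data=body,
        # No Authorization header: LM Studio needs no API key locally.
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        reply = json.loads(resp.read())
    return reply["choices"][0]["message"]["content"]

# Example (requires the LM Studio server to be running):
#   print(ask("Say hello in one short sentence."))
```

When I POST like this directly, LM Studio answers fine, which is why I suspect the problem is in how the plugin formats its request rather than in LM Studio itself.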

:test_tube: Symptoms:

  • Occasionally I get a TG error or JSON5 formatting errors, even when the body JSON appears valid
  • Other times, no error is thrown, but the model never responds (just spinning dots)
  • I’ve tried:
    • Resetting plugin config
    • Restarting Obsidian and LM Studio
    • Tweaking body/header values in different combinations
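
For context, one of the body formats I've tried is the plain OpenAI-style body below, which works when I send it to LM Studio directly with curl. These field names are the standard chat-completions ones, not anything I know to be Text-Generator-specific, so I may still be missing whatever template the plugin expects:

```json
{
  "model": "mythomax-13b",
  "messages": [
    { "role": "user", "content": "Hello!" }
  ],
  "temperature": 0.7,
  "stream": false
}
```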

:red_question_mark: My Questions:

  1. Has anyone successfully connected LM Studio (or a similar local LLM) to the Obsidian AI Plugin?
  2. Is there a body format or plugin configuration that is known to work reliably?
  3. Is there something I’m missing, perhaps in Advanced Mode, headers, or model registration?

Any insights, working examples, screenshots, or help would be greatly appreciated. This plugin is a big part of my creative workflow, so I’d love to get it working with my local model.

Thanks in advance!
– Glenn

There is no plugin named “AI” (the word “Obsidian” is removed from Obsidian plugin names). What is the actual name of the plugin you’re using?

Text Generator

I updated the post title to use the plugin’s name.

This topic was automatically closed 90 days after the last reply. New replies are no longer allowed.