Basically, the title sums up my problem. Here's the longer version…
What I’m trying to do
I’m trying to “connect” LLMs running locally in LM Studio to Obsidian, to assist me with note-taking.
Things I have tried
All official and unofficial instructions I could find. They’re clear when it comes to OpenAI’s options, but they get somewhat convoluted when describing how to set up a local LLM with LM Studio.
- I’ve installed LM Studio, and everything works as it should.
- I’ve been using Obsidian for ages, and it works as it should.
- I’ve installed both plugins (Text Generator and BMO Chat).
- I’ve enabled LM Studio’s server feature after making sure the CORS option was switched on. It seems to start correctly, and its log lists three URLs: http://localhost:1234/v1/models, http://localhost:1234/v1/chat/completions, and http://localhost:1234/v1/completions. (A direct check of the server is sketched just after this list.)
- In each plugin (one at a time), I’ve gone through the options, switched the LLM provider to Custom, and entered one of those URLs in the REST API URL / Endpoint / whatever-it’s-called textbox. Since at least one of the plugins states that it appends the default “/chat/completions” part itself, I’ve also tried http://localhost:1234/v1/ and http://localhost:1234/v1.
- I’ve checked my firewall to make sure it isn’t blocking communication between Obsidian and LM Studio. I don’t believe it is, since nothing related to Obsidian or LM Studio showed up in its log (which was actually empty while I was setting this up).
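
In case it helps with diagnosis: this is a minimal sketch of a direct request to the server, bypassing Obsidian entirely (assuming the default port 1234). If this fails too, the problem would be on the LM Studio side rather than in the plugins.

```python
# Sanity check: can anything on this machine reach the LM Studio server?
# Assumes LM Studio's server is running on its default port, 1234.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:1234/v1/models") as resp:
    models = json.load(resp)

# Should print the model(s) currently loaded in LM Studio.
print(json.dumps(models, indent=2))
```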
Despite all that… no go. I see others happily describing how easy and awesome this is for them, but for me, each of those plugins throws an error about REST API communication, “failed to fetch” (…a response, I suppose), and so on.
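
For reference, here’s a minimal sketch of the kind of request those plugins should be sending under the hood, using the full chat-completions URL from above (assuming the default port and a model already loaded; the model name below is a placeholder, since with a single loaded model LM Studio typically serves whatever is loaded regardless of the name). If this works from a terminal while the plugins still fail, the issue would seem to be inside Obsidian or the plugins rather than the server.

```python
# Roughly what a plugin's "chat completions" request should look like.
# Assumes LM Studio's server on the default port 1234 with a model loaded.
import json
import urllib.request

payload = {
    "model": "local-model",  # placeholder; LM Studio serves the loaded model
    "messages": [{"role": "user", "content": "Say hello in five words."}],
    "temperature": 0.7,
}

req = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

# The assistant's reply, in the standard OpenAI response shape.
print(reply["choices"][0]["message"]["content"])
```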
So… Help, anyone? I’d appreciate it if anyone who’s set up Obsidian with LM Studio successfully could lend a helping hand.