I discovered a really nice plugin that shows semantically similar notes, and I’d like to share it here. It also has a semantic search command. It wasn’t created by me, but I’m very satisfied with it, and I think it’s worth a look if you’re searching for something similar to Smart Connections.
Yeah, I know, everyone tries to shove AI into everything, even where it’s not needed, and the whole AI thing is starting to generate mixed feelings. Still, I’ll take my chances and say this plugin isn’t quite what you might think (I hope I’m not wrong or speculating in the wrong direction).
This one just uses a local model that creates embeddings of your notes in a vector space (oversimplified: it converts words into numbers and arranges them in vectors), then runs a similarity search over those vectors to find similar notes. Everything runs locally and is saved in a local DB; nothing leaves your machine if you don’t want it to. That’s what attracted my attention, since it’s one of the reasons I’m using Obsidian in the first place: to not have my stuff passed through god-knows-what cloud services. That’s my understanding from the plugin description and what I could look into in the code; if I’m wrong, then mea culpa.
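If it helps, the general technique can be sketched in a few lines of Python. The vectors below are made-up stand-ins for what a real embedding model would produce from note text, and this is not the plugin’s actual code, just the idea of similarity search over embeddings:

```python
import math

# Toy "embeddings": in reality a local model turns each note's text
# into a vector with hundreds of dimensions; these 3-dim vectors
# are invented purely for illustration.
notes = {
    "gardening tips":   [0.9, 0.1, 0.0],
    "growing tomatoes": [0.8, 0.2, 0.1],
    "tax deadlines":    [0.0, 0.1, 0.9],
}

def cosine_similarity(a, b):
    # Cosine similarity: 1.0 means same direction (very similar),
    # values near 0 mean unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(title, notes):
    # Rank every other note by similarity to the given one.
    query = notes[title]
    return sorted(
        ((other, cosine_similarity(query, vec))
         for other, vec in notes.items() if other != title),
        key=lambda pair: pair[1],
        reverse=True,
    )

print(most_similar("gardening tips", notes))
```

With these toy vectors, “growing tomatoes” ranks closest to “gardening tips”, while “tax deadlines” scores near zero, which is the same ranking idea the plugin applies across a whole vault.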
Overall, creating text embeddings and using them to find similar content is nothing new; the technique was (and is) used in natural language processing and data science even before LLMs were a thing. The author used the correct term, as this does technically fall under the AI umbrella, like LLMs, ML and other things.
The local model that it uses:
The author did implement the option to use another embedding provider, like Claude or Gemini, but again, there’s the local model, which, from what I have seen with my notes, works pretty well.
If you’re curious what the whole word embedding thing does:
If you’re not interested at all in anything stamped AI, even when it’s unrelated to the current LLM frenzy, it’s understandable that you’d want to skip this. I just wanted to offer some details that might help explain how it works and what this plugin is useful for.
Thanks for clarifying.
I do like the idea of a local LLM.
I’m trying to deploy one for Ambient Scribe but I haven’t got there yet.
And my Docker box has no GPU!