Mixed reality Graph View integration

I still like printing out ideas, pasting them on good old index cards, and sorting them into arrangements, then snapping pictures so I can later return to Obsidian to link and organize the new ideas.

I even have a technique to tag and link cards just like in Obsidian. A camera-based plugin that automated this process in real time would be magical.

I am considering how the plugin might understand links, transclusions, tags, images, etc.
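As a rough illustration of what "understanding" a card might involve, here is a minimal sketch of how recognized text (e.g. from OCR) could be scanned for Obsidian-style links, transclusions, and tags. The regex patterns and the `parse_card` helper are my own assumptions, not part of any existing plugin:

```python
import re

# Hypothetical patterns based on Obsidian's markdown conventions.
TRANSCLUSION = re.compile(r"!\[\[([^\]]+)\]\]")    # ![[Note name]]
WIKILINK = re.compile(r"(?<!!)\[\[([^\]]+)\]\]")   # [[Note name]], not preceded by !
TAG = re.compile(r"(?<!\S)#([\w/-]+)")             # #tag or #nested/tag

def parse_card(text: str) -> dict:
    """Return the links, transclusions, and tags found on one card's text."""
    return {
        "transclusions": TRANSCLUSION.findall(text),
        "links": WIKILINK.findall(text),
        "tags": TAG.findall(text),
    }

card = "Idea: merge [[Zettelkasten]] with ![[Graph View]] #mixed-reality #ideas"
print(parse_card(card))
# → {'transclusions': ['Graph View'], 'links': ['Zettelkasten'],
#    'tags': ['mixed-reality', 'ideas']}
```

The hard part, of course, is not the parsing but reliably reading handwritten or printed cards from a camera feed in the first place.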

However, I am not so sure about the feasibility of recognizing notes and matching their physical arrangement to Graph View in real time. I’m also not sure this could be accomplished with links inside notes.

VR, unless done flawlessly, doesn’t seem too attractive in this realm, but I can imagine some wild AR implementations. To me, a big part is being able to keep working traditionally, in a quasi-disconnected state.

I have a feeling it is already being developed, and if not, I hope it will be now, because I would love it.

Add speech recognition as well; I usually avoid it, but I would temporarily enable it to use this technology.

Thank you for your consideration.


This idea sounds like it comes from another plane of existence :smiley: very interesting!