# LLM Shortcut Plugin
`LLM Shortcut` maps your prompt library folder to command palette entries, then runs the selected prompt against the active note using any OpenAI-compatible provider.
## Why this plugin
If you keep reusing prompts (“improve writing”, “translate”, “make this a bullet-list”), copy-paste gets tedious quickly.
This plugin lets you:
- keep prompts as plain `.md` files in your vault;
- organize them in folders;
- run them like native Obsidian commands.
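Because prompts are plain Markdown files, a prompt library is just a folder in your vault. For example (folder and file names here are illustrative, not required by the plugin), a layout like this would yield one command per file, with nested folders preserved:

```
Prompts/
├── improve-writing.md
├── translate.md
└── formatting/
    └── make-bullet-list.md
```

Each file’s body is the prompt text itself, so editing a prompt is just editing a note.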

> The demo below uses OpenRouter’s `google/gemini-3-flash-preview`.
## Prerequisites
Bring your own LLM provider and API key.
Your keys stay on your machine and are never shared.
## Features
- Use your own OpenAI-compatible providers (OpenAI, OpenRouter, and others with compatible endpoints)
- Prompt files become commands automatically (including nested folders)
- Streaming output directly into the editor selection/cursor
- Custom prompt command for one-off prompts without creating a file
- Local-first behavior: your prompt files stay in your vault
- Advanced features, such as info mode, which streams the response into a separate modal window for question-style use cases (see the “Advanced Prompt Features” section for more)
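“OpenAI-compatible” here means any provider exposing the standard `/v1/chat/completions` endpoint. As a rough sketch of what such a request looks like (the function and field pairing below are illustrative assumptions, not the plugin’s actual code), the prompt file typically becomes the system message and the active note or selection becomes the user message, with `stream: true` enabling token-by-token output:

```typescript
interface ChatMessage {
  role: "system" | "user";
  content: string;
}

// Illustrative helper: build the JSON body for a streaming
// OpenAI-compatible /v1/chat/completions request.
function buildChatRequest(promptText: string, noteText: string, model: string) {
  const messages: ChatMessage[] = [
    { role: "system", content: promptText }, // contents of the prompt .md file
    { role: "user", content: noteText },     // the active note or selection
  ];
  return { model, messages, stream: true };  // stream tokens as they arrive
}

const body = buildChatRequest(
  "Improve the writing.",
  "Sme txt to fix.",
  "google/gemini-3-flash-preview"
);
console.log(JSON.stringify(body, null, 2));
```

Any provider that accepts this request shape (OpenAI, OpenRouter, a local server, etc.) should work; only the base URL, model name, and API key differ.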
The plugin is open-source and MIT-licensed; you can review the code on GitHub. Feedback and bug reports via issues are very welcome. Thanks!
