Hey everyone!
I’m happy to share Promptfire, a plugin I built to solve a problem that’s been bugging me for a while: every time I work with an AI assistant on something inside my vault, I end up re-explaining my folder structure, naming conventions, frontmatter setup, and tag system from scratch. It’s tedious, eats tokens, and the AI still gets things wrong half the time.
Promptfire fixes that. It lets you define your vault’s conventions in a dedicated folder, then copies everything — structure, rules, and selected content — to clipboard with a single hotkey. The output automatically adapts to different LLMs (Claude, GPT-4, Gemini, etc.) with per-model formatting.
## What it does
You run a command, Promptfire assembles your vault context, and you paste it into whatever AI you’re using. The AI then knows how your vault works — no more correcting link styles, tag formats, or folder hierarchies mid-conversation.
## Key Features

- Multi-LLM output targets — configure token limits, output formats (XML, Markdown, Plain), and truncation strategies per model
- Export profiles — one-click multi-format export, e.g. XML for Claude + Markdown for ChatGPT
- Smart context detection — auto-discovers related notes via links, tags, folder proximity, and shared properties
- Prompt templates — reusable templates with placeholders and conditionals
- Additional context sources — pull in freetext, external files, or even shell command output
- Frontmatter presets — configure context per note with an `ai-context` YAML key in frontmatter
- Context diff — export only what changed since the last export
- Snapshots — save and replay context recipes
- Granular section selection — pick only the headings you actually need
- Context history — diff, search, and one-click restore of previous exports
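To give a feel for the frontmatter presets, here is a hypothetical `ai-context` block. The specific field names below (`include-linked`, `max-depth`, `sections`) are illustrative assumptions, not the plugin's documented schema — check the docs in the repo for the real keys:

```yaml
---
title: API design notes
ai-context:
  include-linked: true   # hypothetical key: also pull in directly linked notes
  max-depth: 2           # hypothetical key: how far to follow the link graph
  sections:              # hypothetical key: restrict the export to these headings
    - "Conventions"
    - "Open questions"
---
```

The idea is that the note itself declares what context it needs, so the same hotkey does the right thing per note.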
## Quick Start

- `Ctrl+P` → “Promptfire: Generate context files” — sets up your convention files
- Settings → Promptfire → Output Targets → “Add Built-in Targets” — adds presets for Claude, ChatGPT, etc.
- `Ctrl+P` → “Promptfire: Copy context to clipboard” — done. Paste into your AI.
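If you are wondering what actually lands on your clipboard, here is a rough sketch of what an XML-formatted export for Claude could look like. The tag names and layout are my illustrative assumptions for this post, not the plugin's exact output:

```xml
<vault-context>
  <structure>
    Projects/    <!-- active work, one folder per project -->
    Reference/   <!-- evergreen notes -->
    Daily/       <!-- daily notes, YYYY-MM-DD.md -->
  </structure>
  <conventions>
    - Wiki-links ([[Note Name]]) only, never bare markdown links between notes
    - Tags are lowercase-kebab-case
  </conventions>
  <notes>
    <note path="Projects/example.md"><!-- selected note content here --></note>
  </notes>
</vault-context>
```

Markdown targets would carry the same information with headings and fenced sections instead of tags, which is why per-model formatting matters.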
## Why I built this
I use Obsidian as my main knowledge base and I frequently work with LLMs for writing, coding, and organizing notes. But the constant context-setting at the start of every conversation felt like a waste. I wanted something that would just package what the AI needs to know about my vault and let me get straight to work.
## Links

- Documentation: full docs are included in the repo under `/docs`
- License: 0BSD (do whatever you want with it)
I’d love to hear your feedback — especially from people who already use LLMs with their vaults. What conventions do you find yourself repeating most often? Are there export formats or context sources I’m missing?
Feel free to open an issue on GitHub or reply here. Thanks for checking it out!