Are plugin download stats actually misleading?

Okay, this might be forum suicide, but I need to get this off my chest and see if I’m crazy or if others have noticed this too.

TL;DR: I think our download stats fundamentally misrepresent plugin popularity, and I’m curious if anyone else sees this as a problem.

Here’s what triggered this thought

I was looking at my own plugin stats recently (Auto Keyword Linker - only been around for 3 months), and something felt… off. The public download count shows 4,113, which sounds decent. But when I dug into the actual numbers:

  • Latest version (3.0.5): 417 downloads
  • Version from a few weeks ago (3.0.1): 1,186 downloads
  • Older versions: scattered hundreds across the remaining releases

So that 4,113 is just… everything added up since I launched. Forever. It only goes up.

Why I think this is a problem

For users: You can’t tell if a plugin is actively used right now vs was popular in 2022 and abandoned. A plugin showing 5,000 downloads might have 10 installs on its current version, but you’d never know.

For new plugins: We’re competing against cumulative totals from plugins that have been around for years. A plugin with 200 downloads on a release from last week is arguably healthier than one with 3,000 cumulative downloads that hasn’t been updated in 18 months.

For developers: I genuinely can’t tell if my plugin is growing or stagnating. Did my latest release land well? Are people upgrading? The aggregate number just… increases.

What I think would be better

Show downloads for the latest release as the primary metric. Keep the all-time total if you want, but as secondary info.

Like:

Latest release: 417 downloads (v3.0.5)
All-time: 4,113 downloads

Or even better: downloads in the last 30/90 days.
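To make the proposal concrete, here’s a rough sketch of how the two numbers could be derived from per-version download counts. The dict shape is an assumption (a hypothetical per-version stats feed, not any official API); the version keys and counts are just my own plugin’s numbers from above:

```python
# Hypothetical per-version stats record; field names are assumptions.
stats = {
    "downloads": 4113,  # cumulative total across all versions since launch
    "3.0.5": 417,       # latest release
    "3.0.1": 1186,
    # ...older versions omitted
}

def latest_release_downloads(stats: dict) -> tuple[str, int]:
    """Return (version, downloads) for the highest semantic version key."""
    versions = [k for k in stats if k[0].isdigit()]
    latest = max(versions, key=lambda v: tuple(int(p) for p in v.split(".")))
    return latest, stats[latest]

version, count = latest_release_downloads(stats)
print(f"Latest release: {count} downloads (v{version})")   # the primary metric
print(f"All-time: {stats['downloads']} downloads")         # secondary info
```

Nothing fancy, just a different roll-up of data that (presumably) already exists per version.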

But maybe I’m wrong?

I’ll fully admit: my plugin is new (3 months old), so maybe I’m just salty that established plugins have bigger numbers than me. That’s entirely possible and I’m open to that criticism.

I’m also aware this could hurt some developers who’ve put in years of work building up those totals, and I don’t want to dismiss that.

But… doesn’t the current system make it really hard to know what’s actually popular and maintained today? Every other ecosystem I can think of (npm, Chrome Web Store, VS Code) shows recent/current usage, not lifetime cumulative installs.

Genuinely curious

Am I off-base here? Is there something I’m missing about why cumulative is better? Or do others see the same issues?

I’m posting this partly to gauge reaction before I consider suggesting anything formally. If everyone thinks I’m being ridiculous, I’ll know to drop it. But if this resonates with others, maybe it’s worth discussing?

(Full transparency: Yes, my plugin would show smaller numbers under this system. I’m still arguing for it because I think it would give users better information. But I acknowledge my bias here.)


Yes, the download number is an imperfect measure of popularity. You might think of it as a combined measure of popularity and longevity (and release frequency). I’m not sure how much showing downloads for only the latest release would help. Every time a plugin updates, its download count would drop to 0. A popular but unmaintained plugin would keep growing its count. Counting the release before the current one still wouldn’t reflect the number of people who abandon an unmaintained plugin. In that case the count won’t grow, but it also won’t go down. But maybe counting this way would still be an improvement.

Another factor is that plugins don’t automatically update in the app, and until recently users didn’t know there were new versions unless they checked manually.

But users have other signals of how maintained a plugin is, too, like the release date of its current version and whether its GitHub has a bunch of unaddressed issues.

I mostly pay attention to numbers at the lower end, as a signal of whether a plugin has any real usage. Past some point, exactly how big the number is doesn’t matter to me.

I removed "Controversial take: " from the title because it’s kind of clickbait-y (and it’s not controversial that the download count is a rough tool).


I have been curious about the numbers, so thanks for explaining this.
If you turn this into a feature request I’d add my +1


You make really good points, and I think you’ve identified exactly why I’m conflicted about this.

The “drop to zero on update” issue is a killer argument I hadn’t fully thought through. You’re right that a plugin updating frequently would constantly reset, which penalises good maintenance behaviour.

And yeah, the automatic update point is huge. I honestly hadn’t considered that users might still be on old versions, not because the plugin is abandoned, but because they haven’t manually updated. However, I see in the release notes that updates are now checked every three days. But this still needs to be toggled on by the user, yes?

  • New setting: Community Plugins › Automatically check for plugin updates. Obsidian will check for plugin updates in the background every 3 days, or after the app updates.

That completely changes how I should interpret my own stats (those 1,186 downloads on v3.0.1 might just be people who haven’t checked for updates yet, not people avoiding v3.0.5).

I think what I’m actually frustrated with is the lack of a recency signal in the displayed number. As you said, popularity + longevity + release frequency all smooshed together into one number.

For established plugins with years of history, that cumulative total becomes less and less meaningful as a signal of current health.

Maybe the real solution isn’t changing what we count, but adding complementary signals? Something like:

  • Keep the cumulative total (it does show legitimacy/track record)

  • Add “downloads in last 90 days” or similar as secondary info

  • Surface the release date more prominently (as you mentioned - this is already there, it’s just not on the main list of plugins)

That way, you get both the “this plugin has been trusted by thousands” signal AND the “people are actively using this now” signal, without the reset-to-zero problem.

I appreciate the title edit too - you’re right that it was unnecessarily clickbait-y. This is genuinely just me trying to understand whether there’s a better way to think about these metrics, not trying to stir up drama.

Thanks for the thoughtful response - it’s helped me realise my frustration is more about missing context than the actual number being wrong.

Done and done - FR created.