What if I disagree

What do you do when you come across a claim you disagree with or are sceptical about? Do you still convert it into an atomic note, or are atomic notes exclusively for things you agree with?

Disagreeing is even more important than agreeing because it helps define differences.
So quote the note, and write your opinion, with reasons.


Makes sense; I guess otherwise you would create an echo chamber of sorts. But what if you are exploring a new topic and find two conflicting arguments? Do you create a note for both of them?

Each argument will be an atomic note (maybe more than one).
Your evaluations and opinions will be separate atomic notes.

As you concatenate, they become less atomic.

The real value of this structure comes in the future when you see more arguments and facts and have more opinions of your own.


I try to ensure that both the opinions I agree and disagree with have similar keywords or tags to make it easier to surface when researching such topics in the future.


When I come across an idea I disagree with, usually because of a deep understanding of the subject, I normally don’t add it at all.

I am only taking in ideas which I believe are useful to me.

It’s a different story when you take in a new idea which conflicts with an old idea, usually either because you’ve forgotten about the old idea or you’re more convinced by the new idea.

Then you would want to do some serious reassessment of which idea is the correct interpretation.

Darwin’s Golden Rule of thinking:

I had, also, during many years, followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from memory than favorable ones.

If you come across a claim you disagree with or are skeptical of, and you ignore it, then you are embracing confirmation bias.

Make a note recording your thoughts about it and link it to your topic, so that whenever you revisit the topic you will be linked to the alternate hypothesis.


So, do you use tags to distinguish between atomic notes created from others’ ideas and the ones you came up with yourself?

For example, when researching censorship vs freedom of speech, there are multiple arguments. According to my understanding, this is what I would do now. Curious whether I can improve the workflow anywhere:

  1. Research the topic and create atomic notes on all relevant arguments on both sides
  2. Create an atomic note recording my conclusions, with backlinks to my supporting arguments and conflicting arguments, giving reasons for each
  3. Later, if I change my mind, I can come back to the atomic note created in step 2 and review the validity of the supporting claims

I think you would be interested in the breadcrumbs plugin.
I have similar needs, where I have to distinguish between which are the supporting arguments and which are the refuting arguments. I do this by creating a hierarchy with the labels supports, supported by, refutes, etc. in the Breadcrumbs plugin.
You might find this thread that I created interesting, where I explain how I use the Breadcrumbs plugin for this purpose - Breadcrumbs plugin: does anyone have a workflow where they visualise chains of notes related by heirarchy in juggl graph
There is also a mega thread that asks for having link types in general - Add support for link types - you may find good ideas here as well.
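To make that concrete, here is a minimal sketch of what a note set up this way might look like. The field names `supports` and `refutes` are whatever you define as a hierarchy in the Breadcrumbs plugin’s settings (not fixed names), and the note titles are made up for illustration:

```markdown
---
supports: "[[Free expression strengthens democratic accountability]]"
refutes: "[[Censorship has no chilling effect on speech]]"
---

# Anonymous speech enables whistleblowing

The argument and its evidence go here, kept atomic.
```

As mentioned below, Breadcrumbs then automatically infers the inverse connection (supported by / refuted by) on the target notes, so you only declare each relationship once.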

Are the links created in the Breadcrumbs plugin viewable in the graph view?

Not in Obsidian’s native graph view. Instead there is a plugin called Juggl: the Breadcrumbs plugin has an option called juggl view that, provided the Juggl plugin is installed, uses Juggl to show the links in a graph view. That view is quite buggy at present, in my opinion. You might instead prefer to use the Juggl plugin alone, which has its own way to implement link types. But I prefer Breadcrumbs because it automatically creates a connection back from the note that is linked to (if note A is UP of B, then B is DOWN of A).
Check out the GitHub wiki page of the Breadcrumbs plugin and also the documentation site (juggl.io) of the Juggl plugin. Things are explained well there.

That seems interesting; I will check it out. Meanwhile, this is what I came up with:

Notes Types

:seedling: - Add tags, references, and find preliminary relations
:four_leaf_clover: - Needs further research to be fleshed out
:evergreen_tree: - The highest-order note

Dealing with conflicting ideas

  • If it conflicts with a :evergreen_tree:
    • Create another note refuting it, acknowledging its grains of truth
    • If you cannot refute it, demote the :evergreen_tree: to a :four_leaf_clover:
    • Create a note for the conflicting idea
    • Do more research
  • If it is a new topic
    • Create notes for both arguments
    • Do more research
    • After deciding which argument is true, promote it
    • Create a note refuting the conflicting idea

Interesting thread. It has reinforced my interest in the breadcrumbs plugin.

@kannan - building on your summary workflow note, I thought I’d throw in a reflection that occurs to me about the role and nature of “doubt” in our thought process.

Developmental psychologist Robert Kegan studies adult development, and how the brain’s capacity for thought develops in stages. Essentially, learning is framed as a process of “complexification”, which, whilst a horrid word, captures the essence of how we develop the capacity for increasingly sophisticated thought.

He describes that our capacity for thought develops in stages, or in other words there are periods of slower and faster growth… and the slower periods of growth are like a sort of equilibrium or stability, when our thoughts are more ‘fixed’ (in the Carol Dweck sense of fixed versus growth mindset).

During this ‘fixed’ period of mindset, we can think of our frames of analysis/thought as “holding” our thought “subject-to” those frames, which we are unable to see. At some point, the frame itself that we have been thinking with starts to become visible, and the frame itself becomes the “object of” our thought, so that we can manipulate it and are no longer “held by” it. Kegan calls this dynamic his Subject-Object Theory, and it’s the dynamic that underpins all his ideas about how our capacity for thought develops.

The process of moving between fixed mindset stages is characterised by the experience of “doubt”. In this context, doubt relates to our way of knowing. And fundamentally, our way of knowing is the same thing as our way of being. So doubt surfaces in our thoughts whenever we’re in transition.

The relevance here is that doubt is defined as holding some idea as simultaneously valid and invalid. This reinforces the value of the ‘for’ and ‘against’ processing of thought that others have talked about, and that you reflect in your workflow. The only way to “hold” our doubts, is to penetrate the duality that sustains it.

What I think this suggests is the need to be self-aware about the quality of our experience of thinking. Confusion, I would suggest, is the FEELING that you have doubts. And we all know that sticking with the struggle of confusion is an important, if frustrating and messy, aspect of making progress. When we “disagree”, it is often accompanied by emotions that can even be tied up with our sense of identity - that is, the positions we have adopted on a topic or situation “hold us”. And we can become invested in those positions.

Moreover, neuroscientist Mary Helen Immordino-Yang says that the early stages of learning are guided by emotion, not intellect.

One study she did is known as the Iowa Gambling Task, in which participants play a game involving 4 decks of cards. Unknown to participants, two decks are rigged such that they usually yield good winnings but occasionally yield a big loss.

What the study reveals is that as players reach for cards from the rigged decks, the first signs of learning are subtle feelings or unconscious signs of anxiety. They might sweat a little. They might hesitate as they reach for a card from one of these rigged decks.

They’re not conscious of this anxiety at first, it’s a sense or feeling. Only later, Immordino-Yang explains, do players start to use their cognition to formulate a theory of what’s going on. In other words, initially people “feel their way” and only later do they start to process cognitively and form a theory. Her radical assertion is that ALL LEARNING is navigated emotionally. People who lose the emotional centres of their brains, can no longer USE the knowledge they have. Quite literally, they can explain what they know but they do not BEHAVE in accordance with what they know when faced with real problems: they can’t navigate, because they can’t feel their way.

I realise this is subtle and perhaps a bit abstract, but what she advocates for is that we pay attention to our emotions when we learn. Her contention is that emotional thought is fully 80% of all the brain’s thought, including all of decision-making. In other words, emotions are the brain’s guidance system for navigating everything we do. We follow people we admire, for example: admiration is what she calls an intellectual emotion, along with other subtle emotions like interest and respect.

Going back to Kegan’s work, doubt manifests in thought in our use of language in our self talk. Doubt surfaces as thoughts of “not knowing” or confusion that leads to “questions”.

So this is all to say - in a long-winded way, sorry for that - that we should embrace our doubts, and formulate them into questions.

Hopefully I’m not teaching grandmothers how to suck eggs…


Off topic, I just realized that emojis are a cute way to distinguish between structure notes and other types of notes. I hadn’t thought about it before. Thanks!

Lol ya, while I was looking at different note-taking apps, I saw how Notion users’ notes looked so beautiful with emojis, while mine was a soulless wall of text :joy:

This is very insightful. One thought that came up while reading this: what are the trade-offs of exploring every doubt you have vs trusting yourself? I think the two extremes would be person A, who is never sure of himself and constantly reads new material, hence can’t decide which stance to take, and person B, who is so sure he is right that he never even bothers to look at the other side.

Also, I would appreciate hearing what your workflow is for handling conflicting ideas.

Good question. I think this is really a matter of prioritising the number of lines of enquiry that you pursue, at any one time, as the means of limiting the exploration of doubts. Once you commit to a line of enquiry, you have no real option but to follow the thinking where it takes you.

In terms of processing, a doubt presents us with two options: suppress it, or explore it. If we suppress it, it remains unresolved for now. So the problem is knowing which doubts will trip up your research, and invalidate your end result.

I like Elon Musk’s way of approaching this, which he calls “thinking from first principles” or axiomatic thinking. An axiom is a self-evident, unquestionable truth - the most fundamental principles on which you must proceed, or your solution will fail.

It was explored most clearly in his interview on Lex Fridman’s podcast (a recent interview in the last couple of months; he’s had Musk on the show a few times). Axiomatic thinking is a way to approach a problem or question by sticking to the smallest possible well-defined number of focus areas: or simply put, knowing what you MUST think clearly about.

Musk used the example of building the best rocket engine ever. His team at SpaceX had to solve this problem at three levels: designing for maximum thrust, making the lightest engine possible (which clearly impacts the power-to-weight ratio and thus thrust) and making it as cheaply as possible. So his production example of axiomatic thinking was that the minimum cost of production is always the cost of raw materials plus the cost of intellectual property rights. It’s not possible to make something like a rocket engine any cheaper than that, and you seldom reach that minimum cost unless you can reach production volumes that yield economies of scale.

Thinking within these parameters, therefore, if you arrive at a minimum possible design weight of the combustion chamber using available alloys/construction materials, in such a way that it still meets your design axioms (based on laws of physics, ie relating to thermodynamic behaviour of gases), then the only way to make it cheaper or lighter (or both) will be to innovate to find new alloys or materials or new methods of construction. So your ways forward, in pursuing solutions, are very clear when the axioms are clear.

I’ve tested this way of thinking recently on a project I’m working on. It’s not rocket science in my case! But I’ve been working on a set of problem solving tools that are designed to facilitate people facing specific types of problems. The research for the project could have been so open-ended that I’d never finish. But by thinking about the axioms the project should be based on, I was able to contain the research to the key success factors for the project. I started with more “candidate” axioms than I ended up with finally, so I had to do some initial research to decide where to place my focus, based on my desired end product’s features and benefits.

A final thought. If you haven’t read Henri Bergson’s booklet “An Introduction to Metaphysics”, I’d thoroughly recommend it for an exploration of intuition versus analysis. In short, he concludes that intuition can inform analysis, but analysis can never lead to intuition. In other words, he contends that the scientific method itself - usually portrayed as analytical and evidence-based “truth” - is in reality always the product first of intuition: ie the testing of a hypothesis, which is never arrived at via analysis.

By extension, every hypothesis is riddled with doubt. Most hypotheses turn out to be wrong. Therefore, we tend to “imagine” that our best thinking is analytical, when in truth it starts with intuition.

Bergson’s paper on metaphysics was written after all his famous books on things like the duration of time. He referred many times to “intuition” in his earlier writing, but never explained his working method. Until the metaphysics paper, that is, which unpacks his thinking approach to intuition.

He’s probably best known for his writings about time as “duration” - that is, we never experience clock time, since all measurement is by definition backward-looking (ie measurement of past events). If someone asks you what time it is, you can only answer what time it was a moment ago! You could only experience clock time by freezing time at a “point” in time. Therefore, our REAL experience of time is the experience of the “flow of time”; in other words, we experience “the present” as a flow of time through the present. He called this duration.

Intuition, Bergson argued, is a durational phenomenon, whilst analysis is a reductionist activity comprising the breaking-down of problems into component parts, just as we might break time into chunks of seconds and minutes that in reality don’t exist other than as duration. Intuition is not fragmentary, but a sort of understanding of the whole. When we have an insight, it comes from intuition. That intuition has access to the whole of our past experience and knowledge.

What I’m suggesting, is that we explore our doubts guided by intuition not analysis. Intuition is how we navigate connections.

The cognitive scientist Douglas Hofstadter refers to the way the brain thinks as “analogical thinking” - ie we think by analogy. Our analogies exist in nested sets of entailed analogies, that ultimately add up to everything we’ve experienced and known in our personal history. Analogical thinking is based on likeness, which itself is recognised in perception as familiarity. Familiarity, if you reflect on your own experience, is an intuitive sense of knowing (cognitive scientists might use the term “schema”, as in your intuitive grasp of “up” or “down” can apply to anything, irrespective of subject matter).

All this is to say that doubt is experienced as “not knowing”, or as having no clear “analogy”. Analogical thinking is some of the fastest, least conscious thinking we do… the basis of intuition, possibly? Anyway, how far you take your doubts is really a matter of how certain you need to feel about what you know. In Musk’s rocket case, lives are at stake, so certainty needs to be as high as possible. And even then, doubts will remain due to the complexity of the projects and the impossibility of testing for all scenarios and conditions.

In summary, the formulation of the question or problem drives the lines of enquiry. The doubts then demand to be pursued until they are all satisfied, or judged inconsequential.


A final thought, as I didn’t address your scenario of two people A and B… (I got carried away, as I do…)

Person A (never sure of himself) needs to reflect on whether he doesn’t know enough or lacks self-esteem. Both might result in feelings of insecurity or hedging behaviour when discussing ideas with others. Introverts might experience another subtly different problem: they like to think reflectively and process their thoughts internally before they feel comfortable expressing them. Women, research shows, also tend to doubt themselves more in professional settings, unless coached and mentored, or managed with sensitivity by (mostly male) management.

So we have to distinguish between our behaviour, and our thought. It’s not always easy to do so, since we can become gripped by our emotions.

In other words, self awareness is required to locate the cause of doubts. Doubts that are caused by lack of knowledge are the genuine type of ‘not knowing’. Doubts caused by fear of looking stupid, fear of making mistakes or other anxieties are not doubts about our “thoughts” so much as doubts about the “outcome” of a situation. Often caused by past bad experiences in social situations, these doubts would be present regardless of the state of knowledge.

Over-confidence in one’s knowledge, dare I say it, gets us into Dunning-Kruger territory, where smarter people realise they know less the more they know… and less smart people tend to become overconfident… as verywellmind.com puts it:

Dunning and Kruger suggest that this phenomenon stems from what they refer to as a “dual burden.” People are not only incompetent; their incompetence robs them of the mental ability to realize just how inept they are. Incompetent people tend to: Overestimate their own skill levels.

Of course that’s the Person B extreme. Even if we’re not operating at the limit of our capacity to learn and think, we can become over-confident in what we know. Familiarity can breed complacency, thinking that we recognise a situation or problem for what it is when in reality our conclusions are invalid due to inattention to details or plain lazy assumption.

For expert learners or people with well-trained minds (ie well educated, whether traditionally ‘schooled’ or self-taught), I tend to think it’s safe to go with intuition and follow those doubts.


Never frame the title of a note as a negative; instead, provide a positive statement that stands as a conflicting opinion. For example, take the claim: “Technology leads to dehumanisation.”

Instead of creating a note “Technology does not lead to dehumanisation”, you should provide an alternative viewpoint that contradicts it, such as “Technology allows connection between humans regardless of geography” or something like that.

You can then create a synthesis note as an overarching research question, or what some might call a MOC: “Technology and dehumanisation”. Then dump all your ideas within this workbench note, link to these two notes above, discuss how they conflict, categorise them, etc.
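A minimal sketch of what such a synthesis/MOC workbench note might look like, using the two example notes above (the section headings are just one possible layout, not a fixed convention):

```markdown
# Technology and dehumanisation

Overarching research question: does technology dehumanise us?

## Conflicting viewpoints
- [[Technology leads to dehumanisation]]
- [[Technology allows connection between humans regardless of geography]]

## Discussion
The first note treats mediated contact as a loss of human presence; the
second treats it as an extension of it. My current leaning, with reasons,
goes here and gets revised as new arguments arrive.
```

Because both argument notes are linked from the MOC, backlinks and the graph view will surface the whole cluster whenever you revisit the question.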


I have been looking into the MOC idea for a while; I can’t figure out how to do it properly, though.

But I do like your idea. For example, if the question is “should meat be banned?”, there are a lot of nuances and good arguments on both sides, so I guess I would create a MOC with links to the different arguments.

Are there any resources for creating MOCs other than Nick Milo’s paid course? I am a student and can’t afford it.
