We prefer simple explanations.
There is a good reason for that. The information we encounter far exceeds our capacity to process it, so we rely on simplified frameworks to understand the world.
“We found consistent evidence for computations we refer to as ‘simplicity’, where attention is deployed to as few dimensions of information as possible during learning, and ‘competition’, where dimensions compete for selective attention via lateral inhibition.” (Galdo et al., 2022)
It’s a pretty good system. Occam’s razor says that when considering multiple explanations for the same phenomenon, the one which requires the fewest assumptions is most likely to be true (Duignan).
Simple explanations do not sufficiently represent complex issues.
There seem to be countless divisive issues today: abortion rights, the Israel-Hamas conflict, transgender women in sports, immigration rates, and climate change, to name a few. However confident you are in your personal stance on these issues, they remain highly debated.
In talking to people who hold opposing views on some of these issues, I often find that each side relies on a different subset of information, which allows them to simplify the issue and take a dogmatic stance.
The pro-choice side advocates for the health of the mother while criticizing the choices of the father; the pro-life side advocates for the health of the baby while criticizing the choices of the mother.
I am sure both would agree that the health of both the mother and the baby is important. No woman takes the decision to abort lightly, and no one would ever take the side of a rapist.
Discussion is impossible when both sides take a simplified view.
Personalized algorithms strengthen simplified stances.
We know the algorithm shapes itself to our preferences, but the influence runs both ways: by curating content, it shapes our preferences in turn. Through a technique known as ‘algorithmic nudging’, our natural biases are exploited to further the goal of the algorithm (Schmauder et al., 2023). In the context of short-form media, that goal is to keep us on the app.
“Nudge is a popular public policy tool that harnesses well-known biases in human judgement to subtly guide people’s decisions, often to improve their choices or to achieve some socially desirable outcome.” (Schmauder et al., 2023)
“Facebook’s feed exemplifies an algorithmic nudge as Facebook’s algorithms empower these feeds to function as choice architects for the users.” (Ahmad and Chauhan, 2023)
A few human biases, referred to as “cognitive illusions”, are well understood. These predictable patterns of behaviour are exploited by media algorithms to capture and maintain users’ attention.
“They are illusions of perception, judgement, or memory that distort our understanding of reality and sometimes make our thinking deviate from the normative principles of logic and also prudence. Importantly, the differences between reality and how we construe it are often systematic and, therefore, predictable. These “cognitive illusions” often occur involuntarily and are difficult to avoid.” (Schmauder et al., 2023)
It follows that short-form media algorithms would use nudging to simplify users’ views, making those users more predictable. Add to that the activation of the amygdala, which lends emotional significance to the views portrayed, and you have the perfect storm for the dogmatic views we see today.
Short-form media is a toxic environment for discussing social issues.
On August 1, 2023, Meta blocked news content from its Facebook and Instagram platforms in Canada (Mar). As always, there was backlash criticizing Meta for denying Canadians access to verified news platforms (Mar). But access was not denied; Canadians simply have to seek out news from news platforms directly. Is that such a bad thing?
Consider what we know about how information propagates through short-form media. Simplifying complex issues and emotionally charging opinions, by steering the brain towards emotion-forward processing, will inherently skew the facts. Meta was right: short-form media is the wrong platform for news.
How do we embrace the complexity of relevant issues?
With open-mindedness and humility.
When was the last time you admitted you were wrong?
Do you seek out evidence against your own beliefs?
When did you last change your mind?
How do you react when someone challenges your opinion?
References
Ahmad, Norita, and Preeti Chauhan. “Algorithmic Nudge: An Approach to Designing Human-Centered Generative Artificial Intelligence.” Computer, vol. 56, no. 8, 2023. IEEE Xplore, https://ieeexplore.ieee.org/document/10206062.
Duignan, Brian. “Occam’s razor | Origin, Examples, & Facts.” Britannica, https://www.britannica.com/topic/Occams-razor. Accessed 19 March 2026.
Galdo, Matthew, et al. “The Quest for Simplicity in Human Learning: Identifying the Constraints on Attention.” Cognitive Psychology, vol. 138, article 101508, 2022, https://pmc.ncbi.nlm.nih.gov/articles/PMC10324982/.
Mar, Leon. “Statement on the blocking of news content on Facebook and Instagram in Canada.” CBC Radio-Canada, 1 August 2023, https://cbc.radio-canada.ca/en/media-centre/blocking-of-news-content-on-facebook-and-instagram-canada. Accessed 19 March 2026.
Schmauder, Christian, et al. “Algorithmic Nudging: The Need for an Interdisciplinary Oversight.” Topoi, vol. 42, 2023, pp. 799-807, https://link.springer.com/article/10.1007/s11245-023-09907-4.