Digital boosts
Boosting citizens' competences to deal with the online world

Online environments are crucial to most aspects of life—yet the digital world is replete with smart, highly adaptive choice architectures that are designed to maximize commercial interests. Online landscapes geared toward capturing users’ attention, monetizing user data, and predicting and influencing future behavior put society at risk by reducing human autonomy, heightening incivility online, and facilitating political extremism and the spread of disinformation. There are two boosting approaches to tackle these challenges.
Boosting cognitive competences in online environments
One way to address these challenges is through a behavioral/cognitive approach that empowers users and fosters digital competences.
Types of digital boosts targeting cognitive competences: Overview
Based on: Kozyreva, A., Lewandowsky, S., & Hertwig, R. (2020). Citizens versus the internet: Confronting digital challenges with cognitive tools. Psychological Science in the Public Interest, 21, 103–156. https://doi.org/10.1177/1529100620946707
Lateral reading and simple fact-checking rules
Lateral reading is a simple heuristic for online fact-checking: When the source is unfamiliar, leave the page and verify the author/organization and their claims somewhere else (e.g., using search engines or Wikipedia).
For simple fact-checking rules laid out in a fast-and-frugal decision tree, see here.
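To make the idea of a fast-and-frugal decision tree concrete, here is a minimal sketch in Python. The specific questions, their order, and the exit messages are illustrative assumptions, not the exact tree referenced above; a real fact-checking tree would use empirically validated cues.

```python
# Minimal sketch of a fast-and-frugal decision tree for quick fact-checking.
# Cues are checked in a fixed order; the first "risky" answer exits immediately.
# The cues, their order, and the messages are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class Claim:
    source_is_known: bool         # Do you recognize the source or organization?
    corroborated_elsewhere: bool  # Do independent outlets report the same claim?
    evidence_is_cited: bool       # Does the page link to verifiable evidence?


def fact_check(claim: Claim) -> str:
    """Walk the cues in order and exit at the first decisive answer."""
    if not claim.source_is_known:
        return "Read laterally: leave the page and look up the source first."
    if not claim.corroborated_elsewhere:
        return "Be skeptical: no independent corroboration found."
    if not claim.evidence_is_cited:
        return "Treat with caution: the claim cites no verifiable evidence."
    return "Low risk: the claim passes these quick checks (not a guarantee of truth)."


# Example: an unfamiliar source triggers the lateral-reading exit.
print(fact_check(Claim(source_is_known=False,
                       corroborated_elsewhere=True,
                       evidence_is_cited=True)))
```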
Inoculation
Inoculation is a preemptive intervention that boosts people’s cognitive resistance to misinformation and online manipulation by exposing them to a weakened form of disinformation and/or common strategies used to manipulate people’s beliefs (see these examples of inoculation boosts).
Self-nudging
Self-nudging refers to self-imposed interventions in one’s digital choice architectures. The goal of changing one’s environment is to enhance self-governance and to reduce distractions (see these examples of self-nudges for the online world).
🙈 Critical ignoring
Critical ignoring is a type of deliberate ignorance. It involves controlling one’s information environment by filtering and blocking out information (e.g., emails, news feeds, instant messages) in order to reduce exposure to false and low-quality information.
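The filtering part of critical ignoring can also be supported technically, for example by a self-imposed blocklist applied to a feed. The sketch below uses hypothetical domains and feed items; it only illustrates the principle of blocking out low-quality sources before they compete for attention.

```python
# Minimal sketch of a self-imposed feed filter supporting critical ignoring.
# Items from domains the user has decided to ignore are dropped before they
# reach the reading list. Domains and feed items are hypothetical examples.

blocked_domains = {"clickbait.example", "outrage.example"}

feed = [
    {"title": "Local council publishes its budget", "domain": "citynews.example"},
    {"title": "You won't BELIEVE this one trick", "domain": "clickbait.example"},
    {"title": "New study on sleep and memory", "domain": "research.example"},
]


def critically_ignore(items, blocked):
    """Keep only items whose domain is not on the user's personal blocklist."""
    return [item for item in items if item["domain"] not in blocked]


for item in critically_ignore(feed, blocked_domains):
    print(item["title"])
```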
Examples of digital boosts targeting cognitive competences
On July 2, 2022, the Max Planck Institute for Human Development in Berlin took part in the Long Night of the Sciences 2022. We, the Science of Boosting group, organized an interactive “digital boosting toolbox” that helps people deal with misinformation. Even if you were not at the event, you can still explore the digital boosting toolbox here on the website.
Boosting via digital environments
Another way to address these challenges is by taking an environmental approach: designing digital architectures that encourage people to exercise existing competences or learn and apply new ones. This includes enabling people to do their own fact-checking, to promote truth, and to interact constructively in democratic discourse online.
Types of boosts via the digital environment: Overview
Based on: Lorenz-Spreen, P., Lewandowsky, S., Sunstein, C. R., & Hertwig, R. (2020). How behavioural sciences can promote truth, autonomy and democratic discourse online. Nature Human Behaviour, 4, 1102–1109. https://doi.org/10.1038/s41562-020-0889-7
Leveraging epistemic cues
Boosts using epistemic cues in the environment are aimed at getting people into the routine of checking the quality of information they see online. These boosts can take the form of highlights on sources and cross-links (e.g., as found on Wikipedia) or pop-ups showing fast-and-frugal decision trees.
Algorithmic transparency
These boosts use features of the online environment to help people understand algorithmic decisions. For example, allowing people to customize their own news feed can help them understand and exert control over the algorithm that selects what they are shown.
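One way to implement such customization is a ranking function whose weights the user can inspect and adjust. The sketch below is an assumption about how a transparent, user-controlled feed might be built; it does not describe any real platform’s algorithm, and the scoring criteria and sample posts are invented for illustration.

```python
# Sketch of a user-customizable news-feed ranking. Because the user sets the
# weights, the criteria behind the ordering are visible and adjustable rather
# than hidden. Scoring criteria and sample posts are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Post:
    title: str
    recency: float          # 0..1, newer = higher
    friend_activity: float  # 0..1, share of contacts who engaged with the post
    source_quality: float   # 0..1, e.g., derived from a trusted-source list


# Weights chosen (and changeable) by the user, not hidden inside the platform.
user_weights = {"recency": 0.2, "friend_activity": 0.3, "source_quality": 0.5}


def score(post: Post, weights: dict) -> float:
    return (weights["recency"] * post.recency
            + weights["friend_activity"] * post.friend_activity
            + weights["source_quality"] * post.source_quality)


posts = [
    Post("Breaking rumor", recency=0.9, friend_activity=0.8, source_quality=0.2),
    Post("In-depth report", recency=0.4, friend_activity=0.3, source_quality=0.9),
]

for post in sorted(posts, key=lambda p: score(p, user_weights), reverse=True):
    print(f"{score(post, user_weights):.2f}  {post.title}")
```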
Social network literacy
Representations of information flows and opinion distributions can help people develop an intuition for how information is shared online and how discourse on social networks unfolds. Such representations can include clusters of opinions mapped onto a two-dimensional plane (as on the platform pol.is) or a transparent visualization of a piece of content’s sharing history.
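To illustrate the kind of two-dimensional opinion map meant here, the sketch below projects simulated agree/disagree votes onto a plane and groups participants into opinion clusters. The synthetic data and the choice of PCA plus k-means are assumptions made for illustration; they are loosely inspired by, but do not reproduce, the method used by pol.is.

```python
# Sketch: mapping opinion clusters from agree/disagree votes.
# Each row is a participant, each column a statement; entries are
# +1 (agree) or -1 (disagree). The data are synthetic, and the use of
# PCA + k-means is an illustrative assumption loosely inspired by pol.is.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Two simulated opinion camps answering 10 statements.
camp_a = np.sign(rng.normal(loc=+0.8, size=(20, 10)))
camp_b = np.sign(rng.normal(loc=-0.8, size=(20, 10)))
votes = np.vstack([camp_a, camp_b])

# Project the vote matrix onto a two-dimensional plane.
coords = PCA(n_components=2).fit_transform(votes)

# Group participants into opinion clusters on that plane.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)

for cluster in range(2):
    members = coords[labels == cluster]
    print(f"Cluster {cluster}: {len(members)} participants, "
          f"center at {np.round(members.mean(axis=0), 2)}")
```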