Preserving attention in the digital age
An interview with Mehdi Khamassi, ISIR researcher
In the book Pour une nouvelle culture de l'attention. Que faire de ces réseaux sociaux qui nous épuisent? co-authored with Stefana Broadbent, Florian Forestier and Célia Zolynski, researcher Mehdi Khamassi, a specialist in cognitive science and AI applied to robotics at ISIR*, shares solutions for regaining control of our attention in the digital age and promoting a more beneficial and democratic use of new technologies.
How did the idea for this book come about?
Mehdi Khamassi: I'm part of "Emerging technologies and collective wisdom", a project launched in 2019 by philosopher Daniel Andler at the Académie des sciences morales et politiques. It brings together scientists from various disciplines to reflect on societal problems, and the issue of the attention economy quickly became central. We co-organized brainstorming days at the French National Assembly and at UNESCO, bringing together specialists, MPs, UNESCO representatives, think-tank members and others. These days produced a wealth of findings that we decided to compile in a book to share with the general public.
In your book, you talk about the attention economy. How would you define it?
M. K.: Some economists and sociologists argue that in today's society, people's attention is as precious as oil was to the industrial economy; it has become a central currency. The attention economy is therefore an economic model based on capturing our attention and keeping it engaged in order, among other things, to increase the price of advertising space.
Did this phenomenon already exist before the digital era?
M. K.: Before digital tools, psychologically inspired techniques were already used to capture and hold attention for brands and products. But with digital technology and advances in cognitive science, this capture has intensified: digital tools make it possible to test in real time what captures our attention best and to gather information about our behavior. How long do we stay on a video? Which suggestions do we click on? This makes it possible to estimate our preferences more precisely and to better target advertising.
From a cognitive point of view, what impact does it have on our brains?
M. K.: In my research, I'm interested in decision-making, and in particular in the shift from deliberative, voluntary, goal-directed decision-making to automatic, stimulus-driven decision-making. The psychological mechanisms used by social network interfaces keep us in automatic patterns of behavior, reducing our freedom of decision. We think less about what we're doing, scrolling through an endless feed, watching one video after another, or clicking as soon as a notification appears. This stimulus-response mode of operation means we lose our freedom. We let ourselves be passively guided, without taking the time to reflect: Am I satisfied with my use of social networks? Have I done what I wanted to do? Is it time to stop, or do I deliberately want to stay connected? If so, for how long?
Social novelties and feedback, such as likes and notifications, activate our reward system, notably via dopamine, a neuromodulator that reinforces behaviors associated with pleasure. This incessant quest for novelty and social validation makes us feel like we're missing out if we're not constantly connected, which reinforces our urge to interact with these platforms, to react to every notification.
In your book, you also discuss the risks that these attention-grabbing mechanisms can pose to democracy.
M. K.: Indeed, social network recommendation algorithms favor shocking content, which attracts more attention than neutral or scientific content. Users are therefore exposed to emotionally charged or confrontational content and to clash-type interactions rather than to exchanges that foster mutual understanding. This situation is dangerous for democracy, as it locks users into bubbles of polarizing content, limiting constructive dialogue and undermining social cohesion.
Beyond the question of capturing attention, digital tools lead to other cognitive problems, don't they?
M. K.: Absolutely. It has been shown that the proliferation of screens reduces our capacity for memorization and comprehension. For example, when we watch a series while browsing social networks, or revise for an exam while exchanging messages, our performance drops. The illusion that we can multitask actually degrades how well we do each task.
What's more, spending too much time in front of screens deprives us of other activities that are essential for our cognitive development and open-mindedness, and that allow our brains to rest, such as going for a walk, reading a book or chatting face-to-face. Visual over-stimulation from screens can also affect our sleep, as blue light disrupts our internal clock.
The use of these technologies has an even greater impact on children, doesn't it?
M. K.: Yes, and protecting children is particularly important because their brains are still developing. Up to the age of three, we recommend not exposing them to screens at all, and beyond that, strictly regulating their use, for example by limiting it to one hour a day. Adolescents, whose prefrontal cortex is not fully developed until around the age of 25, are also more vulnerable to addictive mechanisms such as likes and notifications, which can lead to compulsive behavior. Some studies also show that social networking can harm self-esteem and lead to feelings of distress, particularly among teenagers, who are often going through a period of emotional instability and have a strong need for social connection.
What solutions do you propose to help us regain control of our attention?
M. K.: To regain control of our attention and create a freer digital space, several approaches are needed. First, on an individual level, it's essential to educate users to better understand cognitive biases and how their attention works, while encouraging them to take greater control of their digital interactions. In terms of education, it's crucial to teach children from an early age not only how to use digital interfaces, but also how to protect their personal data.
In terms of legal regulation, we propose banning manipulative practices such as "dark patterns", where interface design steers users' choices against their interests. Our co-author, Célia Zolynski, professor of digital law at the Université Panthéon-Sorbonne, also proposes a right to parameterization. This intuitive, fair and privacy-friendly configuration of settings would enable users to define their preferences once and for all, such as refusing targeted advertising, without having to reconfigure them constantly.
Furthermore, it is crucial to introduce elements into interfaces that help us to reflect on our own automatisms, become aware of them, analyze them and work against those that don't suit us.
Finally, we feel it is important to promote a genuine cultural policy at the state level to encourage other kinds of digital practice. Many of today's interfaces and platforms tend to isolate us and encourage conflict. Collaborative interfaces, by contrast, support collective construction and democratic participation. Wikipedia, for example, enables us to build an encyclopedia together, with mechanisms for contributing, and for expressing and resolving disagreements. This approach fosters mutual understanding and a shared focus on the same subject.
What role should the university play in these issues?
M. K.: I think it's essential to promote research on these subjects and to open up access to interface and social network data, particularly for the humanities and social sciences. The university must be a place that facilitates interdisciplinary collaboration on these issues between specialists in AI, cognitive psychology, sociology, philosophy, law and so on.
The university is also the place to inform students and develop their critical thinking and scientific approach, so that they can navigate the digital environment in a more enlightened way.
What message would you like to convey to the general public?
M. K.: My message is twofold. On the one hand, the digital world has enormous potential and offers many positive opportunities. However, the current structure of the attention economy favors its darker aspects, such as attention capture. It is therefore essential to redirect this potential towards more beneficial uses, by better understanding these phenomena and proposing appropriate regulations.
On the other hand, we need to be aware of how this structure pushes us towards automatisms without even realizing it. We need to try to break out of it in order to regain greater freedom of action and continue moving towards a more peaceful, democratic society.