What if algorithms prevented you from daring to think outside the box?
Every day, our lives are shaped by small actions that may seem harmless and spontaneous at first glance, yet are in fact governed by algorithms. Omnipresent yet operating behind the scenes, these algorithms analyze our slightest behaviors to suggest our next sunny destination, help us save time, or push us to buy the latest discounted item, sometimes through something as simple as the color red, which triggers action. Designed to manipulate the subconscious, these tools are powerful instruments capable of influencing our decisions, our worldview, and our ways of thinking.
We are exposed to them daily, often without even noticing. “How do I find the best ChatGPT prompt to get views?” or “What’s the best format that performs well on LinkedIn?” These are the kinds of questions that flood social media every day, questions that ultimately serve and feed machines more than they serve us.
The Paradox of Optimization
Algorithms have become powerful engines of conformity. The fear of not producing “high-performing” content, of one’s work being overlooked, of missing a trend or falling behind: these fears gradually take up more space than the desire to take risks. Algorithms do not create these fears; they industrialize and exploit them to better understand and target us. This vicious cycle should alarm us far more than the fear of failure or of not publishing trendy content.
The algorithm does not dictate what we publish. It dictates what we eliminate before we even start writing. Think of all the ideas you never dared to share, all the thoughts you chose not to express because you feared being judged. Yes, posting at a specific time or following the “perfect” prompt to boost sales may help, but it will mostly make you dependent on the dopamine these algorithms generate.
The algorithm does not standardize the world; it reveals how willing we are to standardize ourselves. Paradoxically, the more we try to please it, the more our content begins to look the same. The more we optimize, the less room we leave for creativity. The more we seek visibility, the more interchangeable we become.
Comfort at the expense of effort
Relying on algorithms creates a certain sense of comfort: the comfort of avoiding effort and letting ourselves be guided by suggestions. A few years ago, while traveling, you might have opened a map, chosen to get lost, wandered aimlessly, and let discoveries unfold naturally. Today, watching a video that lasts only a few seconds is enough to influence which restaurant you will choose for dinner or which places you will visit during your stay. But this comfort leaves little room for surprise or the unknown. It no longer allows space for imagination, or for boredom, which is nevertheless essential to a child’s development.
Algorithms: an ode to reward
You have most likely already heard of dopamine, the hormone and neurotransmitter at the core of your brain’s chemistry, activated whenever you receive or anticipate a reward. This reward produces short-term happiness and encourages you to repeat the behavior that led to that small dose of satisfaction. The creators of today’s social networks understand this perfectly. That is precisely why algorithms are designed to deliver a daily dose of dopamine, pushing you to return to these platforms again and again.
For instance, when we receive a notification, it triggers a stimulation that releases dopamine, compelling us to reopen our apps. Similarly, when a post performs exceptionally well (receiving likes, shares, comments, and so on), it stimulates us to the point where we want to repeat the experience, chasing that same feeling once more.
A powerful yet dangerous instrument of opinion shaping
One thing the architects of algorithms have understood remarkably well is that these systems do not only operate on logic or data. They follow the winding paths of the human brain, tapping into our unconscious biases to subtly influence how we think, what we believe, and ultimately how we make choices.
A striking illustration of this dynamic can be found in Donald Trump’s 2024 presidential campaign, which heavily leveraged platforms such as TikTok and X to reach and mobilize younger voters. In a press conference, Trump openly acknowledged this strategy, stating: “I have a soft spot for TikTok, because I won the youth vote by 34 points, and some people believe that TikTok had something to do with that.”
At the same time, Elon Musk had acquired X, becoming not only the platform’s owner but also the custodian of its algorithmic levers. Throughout the campaign, he used this position to amplify and endorse Trump’s candidacy, actively shaping the flow of political content on the platform. In a telling turn of events, Musk’s outspoken support was later followed by his appointment to a role focused on government efficiency under the new administration.
As social media platforms increasingly position themselves as primary sources of information, such entanglements raise pressing ethical questions. Who controls the algorithms that shape public discourse? To what extent can these invisible systems influence not only public opinion, but the very structure of political debate itself?
In the end, perhaps the most meaningful response to algorithmic conformity is to dare to publish what feels singular, imperfect, and deeply human. Not content designed to perform, but content that surprises, amuses, provokes curiosity, and reflects who we truly are. Because long after optimization fades, it is authenticity that still speaks to us.
Sources:
Rendre les algorithmes visibles : pour un droit à la curiosité (“Making algorithms visible: toward a right to curiosity”)
https://psyaparis.fr/dopamine-reseaux-sociaux/
https://www.20minutes.fr/monde/election-presidentielle-americaine/4113283-20241006-election-americaine-2024-elon-musk-droit-utiliser-x-favoriser-campagne-donald-trump