Peace has never been merely the absence of war. It requires an agreement on reality – a common ground of mutual knowledge on which people can disagree, argue, and still coexist. That foundation is crumbling today. In the era of social media, 24-hour news cycles, and viral posts, truth is no longer a stable point of reference. It is filtered, distorted, and in some cases entirely rewritten to fit whatever story goes viral.
The digital revolution was supposed to make the world smarter and more connected. In many ways, it did. But it also produced something quite unintended: the greatest misinformation machine people have ever created. The implications are far from ideal. They show up as neighbors who no longer trust each other, democracies being shaken, and ordinary people making life decisions based on information that is not even true.
This essay examines how misinformation undermines peace, not only on the global stage but in everyday life, and what can realistically be done about it.
How Misinformation Works: The Basic Mechanism
Before examining the consequences, it is important to understand why misinformation spreads so easily.
There are three key reasons:
· Our brains are wired for threat. We react strongly to anything urgent, outrageous, or alarming. A rumor that a politician contaminated the water supply travels faster than a scientific report on water quality standards, because people do not want to be harmed, and fear is a more compelling motivator than information.
· Algorithms reward engagement, not accuracy. Social platforms such as Facebook, YouTube, and Instagram are designed to serve content that keeps viewers on the site as long as possible. Emotionally charged, controversial material does this better than cold, factual reporting. The algorithm does not ask whether something is true; it asks whether people are clicking on it.
· Repetition breeds belief. Studies in cognitive psychology show that the more often a person hears a statement, whether false or true, the more likely they are to believe it. This is known as the illusory truth effect. Misinformation campaigns exploit it by systematically flooding information spaces with the same false story across multiple channels. Repetition creates the impression of truth.
Combined, these three factors create an environment in which falsehood holds the advantage over truth. This is not a problem of individual ignorance; it is a design problem.
Everyday Examples: Misinformation in Ordinary Life
It is easy to picture misinformation as something confined to politics or war zones. In reality, it shapes the ordinary, everyday decisions people make.
a. The "Quit Your Job" Illusion on the Internet: Instagram and YouTube are saturated with content creators showcasing lives of travel, luxury, and financial freedom, all supposedly achieved by posting videos or photos. The message is simple: leave your job, build a following, and the money will follow. What is rarely shown is the reality: most creators earn almost nothing. According to a study by the influencer marketing agency Influencer Marketing Hub, most creators with fewer than 10,000 followers earn less than USD 100 per month. Yet the platforms monetize exactly this content. An aspirational economy sells courses, gear, and coaching to people who believe the dream is closer than it is. Thousands of young people have abandoned stable jobs chasing a form of success that was, statistically, nearly impossible in their case. The result is not only individual economic hardship but a generation with diminished faith in traditional paths to stability and no realistic alternative.
b. Health Misinformation and the Anti-Vaccine Movement: During the COVID-19 pandemic, false claims about vaccines spread so fast on social media that the World Health Organization coined the term "infodemic" to describe it. People rejected vaccines not on the basis of scientific evidence but because of viral posts, manipulated videos, and influencers with no medical training. The consequences were measurable in lives lost. Even before COVID, childhood vaccination rates in several countries had declined because of misinformation tracing back to a single fraudulent study, retracted in 1998, that the internet has kept alive ever since.
c. Financial Misinformation and Cryptocurrency: In 2021, celebrity tweets and viral Reddit posts sent meme coins soaring in value, drawing millions of ordinary people, many of them novice investors, into putting in money they could not afford to lose. When the bubble burst, the losses devastated small investors, while early holders had already cashed out. The "to the moon" culture spread on social media was a textbook illustration of how misinformation, in this case manufactured enthusiasm, can be used to transfer wealth from the uninformed to the informed.
d. Political Misinformation and Everyday Polarization: Misinformation does not only sway national elections. It changes how people treat their neighbors. When fake news spreads claiming that a religious group is behind rising crime, or that a political party is conspiring against the citizens, it poisons the day-to-day interactions that hold a community together. People stop trusting local institutions, avoid certain shops or neighborhoods, and form judgments about strangers based on fabricated information.
Misinformation and the Collapse of Shared Reality
One of the gravest effects of the misinformation age is that people living in the same country, sometimes even the same family, now inhabit entirely different information worlds. This is sometimes called reality fragmentation.
A productive conversation cannot take place when two people cannot agree on what is real. When a large number of people lose a shared sense of reality, society loses its capacity to solve collective problems. Take climate change: the scientific consensus has been clear for decades. Yet a mix of deliberate misinformation, initiated by the fossil fuel lobby and amplified by social media, has sustained a significant population of genuine doubters. The result is policy gridlock at precisely the moment action is needed most.
This is not merely a political disagreement. It is a case of misinformation denying a collective response to a collective threat, which is, put differently, a failure of peace.
The philosopher Hannah Arendt, writing after World War II, warned that totalitarianism does not begin by making people believe lies. It begins by rendering people unable to distinguish truth from fiction – by flooding them with so much conflicting information that citizens abandon the very concept of truth. Today, in the era of deepfakes and AI-generated content, that observation reads less like history than like a timely warning that went unheeded.
Misinformation as a Tool of Conflict
On the larger scale, misinformation is no longer a mere byproduct of the digital era; it is increasingly deployed as a weapon.
State-sponsored disinformation: Numerous investigations by the EU, the US Senate, and independent researchers have documented how some governments run coordinated campaigns to divide other countries. Such operations do not need everyone to believe a particular lie. They only need to inflame existing tensions, undermine institutions, and make citizens suspicious of one another and of their own states. A nation at war with itself is less capable of resisting foreign pressure.
The Myanmar case: When the Rohingya crisis escalated, most people in Myanmar relied on Facebook as their primary source of news. Fabricated stories and hate speech targeting the Muslim minority spread easily on the platform, in a language (Burmese) for which Facebook had virtually no content moderators. A UN fact-finding mission later reported that the platform had helped incite the violence that led to atrocities. This is not a hypothetical harm. People died, partly because of content that spread unchecked on a private platform.
Closer to everyday life: Even without state backing, locally transmitted misinformation produces real conflict. In India, rumors spread through WhatsApp have triggered mob violence. Lynchings have been traced to false stories about food adulteration and to rumors targeting particular religious communities. These are not distant, abstract events but the immediate, physical consequences of unregulated misinformation in everyday communities.
The Gandhi Connection: Truth as the Root of Peace
Mahatma Gandhi's idea of Satyagraha is usually presented as nonviolent resistance. But its literal meaning is "truth-force." For Gandhi, nonviolence and truth were not two separate concepts; they were the same concept manifested in two forms. You cannot build peace on a falsehood. You cannot resist injustice with fabrication. The means shape the ends.
This has a direct implication today. Well-intentioned people and peace movements may amplify unverified stories because the story suits their cause. A false account of police brutality that spreads before the facts are established, an inflated statistic that goes viral during an environmental campaign: even in a righteous cause, misinformation breeds distrust, makes resistance harder, and feeds the very polarization the movement opposes.
A commitment to truth is not a moral luxury for those with time to spare. It is a practical necessity for anyone who wants their work for peace to last.
What Can Be Done: A Point-wise Framework
The problem is large, but it is not without solutions. The response must be realistic and must operate at four levels:
i. Personal Level: Media Literacy: Every citizen needs basic skills for evaluating information: Where does this come from? Who benefits if I believe it? Do reputable sources confirm it? These questions are not difficult, but neither are they intuitive; they must be taught. Finland has integrated media literacy into its school curriculum since the 1970s, and it remains one of the countries most resistant to disinformation. It is a model other countries can copy without massive expenditure.
ii. Platform Level: Design Accountability: Social media companies are not neutral pipes. Their design choices, what the algorithm promotes, how misinformation is labeled, how fast false content can go viral before it is flagged, directly affect public health, democracy, and social peace. Governments are beginning to act: the European Union's Digital Services Act requires large platforms to assess and mitigate the risks their systems pose to society. It is a start, though enforcement across borders remains difficult.
iii. Institutional Level: Supporting Reliable Information: Independent journalism, scientific institutions, and public broadcasting are the infrastructure of truth, just as roads are the infrastructure of trade. When these institutions are underfunded, attacked, or undermined, the whole information environment degrades. Investing in them, through public funding, legal protection for journalists, and civic support, is not a luxury. It is a matter of national and international security.
iv. Cultural Level: Normalizing Correction: In most cultures, admitting you were wrong carries social stigma. This leads people to defend misinformation they have already shared rather than correct it. Normalizing the simple statement "I shared something that turned out to be false, and here is the correction," in schools, workplaces, and everyday life, would do more for the information environment than many a complicated intervention.
Conclusion
In the 21st century, peace cannot be guaranteed by diplomacy or the mere absence of armed conflict. It presupposes something deeper: the capacity of people to share a common reality, to trust that what they know about the world bears some connection to what is actually true. Misinformation attacks that foundation. It does so through algorithms tuned for engagement rather than accuracy, through the concerted efforts of state and non-state actors, and through the simple human instinct to trust whatever confirms what we already believe. The effects are visible in economic ruin, in public health crises, in ethnic conflict, and in the gradual erosion of the civic trust on which democratic societies depend.
The solution is not to suppress speech or to wish the internet away. It is to treat truth – its creation, its defense, and its dissemination – as a social good, one that must be actively maintained. Gandhi called it Satyagraha. A modern translation might be: check before you share. Ask before you believe. Verify before you repeat. These are small acts. But peace is built from small acts, repeated, by enough people, steadily.
By: Sameeksha S