
Perspective is everything, or is it?

Hi There,

This is a story about misinformation. It is also a story about how the network effects that have benefited us can turn our ability to pick useful information out of a web of useless or malicious content... slightly useless.

Let us begin with China.

In December 2018, the world looked ready for the age of China. Then, sometime in 2019, the introduction of a controversial law set off protests in Hong Kong, and Tencent (a major media partner of the NBA), which had booked a US$1.5 billion streaming deal, suddenly found itself calling out basketball stars who supported the Hong Kong democracy movement. In 2019, Marco Rubio began questioning the Chinese hold on the American tech scene. By 2020, tech companies like Huawei were either pariahs in the 5G network race or deeply entangled in a controversy between two large states. From 2020 onwards, the spread of misinformation was at the forefront of the news. Part of the problem was the Chinese state's attempt to control information at a scale, and with a thoroughness, that left little room for doubt. In late 2020, the Chinese government flooded social media platforms like WeChat with propaganda intended to tarnish the image of the Hong Kong protest movement.

This raises the question: how does misinformation spread? Who spreads it? And why? There is no doubt that online media can serve both as a tool for disseminating information and as a tool for controlling public opinion, but what separates the real from the fake often depends on one's perspective.

Over the past decade or so, the internet has become a powerful platform for disseminating information across the globe. In this age of disinformation, where the lines between the 'fake' and the 'real' are blurred, the accessibility of social media platforms makes it easier than ever for propaganda to be disseminated and adopted by the general public.

The underlying reason information gets distorted is that people are prone to developing various biases as the brain processes it. In the digital age, we consume a considerable amount of information every day through popular social media channels, and the result is information overload. People are therefore inclined to engage in biased thinking as they try to make sense of everything they take in.

These biases then affect how people interpret the information presented to them, which in turn affects how they act on it. For example, people generally have a confirmation bias: they tend to believe information that supports their existing beliefs and ignore information that contradicts them. They also tend to have a negativity bias, meaning they are more receptive to negative information or stimuli than to positive ones. Apply these biases to the disinformation around the Hong Kong protests and it becomes clear why the way it spreads on platforms like Weibo and Twitter makes the truth so hard to pin down. Most of the information on these platforms comes from users who are themselves biased and have their own agendas; indeed, most people on social media are driven more by personal bias than by facts and objectivity.

What causes people to succumb to fake news? The major reason is our inability to distinguish fact from fiction: our brains are easily swayed by emotions and by our own experiences. Lazy thinking is often cited as another factor behind the spread of misinformation on social media. False news stories frequently exploit people's fear and insecurity about events occurring around the world to push them to react in a certain way.

Reframing the narrative isn't something that can be easily done. People do not share an article just because they believe it to be true; they do so because they have convinced themselves that it is true – and that it is information that they know the other person wants to hear. The power of reframing a narrative lies in the power of influence. Because social media is so heavily used as a tool for social connection, it has become a powerful force in shaping opinion and creating "realities" through group consensus.

When people are presented with information that contradicts their beliefs, their tendency to selectively process what is congruent with those beliefs leads them to misinterpret the rest and dismiss it as "fake news". They then share this information with their friends and followers without pausing to consider whether it is true. Once enough people share the same information, it goes viral and starts gaining wider acceptance, creating a great deal of confusion for anyone attempting to figure out what is true and what is not.

On November 10th, a fake Twitter account claiming to represent Eli Lilly and Co. tweeted: "We are excited to announce insulin is free now". By the time Twitter removed the tweet more than six hours later, the account had inspired other fake Eli Lilly copycats. The company is among the world's largest drugmakers and spends more than $100 million on advertising in the U.S., according to data firm MediaRadar. Ex-Twitter employees say Twitter's new $8 verification system has decimated some of the last lingering bits of trust among advertisers.

As a Twitter user since 2007, and as someone who got my start in journalism because of the social network, I have come to accept that social-media platforms do not encourage the kind of behavior that anchors a democratic government. On platforms where every user is at once a reader, a writer, and a publisher, falsehoods are too seductive not to succeed: the thrill of novelty is too alluring, the titillation of disgust too difficult to transcend. After a long and aggravating day, even the most staid user might find themselves lunging for the politically advantageous rumor. Amid an anxious election season, even the most public-minded user might subvert their higher interest to win an argument.

In short, social media seems to systematically amplify falsehood at the expense of the truth, and no one—neither experts nor politicians nor tech companies—knows how to reverse that trend. It is a dangerous moment for any system of government premised on a common public reality.

The question is: is there a way to safely navigate the negativity of this social information overload without being sucked into the black hole? That is, how can we avoid being absentminded participants on social media? Perhaps the answer lies in the psychology of confirmation bias and in broader epistemological questions, such as how we know what we know. How can we distinguish fact from fiction? Most importantly, how can we stop spreading misinformation rather than simply amplifying our existing biases and echo chambers?

It is unclear which interventions could reverse this tendency toward falsehood. Proposed solutions include nudging people in the right direction with customized news algorithms or accuracy primes (reminders that shift attention toward accurate content), but researchers warn that these attempts do not address the psychology behind why people are predisposed to share false stories in the first place. Replication studies also show that the priming effect fades quickly, because such cues no longer stand out once they have been encountered a few times. The same researchers have found that inoculation theory, which involves pre-emptively exposing people to the kinds of misinformation that can shift opinions or beliefs, may make people less susceptible to it.

The answer may not lie solely in algorithms that exclude certain content or curate our newsfeeds, but in aligning the technology we already have with the psychological principles that shape how we interact with it, and with each other, online. The challenge posed by disinformation is not only technological but also human, and it needs to be dealt with as such. Countering this infodemic will require multi-layered strategies, from advances in AI and machine learning to careful consideration of how information is shared and curated across different platforms.

Perspective is everything when it comes to misinformation. This is a world where truth is very much in the eye of the beholder: there is no objective definition of truth, and no algorithm can accurately determine it for us. The truth depends on who is telling the story and what their agenda is. The picture below is an example. From one angle it is an obscene gesture from a privileged royal; from the other (the true one), he is probably just asking for three of something. The tabloids could spin it whichever way they wanted, and the cycle of sharing and yarn-spinning would carry on forever.

Perspective is important.
and all of us at the behaviouralreview
