Solving disinformation with Internet literacy
Fake people and fake news have existed on the Internet since forever. They didn’t make such a terrible impact at the beginning, because the Internet of the ’90s lacked two things: users and amplification.
Internet users were an elite minority in the ’90s, mostly consisting of university students. They could see through fakes. Some couldn’t, and that was okay too, because fake news couldn’t go far: there was no “retweet” button. You had to be very intentional about whom you wanted a piece of information to reach.
It took years for sensational sites (what we’d call ragebait today) like Bonsai Kitten or Manbeef to make an actual dent in the mainstream news. Even then, it didn’t matter, because the whole world didn’t move in synchrony to react. It took years for people to discover the sites one by one, get offended personally or with a couple of friends, figure out that they were actually fake, then calm down and relax.
For a long time, the most sensational fake news on Ekşi Sözlük, the social platform I founded in 1999, was fake celebrity deaths. There was no moderation for fake news on the website; on the contrary, the footer said “Everything on this website is wrong,” which put the onus on the user to verify the information.
Fake celebrity deaths kept coming over the years, but people had started to be more careful. This kind of hands-on training created a new kind of Internet user: the literate.
The literate were aware that any person or any news item on the Internet could be fake, and that the onus was on them to verify authenticity every time they consumed content or communicated with someone. They could see the typos in a phishing mail as red flags, they could spot the signals in fake news, they could see how Anita22 could be a man behind a fake profile picture. They knew that the prince of Nigeria couldn’t care less about giving them a penny, and they knew how to do a reverse image search.

Unfortunately, the floodgates opened when the iPhone revolution came and suddenly everyone on the planet had Internet access in their pockets. This new generation of people, our parents, our relatives, the illiterate so to speak, didn’t go through the process the literate had. They were clueless, lacked the tools, and could easily be manipulated. We let the sheep out into a meadow full of wolves.
That kind of influx could still have been managed if we’d had the slow, intentional sharing mechanics of the ’90s. You wanted to share something? You had to be very deliberate about it. You had to write an email, or forward one, and still pick the recipients yourself. Unfortunately, by 2010 we had already been part of the “amplification platforms” for years. We call them social media today, but what defines them is their amplification mechanics. You consume content from your little social bubble, but when you share it, it gets shared with the whole world. This asymmetric information flow can turn any piece of information, regardless of how true or credible, into a worldwide topic in mere hours.
The illiterate lacked the tools to deal with this. They were trapped inside their bubble. If someone corrected a fake news article, the victim of the disinformation would have no way to know, because their bubble wouldn’t convey that correction.
Because the illiterate are trapped in their bubble, they have no way to hone their skills at evaluating information either. Their bubble consists mostly of people with similar levels of Internet literacy.
We basically lack healthy community structures on the Internet.
I find Reddit one of the healthier platforms in the English-speaking world. Considering how much it suffers from toxicity itself, the relativity of that statement bears emphasis. It actually resembles Ekşi Sözlük a lot in its community structure, but does some things better. For example, subreddits themselves can be great flow guards for information, and for disinformation for that matter. Every subreddit is a broadcast channel and a community at the same time, so people focus their attention and effort on a single type of content. That leads them to make fewer mistakes.
Reddit’s conversation structure, hierarchical threads, is better than Twitter’s mentions or Facebook’s replies, which are linear. Reddit’s style helps organize information, along with opposing views. The visibility of content is influenced by people’s votes, so people have many tools to deal with disinformation on Reddit, and moderation may not even be the most important one.
There is a reason most controversial disinformation scandals appear on Twitter: it has the most aggressive amplification mechanics with as few controls as possible. There’s no downvote, no hierarchical discussion structure, no moderation hierarchy. Everything is designed to make amplification easier: the retweet icon, retweets showing up on your main timeline with the same priority as original content, “trending topics” so you can retweet the most-retweeted content even more. Everything becomes a feedback loop for further amplification, and not necessarily of high-quality content either. It just has to be controversial in some way.
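The difference between the deliberate sharing of the ’90s and retweet-style amplification can be sketched as a toy model. This is my own illustration with made-up numbers, not a description of any platform’s actual systems: deliberate sharing grows reach linearly, while a reshare feedback loop compounds it.

```python
def deliberate_reach(rounds: int, shares_per_round: int = 5) -> int:
    """Email-era sharing: one sender hand-picks recipients each round.

    Reach grows linearly with effort.
    """
    return rounds * shares_per_round


def amplified_reach(rounds: int, followers: int = 5,
                    reshare_rate: float = 0.4) -> int:
    """Retweet-era sharing: everyone who sees an item may reshare it.

    Each round, current sharers show the item to their followers, and a
    fraction of those viewers become the next round's sharers. That is
    the feedback loop: views create sharers, sharers create views.
    """
    reached = 1.0   # the original poster
    sharers = 1.0
    for _ in range(rounds):
        new_views = sharers * followers
        reached += new_views
        sharers = new_views * reshare_rate
    return round(reached)


# With these made-up defaults, after 10 rounds:
# deliberate_reach(10) -> 50, amplified_reach(10) -> 5116
```

With a reshare rate of 0.4 and 5 followers each, every round doubles the number of new viewers, so the amplified curve dwarfs the deliberate one after just a few rounds, which is the asymmetry the essay describes.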
Because Twitter is “social media,” the people with the most followers have the most reach. Only a minority actually gets meaningful engagement on the platform; the majority barely gets any at all. This hurts content quality. Reddit doesn’t have this problem, because its amplification dynamics don’t take follower counts into account. You can have zero followers and still get upvoted to that day’s front page if you produce content the subreddit community likes. Yes, Reddit still has a “follower” mechanism, but it’s not the backbone of how engagement works on the site.
Twitter also has supposedly moderated communities, which I think were an attempt to fix its structural deficiencies, but the way they’re designed makes them pretty much useless. Should I share my content with the whole world through my followers, or with a small room of a hundred people who cannot share it further even if they wanted to? I understand the problem they’re trying to solve, uncontrolled amplification, but they fail to provide users other ways to get engagement.
Elon Musk seems to be trying to address this with Twitter’s “For You” tab, which puts you in contact with people outside your bubble and lets those people promote their content to non-followers. But the effective result is just another channel of uncontrolled information flow, letting disinformation spread in new ways on top of the existing ones.
Essentially, healthy communities are not profitable enough, and the engagement Rube Goldberg machine is fed with controversy. Controversy is the natural output of a for-profit social platform, because it brings the most page views.
Paid memberships can’t fix this either, because they can’t come near the income generated by ads. Even if they could match the income, they couldn’t match the growth, and investors want immediate growth.
How do we fix this? Do we need more regulation? I don’t think so, because I don’t believe the people who design those regulations understand this problem better than the platforms themselves do.
I believe we need more Internet literacy. We should perhaps make Internet literacy a prerequisite for accessing the Internet. We need to teach it in schools, in courses, on websites; we need a curriculum for the Internet. Let’s teach children about fake news and fake people: how people can harm them over the Internet, how trolls seek engagement, how to protect private information, and what tools can help avoid harm. Let’s teach them to navigate the Internet skillfully and deliberately.
The literate will break the amplification chain because they will think twice before retweeting. They will do their research when content seems off. Because of their education, they will demand better platforms and migrate to them when they appear. The bad platforms will either have to change or turn into graveyards.