Last year in Jharkhand state, in eastern India, a disturbing message circulated on WhatsApp. It warned that a gang of child kidnappers was operating in the area, carrying sedative injections, and that if anyone saw a stranger near their home, they should immediately call the police. The message spread quickly, ultimately erupting into mob violence that killed seven people.
This was just one example of the kinds of false stories proliferating on WhatsApp in India, where the number of internet users has almost doubled in the last two years, recently hitting half a billion. Often, these fake stories stoke communal tensions between Hindus and Muslims. They have real-world consequences: at least 30 people have died this year in violence linked to forwarded WhatsApp messages, photos, and videos. As an election in 2019 looms, political parties are also getting in on the action, pushing out promotional content that is easy to forward on the messaging platform and directly contacting voters. In the 2017 state elections in Uttar Pradesh alone, the Bharatiya Janata Party (BJP), currently the party of government at both the state and federal level, created 6,000 WhatsApp groups.
In response to these events, WhatsApp has rolled out some changes this month for users worldwide. The first is a new label that tells you when any text, image, audio, or video message has been forwarded to you. The second is the removal of the “quick forward” button next to media messages. These changes are an attempt by the platform to stop the spread of misinformation. The “forwarded” label is supposed to help users determine whether a message was actually written by someone they know. In India, where people forward more messages, photos, and videos than anywhere else in the world, the changes go further still: you will now only be able to forward a message to five chats at once.
Following the election of Donald Trump and the U.K.’s Brexit vote, there has been an ongoing discussion of the role of social media platforms in shaping democratic outcomes. In the Western world, this discussion has primarily focused on Facebook, Twitter, and, to a lesser extent, Instagram, with intense scrutiny of individuals and accounts apparently working to influence voters on behalf of the Russian government, amongst other actors.
WhatsApp, which, like Instagram, is owned by Facebook, has mostly been ignored. But WhatsApp is used very heavily around the world, with over 60 billion messages sent every day, many of them in Asia, Africa, and Latin America. Given the often Western-centric conversation around technology, WhatsApp’s exclusion from the discussion about social media and democracy is unsurprising; the app is often seen as an alternative to SMS text messaging rather than as a platform with social media-like functionality and user behaviors. But if we overlook WhatsApp, we miss a vital part of the story of how political participation and the consumption of information are changing.
The impact of WhatsApp on political campaigning is evident far beyond India. The Reuters Institute’s 2017 Digital News Report found that 51 percent of news consumers in Malaysia use WhatsApp to find, share, or discuss news. Even in countries with low internet penetration, the platform is facilitating the speedy dissemination of stories, both true and false. Earlier this year in Sierra Leone, where it’s estimated that less than 20 percent of people have access to the internet, a rumor nevertheless spread on WhatsApp that foreign peacekeepers were being deployed to the country. The totally untrue story spread to such an extent that the Inspector General of Police had to issue a formal press release denying the rumor.
As cheap smartphones become readily available across the world, more and more people can access social media; as a result, the way information spreads and the way people relate to those in power will continue to shift. The impact of WhatsApp on democracy around the world has significant overlap with broader questions around social media. How can the spread of false information be prevented? How can facts be proven and elevated? How can propaganda or other willful interference be avoided?
WhatsApp’s end-to-end encryption has its benefits: the platform can be a useful organizing tool in times of censorship, and some researchers in India suggest that WhatsApp groups can give marginalized people, such as women and religious minorities, far greater access to public debate. But the flip side of this increased participation is the risk of misinformation spreading wildly, even more so than on more open platforms like Twitter and Facebook: because messages are encrypted, WhatsApp cannot read them, let alone fact-check or remove them. Responding indirectly, by changing how information is forwarded, is therefore one of the few methods available to the company that doesn’t simultaneously threaten user privacy.
“WhatsApp cares deeply about your safety,” the company said in a blog post announcing its recent changes. “We encourage you to think before sharing messages that were forwarded.” How much difference the new restrictions on forwarding messages will make remains to be seen.
How We Get To Next was a magazine that explored the future of science, technology, and culture from 2014 to 2019. This article is part of our section The Internet, where we reported on the past, present, and future of the information superhighway.