
WHATSAPP REVEALS ITS PLAN TO DEAL WITH FAKE NEWS IN A LETTER TO GOVT

Five men, rumoured in a WhatsApp message to be child abductors, were lynched by a mob in western Maharashtra on 1 July.

According to an earlier report, 14 people had been killed by 20 May on the basis of rumours circulated via social media channels such as WhatsApp. Reports of lynchings triggered by fake news on WhatsApp have come from Maharashtra, Gujarat, Karnataka, Assam, West Bengal, Madhya Pradesh and Odisha.

In response to the lynchings, the Government of India asked WhatsApp to fix the fake news problem and prevent the platform's use for mala fide purposes. The government made it clear that WhatsApp could not evade accountability and responsibility.

In response, Facebook-owned WhatsApp wrote a letter to the Ministry of Electronics and Information Technology saying it was “horrified” by the lynchings. It added that it has been testing ways to curb the spread of fake news.

Here is the complete text of WhatsApp’s letter to the Ministry of Electronics and Information Technology, in response to the situation.


Thank you for your letter dated July 2. Like the Government of India, we’re horrified by these terrible acts of violence and wanted to respond quickly to the very important issues you have raised. We believe this is a challenge that requires government, civil society and technology companies to work together.

Our strategy has been twofold:
First, to give people the controls and information they need to stay safe; and
Second, to work proactively to prevent misuse on WhatsApp.

WhatsApp cares deeply about people’s safety which is why we designed our app with security in mind from the get-go. For example, you can block anyone from messaging you with just one tap. And if someone who is not in your address book sends you a message, WhatsApp automatically asks if you want to block or report that user. We’ve also recently made a number of changes to group chats to prevent the spread of unwanted information, which we believe will address some of the specific issues you raise.

In mid-May, we added new protections to prevent people from adding others back into groups which they had left — a form of misuse we think it is important to correct. And last week, we launched a new setting that enables administrators to decide who gets to send messages within individual groups. This will help reduce the spread of unwanted messages into important group conversations — as well as the forwarding of hoaxes and other content.

In addition, we have been testing a new label in India that highlights when a message has been forwarded versus composed by the sender. This could serve as an important signal for recipients to think twice before forwarding messages because it lets a user know if the content they received was written by the person they know or a potential rumour from someone else. We plan to launch this new feature soon.

Finally, just yesterday we announced a new project to work with leading academic experts in India to learn more about the spread of misinformation, which will help inform additional product improvements going forward — as well as help our efforts to block bad actors (see below) going forward.

Digital Literacy and Fact-checking

We are also working hard to educate people about how to stay safe online. For example, we regularly put out information that explains how to spot fake news and hoaxes — and we plan to run long-term public safety ad campaigns in India, given its importance to us at WhatsApp. As a starting point, we will soon publish new educational materials around misinformation and conduct our news literacy workshops.

This year, for the first time, we also started working with fact checking organizations to identify rumors and false news — and respond to them — using WhatsApp.

  • For example, during the recent Presidential election in Mexico, we worked closely with the news consortium Verificado. Users sent thousands of rumors to Verificado’s WhatsApp account and in turn were provided regular updates on what was accurate and what was false.
  • In Brazil, we are now working with 24 news organizations on a similar program — the learnings from our experiences in both countries will help us fight fake news in India.
  • Already in India, the fact-checking organization Boom Live is available on WhatsApp and has published numerous important reports on the source of the rumours that have contributed to the recent violence.

This kind of work gives everyone a better understanding of the problematic fake news circulating on WhatsApp, and how it relates to misinformation being shared on other platforms. In addition, it’s a helpful resource right within WhatsApp where people can get answers about content they’ve been sent. It’s why we’re looking at how best to ramp up these efforts in India going forward.

Proactive Action to Tackle Abuse

As you know, WhatsApp retains limited information and is end-to-end encrypted. We use this technology to protect our users’ privacy and security. While WhatsApp messages can be highly viral, the way people use the app is by nature still very private. Many people (nearly 25 percent in India) are not in a group; the majority of groups continue to be small (fewer than ten people); and nine in ten messages are still sent from just one person to another.

People are increasingly using WhatsApp to get advice from their doctor, do business or communicate with their bank — as well as to chat with family and friends. They want to know these messages are private and secure — and that no-one else is reading them. This focus on privacy brings many benefits, though as with all technology there are trade-offs. And for WhatsApp, that’s the inability to see problematic content spreading through private conversations on our app.

That said, we do have the ability to prevent spam, which includes some of the misinformation that can create mistrust and potentially violence. Because we cannot see the content of messages being sent over WhatsApp, we block messages based on user reports and on the manner in which they are sent. We use machine learning to identify accounts sending a high volume of messages (faster than any human could), and we are constantly working to improve our ability to stop unwanted automated messages.

We also respond to valid law enforcement requests to help them investigate crimes. And soon, we will start an engagement program with law enforcement officials around India so they are familiar with our approach and how we can be helpful. We also want to share best practices for how WhatsApp is used by local police as a resource for their community. For example, the police in Hyderabad have created a WhatsApp account that anyone can message with rumors that concern them. And by working with community leaders to get them using our latest features (see above), they can help keep their communities informed about hoaxes circulating locally. As we have already seen, this can help save lives.

If you would like to talk further about the actions we are taking and our plans going forward, please let us know. We believe that false news, misinformation and the spread of hoaxes are issues best tackled collectively: by the government, civil society, and technology companies working together. With the right action, we can help improve everyone’s safety by ensuring communities are better equipped to deal with malicious hoaxes and false information — while still enabling people to communicate reliably and privately across India.
