Heeding Supreme Court, Facebook makes it harder to remove offensive content

NEW DELHI: Facebook, the world’s largest social network, has made it tougher for offensive content to be taken down and will now do so only if it receives a legal or government notice, in keeping with a recent Supreme Court ruling.

“We have changed our process. So now, before we restrict content in India for illegality, we require that the government submit legal process to us and we scrutinise that with our legal teams,” Facebook’s global policy head Monika Bickert told ET by phone. “We would not restrict the content if somebody in the community, somebody outside the government, flagged that content.”

India had 14,971 content restriction requests in the July-December 2015 period, second only to France and down from 15,155 requests in the first half of the year, according to a Facebook report released on Thursday. The takedown requests came from legal and government agencies as well as NGOs and Facebook members. The numbers in the next report will reflect Facebook’s decision to act only on legal or government requests.

India also made the second-highest number of requests for user data at 5,561, after the US, which made 19,235 such demands.

Facebook gives users the option to flag or report objectionable content including posts, photos, messages, comments, profiles, events and pages. Content is blocked or taken down if it violates Facebook’s community standards. Bickert clarified that the new rules do not mean that people can no longer report offensive content. “When people in India report content we will continue to look to see if it violates our community standards and if it does, we will remove it. It’s only in these unusual circumstances where it doesn’t violate our community standards but does violate the Indian law that we would require the government orders,” she said.

The Supreme Court last year read down Section 79 of the Information Technology Act, 2000, to say that a non-government request for taking down online content should be accompanied by a court order.

On a petition filed by law student Shreya Singhal, the court had also scrapped the controversial Section 66A of the act, which gave the authorities wide powers to penalise online speech deemed offensive.

The case for compliance with local laws is strong for Facebook, which counts India as its second-largest market with 142 million monthly active users. In the past three years, India has almost always been the top requester for content restrictions worldwide, according to Facebook.

“For true transparency, the company will also need to start reporting the content removed suo motu under community standards,” said Chinmayi Arun, Executive Director of the Centre for Communication Governance at the National Law University in New Delhi.

Bickert said Facebook works with governments across the world to make the reports better, but did not specify whether the next report would offer greater granularity about the kind of requests it receives or fulfils. People working with law enforcement agencies say Internet companies are usually stringent about restricting content. “It is a misconception that the police can just get access to data through backdoors,” said Rakshit Tandon, an advisor to the cybercrime units of the Agra and Gurgaon Police, adding that there has to be a valid reason to remove or block content.

“It is only taken down after a court or legal order says an online action violates a law,” Tandon said.

In the latest report, Facebook said France saw a total of 37,695 content restrictions, with 32,100 of these being “instances of a single image related to the November 2015 terrorist attacks in Paris” that allegedly violated French laws related to the protection of human dignity.