Laws Shouldn't Micro-Manage Tech Companies But Enable Accountability, Transparency: Nick Clegg
At today’s Facebook Fuel for India 2020 session, Facebook’s vice-president of global affairs and communications, Nick Clegg, argued that regulatory frameworks should not impose minute provisions that hinder technology instead of enabling it.

Facebook has faced considerable criticism from governments around the world with regard to its practices, both in terms of its platform content and its market conduct. Speaking at the Fuel for India 2020 conference earlier today, Nick Clegg, Facebook’s vice-president of global affairs and communications, highlighted that while Facebook welcomes the scope for lawmakers and governments to establish new regulatory frameworks defining how new technology industries should operate, such laws should not hinder technology adoption in critical sectors. Instead, regulators should use that scope to increase transparency and accountability in technology companies, including Facebook itself.

Speaking to Rudra Chaudhuri, director of Carnegie India, Clegg said, “It’s right that democratically elected rule makers should seek to introduce new guidelines and guard rails in how the internet operates. The most effective and intelligent approach to regulations is one that doesn’t seek to try and micro-manage every post and every line of content online – that is impossible for governments to do. Instead, hold companies like Facebook accountable and insist on a high level of transparency for the systems and policies that they have in place.”

Regulation and law enforcement

Talking about a “sensible” approach for governments framing laws to regulate global technology companies such as Amazon, Facebook and Google, Clegg stated that the ideal legal framework will “not hinder the way the international economy relies on international data flows.” Raising this against the backdrop of end-to-end encryption and law enforcement agencies’ efforts to find a way around it, Clegg underlined that while upholding the sanctity of user privacy, Facebook has still managed to cooperate with law enforcement agencies on some fronts.

“Just because we can’t see the content, doesn’t mean that we cannot use signals that we do pick up – for example, the duration of the message, and not the actual content. We can use these signals so that we can go after people on encrypted messaging systems. We remove about 2 million WhatsApp accounts every month, particularly if we see that those accounts are being used for mass broadcasts – which is not the purpose of WhatsApp. We will continue to explain to policymakers that we think that billions of people around the world expect the privacy of an encrypted messaging service, even as we cooperate with law enforcement,” Clegg said.

Dealing with hate speech

Going beyond spam, Clegg also touched upon hate speech, an aspect Facebook has had its own struggles with. While reports of targeted propaganda and hate speech on Facebook’s apps have been widespread, some have even had real-world consequences offline. These include, but are not limited to, cases of mob lynching based on misinformation spread on WhatsApp, hate speech aimed at instilling divisive political bias in communities across the world, and more.

On this front, Clegg noted that Facebook publishes a quarterly transparency report to show its efforts against such targeted hate speech campaigns. He further highlighted Facebook’s AI curation engine, which seeks to automatically remove posts based on keywords, and, of late, the much talked-about ‘Supreme Court’ of Facebook, the Oversight Board. Clegg also affirmed that such curation extends to private groups, which he said is part of Facebook’s effort to reduce targeted socio-political hate on the platform. While Clegg offered a simplified example of how Facebook applies its rules in matters with political intent, he also underlined that Facebook may not actively seek to localise its content policy, even if national governments install guard rails that demand it.

“Facebook has been under tremendous political pressure whether it’s in India or elsewhere, by people who have different or conflicting views about what should or shouldn’t be circulating freely on the internet, even though it is legal. We develop standards (pertaining to hate speech and propaganda) in a highly transparent way, for which we work with academics, experts and thought leaders including in India. However, given that we’re a global company, we have to have standards that we apply as evenly as we can. This doesn’t mean that there aren’t national legislative and other exceptional requirements that we don’t respond to. For instance, in Germany, it is illegal by law to say certain things about the holocaust, which is not the case in other countries,” he concludes.

The Facebook Fuel for India 2020 conference continues into day two on Wednesday, December 16, when more details, including the company’s India-specific efforts, will come to the fore. As Facebook touts its transparency, it remains to be seen what lawmakers make of the company and its apparent might, not just in the market as a corporate entity but in society as a narrative-shaping organisation, particularly given the precedent-setting antitrust case it faces on its home turf in the USA.
