Social Media Firms Hold No Liability for Live-Stream Content Under Existing Laws
Several Facebook documents, including internal training manuals, spreadsheets and flowcharts, were leaked, revealing how the social media giant moderates issues such as hate speech, terrorism, pornography and self-harm on its platform.

On the heels of Facebook defending its "Content Policy" after the leak of its content moderation guidelines, a research analyst has said that existing laws on live broadcasts don't apply to the internet.

"The social media companies have no liability towards online content like murder, rape, terrorism and suicide under intermediary laws around the world. Social media companies' obligation is restricted to removing the illegal content on being informed of it," said Shobhit Srivastava, Research Analyst, Mobile Devices and Ecosystems at market research firm Counterpoint Research.

Earlier this week, several Facebook documents, including internal training manuals, spreadsheets and flowcharts, were leaked, revealing how the social media giant moderates issues such as hate speech, terrorism, pornography and self-harm on its platform.

Citing the leaks, the Guardian reported that Facebook's moderators are overwhelmed with work and often have "just ten seconds" to make a decision on content posted on the platform.

"The recent incidents where harmful videos were posted online raise serious question on how social media companies moderate online content. Facebook has a very large user base (nearly two billion monthly active users) and is expanding, and therefore moderating content with help of content moderators is a difficult task," Srivastava told IANS.

"Facebook is also using a software to intercept content before it is posted online but it is still in early stages. This means that Facebook has to put a lot more effort to make the content safe," he added.

According to Monika Bickert, Head of Global Policy Management, Facebook, more than a billion people use Facebook on an average day and they share posts in dozens of languages.

A very small percentage of those posts is reported to the company for investigation, and the range of issues -- from bullying and hate speech to terrorism -- is broad and complex.

"Designing policies that both keep people safe and enable them to share freely means understanding emerging social issues and the way they manifest themselves online, and being able to respond quickly to millions of reports a week from people all over the world," she said.Also read: Samsung Galaxy S8 Iris Scanner Hack Reports Using Contact Lenses Under Investigation

Bickert said it is difficult for the company's reviewers to understand the context.

"It's hard to judge the intent behind one post or the risk implied in another," she said.

The company does not always get things right, Bickert explained, but it believes that a middle ground between freedom and safety is ultimately the best answer.

She said that Facebook has to be "as objective as possible" in order to have consistent guidelines across every area it serves.

Srivastava noted that "from social and business point of view social media companies like Facebook, etc. have to dedicate more resources for content moderating purposes which are inadequate now, otherwise we will see various governments restricting access to these players which will spell bad news for both users and these companies."

Last month, Facebook announced that it was hiring an additional 3,000 reviewers to ensure the right support for users.
