Wednesday, February 12, 2020

Facebook, YouTube, and Twitter put on notice as UK names new internet regulator


The UK government has announced ambitious new plans to regulate the internet that will give websites a “duty of care” to protect their UK users from illegal content related to child exploitation and terrorism, as well as harmful content more generally. The proposals say that the UK’s existing broadcast regulator, Ofcom, will be responsible for enforcing the new rules, which are expected to include the power to fine internet companies that don’t comply.

Full details of the legislation and Ofcom’s power to enforce it are set to be announced later this spring. Although the government will set the direction of the legislation, Ofcom will have the flexibility to decide how to respond to new “online dangers” as they emerge.

Details coming later in the spring

The proposals include two main requirements, The Guardian notes. The first is for illegal content, such as that depicting child sexual abuse or promoting terrorism, to be removed quickly and even prevented from being posted in the first place.

For content that is merely “harmful” rather than illegal, online platforms will be required to be upfront about what behavior and content is acceptable on their sites, and enforce those rules consistently and transparently. This covers content that may encourage or glorify self-harm or suicide. The government believes that flexibility is necessary to protect users’ rights online, including free speech and press freedoms.

The government says the regulation will apply to any websites that allow user-generated content, such as comments, forums, or video sharing. This definition suggests that it won’t just be social media networks that will be impacted by the regulation. Sites that are deemed to pose a low risk to the general public will not be covered.

However, as we have seen, online content moderation is a huge challenge even for the largest tech companies like Facebook and Google that can easily afford it. It’s likely to be even more difficult for smaller organizations.

In a statement, Ofcom said it welcomed the decision to be appointed regulator. “We will work with the Government to help ensure that regulation provides effective protection for people online and, if appointed, will consider what voluntary steps can be taken in advance of legislation,” it said.

Developing...

Original article © theverge.com