
Online Safety Act “missing a vital opportunity” on suicide content – Irvine Times

Suicide prevention charity Samaritans has said the online safety regulator is currently choosing to “ignore” advice from safety groups about how it decides which websites fall within the new rules’ toughest measures.

Jacqui Morrissey told the PA news agency that Ofcom had advised the Government to use a site’s user numbers as the criteria for subjecting platforms to the law’s most stringent measures – which would require them to hide or remove such content even for users aged over 18 – but said this approach would leave a number of “small but very high-risk platforms” not subject to these toughest rules.

The charity raised its fears to coincide with Safer Internet Day (February 11).

“The Online Safety Act includes really important provisions that allow the Government to subject small but very high-risk platforms to the most stringent measures,” Ms Morrissey told PA.

“We are extremely disappointed that Ofcom has chosen to ignore our advice, and the advice of many organisations, by focusing the criteria for the most stringent measures on user numbers.

“We know that the Secretary of State has chosen to adopt Ofcom’s advice, so in principle that means dangerous suicide and self-harm content will continue to be accessible to anyone over the age of 18.

“Turning 18 does not make you less vulnerable to this content. Many studies show the part the internet has played in suicides.”

Ms Morrissey highlighted studies which have found suicide-related internet use in 26% of deaths among under-20s, and in 13% of deaths of 20 to 24-year-olds.

“These are all people over 18 who will continue to come into contact with this very dangerous, very harmful content that is not currently covered by the act,” she said.

“So we are missing this really vital opportunity – the Government and Ofcom said they recognised the concerns about small but high-risk services, and stated that the decision on which services should meet these highest measures should be based on evidence.

“Our concern is, why aren’t they listening to that evidence?

“We have heard too often about the role that online forums can play in someone taking their own life. We know that coroners have clearly linked deaths to a particular site.

“So how much more evidence is needed to recognise the level of harm caused by a site, and to ensure it is subject to the strongest measures within the act?

“We are saying it is essential that the Government and Ofcom fully use the powers of the Online Safety Act to ensure people are protected from this dangerous content, and to ensure enforcement is effective as soon as possible.”

Ms Morrissey added that Samaritans believes the level of risk – for example, whether Ofcom or a coroner can “reasonably connect one or more deaths” to a particular site – may be a more appropriate approach to deciding whether a platform should be placed in the highest tier of the rules.

An Ofcom spokesperson said: “From next month, all platforms in scope of the Online Safety Act – including small but risky services – will have to start taking action to protect people of all ages from illegal content, including illegal suicide and self-harm material.

“We will not hesitate to use our strong enforcement powers against sites that fail to comply with these duties – which may include applying to a court to block them in the UK in the most serious cases.

“Additional duties, such as consistently enforcing their terms of service, will be a powerful tool in holding larger platforms to account.

“But they would do little to address the harm caused by smaller, riskier sites – as many of them allow harmful content which is legal.”

A government spokesperson said: “Suicide devastates families, and social media companies have a clear responsibility to keep people safe on their platforms.

“We are determined to keep people safe online by swiftly implementing the Online Safety Act, which will regulate smaller platforms that spread hateful and suicide content.

“Platforms will need to proactively remove illegal suicide material and, where their services are accessible to children, protect them from harmful material – whether they are a categorised service or not.

“We expect Ofcom to use the full extent of its powers – including fines and seeking court approval to block access to sites – if these platforms do not comply.”

Ms Morrissey added that the charity was also “concerned” by the recent rolling back of moderation and fact-checking tools by some platforms, notably Meta, which last month said it would replace third-party fact checks in the United States with user-generated community notes, and would also loosen content moderation on some topics.

“We would hope that what the Online Safety Act does is create a minimum standard, but we would hope platforms go beyond this and apply best practice,” she said.

“We know the act will not cover many types of dangerous content for over-18s, but platforms can choose to cover it themselves – they can choose to make their platforms as safe as possible for their users.

“So we are concerned that we may be seeing some rolling back of progress from platforms recently on keeping people safe online. That puts us, in my view, in a worse position than we were in a year ago.

“The internet can be such a positive place. It can be a really important space for people who are feeling suicidal to access really useful information, and to be able to talk to people who may be experiencing similar things in a safe and supportive environment.

“So tech platforms can lead the way in creating these safe spaces to allow safe conversations to happen.”

Samaritans is available on 116 123 or www.samaritans.org/how-we-can-help/contact-samaritan/
