Tech firms must tame toxic algorithms to protect children online
Abstract
The article discusses Ofcom's plans to introduce new regulations and guidelines to protect children online. It outlines over 40 practical measures that online services must take, including:
- Implementing robust age-checks to prevent children from accessing harmful content
- Ensuring that recommendation algorithms do not promote harmful content to children
- Improving content moderation to quickly remove content that is harmful to children
The article also notes that Ofcom is consulting on the proposed measures and that, once the Codes are approved, online services will have three months to conduct children's risk assessments and implement the required safety measures. Failure to comply can result in enforcement action, including fines.
Q&A
[01] Ofcom's Proposed Measures
1. What are the key measures Ofcom is proposing to protect children online?
- Require online services to implement robust age-checks to prevent children from accessing harmful content such as pornography and material relating to suicide and self-harm
- Mandate that recommendation algorithms filter out the most harmful content and reduce the visibility of other harmful content for child users
- Expect online services to have effective content moderation systems to quickly remove content that is harmful to children
2. How do the proposed measures go beyond current industry practices?
- The draft Codes of Practice set out much more stringent requirements than current industry practice, demanding a "step-change" in how online services protect children.
- For example, the requirement for highly effective age-assurance checks and the need to configure recommendation algorithms to protect children go well beyond what most platforms currently do.
3. What enforcement powers will Ofcom have to hold platforms accountable?
- Ofcom states it will not hesitate to use its full range of enforcement powers, including issuing sizeable fines, against platforms that fail to meet their legal duties under the new regulations.
- The article notes that once the Codes are approved by Parliament, Ofcom can begin enforcing the regime against non-compliant companies.
[02] Consultation and Implementation
1. What is the timeline for finalizing and implementing the new regulations?
- Ofcom is currently consulting on the draft Codes of Practice, with a deadline of July 17 for submissions.
- Ofcom expects to publish the final Children's Safety Codes of Practice within a year, after considering the consultation feedback.
- Once the Codes are approved by Parliament, online services will have three months to conduct their required children's risk assessments and implement the safety measures.
2. How are children's voices being incorporated into the process?
- Ofcom states that children's voices have been central to designing the Codes, with input from more than 15,000 young people during the past 12 months.
- As part of the consultation, Ofcom is also holding focused discussions with children across the UK to get their views on the proposed measures.