Ofcom is due to gain new regulatory responsibilities later this year under the Online Safety Bill, which aims to help keep users of online services safe.
The bill, which is still going through the House of Lords, is expected to include 'priority' criminal offences that regulated services must consider as part of their overall safety duties.
In preparation for this, the Accelerated Capability Environment (ACE) has been supporting Ofcom to build an evidence base, developing and widening its understanding of key areas, including how user-generated fraud and illegal harms are carried out and how they can potentially be mitigated.
This detailed understanding will be used to help inform future policymaking, including the implementation of a new framework to ensure online platforms have appropriate systems and processes in place to improve the safety of their users.
Ofcom has an established relationship with ACE and approached our Futures and Insight team, drawing on its experienced analysts and wider sector expertise to undertake deep dives into each subject.
Futures and Insight analysts draw on the best knowledge across domains, academia and industry to build an interview list - and related insights - deeply relevant to the question at hand. This includes subject matter experts with backgrounds in relevant policy, technology, or trust and safety roles at social media platforms. These insights help build a robust picture of what is being done at individual platform level, how effective those measures are, and what could be improved.
Ofcom has so far published two reports created by ACE: one on user-generated content (UGC)-enabled frauds and scams, and a snapshot of efforts to mitigate illegal harms.
Helping protect users
A global debate has emerged in recent years around the risks faced by internet users and how they can be better protected from harmful content. One of the most common risks encountered online is fraud.
ACE's Futures and Insight analysts conducted interviews with representatives from 15 tech platforms, as well as people involved in anti-fraud mitigations in other fields such as online banking. The result was a snapshot report of the ways in which platforms are seeking to mitigate the risks associated with UGC-enabled frauds and scams.
The report includes evidence and detail about how platforms capture fraud indicators, including through monitoring technology. It also found that many platforms are interested in exploring new approaches, including greater cross-industry collaboration, which would require regulatory guidance on sharing data.
The second short paper addresses another important concern: how illegal drug and weapon sales, offers to facilitate illegal immigration, and the promotion of suicide and self-harm have increasingly developed an online component in recent years.
This paper, based on primary research supplemented by desk research, aims to provide a snapshot of the measures being taken to detect or prevent the sale or promotion of these illegal harms, along with additional insights into the technologies used, the costs of implementation and the barriers to further mitigation efforts. Those barriers included a widespread lack of awareness about age verification requirements, gaps in livestream detection technology, and the sheer scale of the problem.
Ofcom intends to publish its first consultation document on these harms soon after the Online Safety Bill receives Royal Assent and its new powers commence.