According to a report by the Financial Times, Google has been working on a tool to help smaller businesses, such as start-ups, moderate extremist material when they lack the resources to do so.
The in-house project comes from Google's Jigsaw division, which is tasked with challenging threats to open societies, and is being developed in conjunction with the UN-backed Tech Against Terrorism.
Google says the initiative is designed to help moderators detect and remove potentially illegal content, including racist and other hateful comments, from a website.
The project has been made possible by a database of terrorist material provided by the Global Internet Forum, founded by a collection of tech giants including Google, Meta, Microsoft, and Twitter.
It’s designed specifically to support smaller companies that are unable to afford the resources needed for effective moderation, be it large teams of workers or expensive AI tools.
The tool is expected to prove valuable at a time when extremists banned from major networks are turning to smaller platforms to express their views. It also serves as a protective measure for companies responding to the EU’s Digital Services Act and the UK’s upcoming Online Safety Bill, both of which will impose penalties on companies that fail to remove such content.
For now, it seems that the tool will operate on an opt-in basis, meaning that businesses whose primary intention is to harbour this kind of content will continue to do so even in the face of potential fines.
It is believed that two (unnamed) companies will test the tool later this year, indicating that a full roll-out is still some time away.
Elsewhere, Meta has rolled out its own tool, which it calls Hasher-Matcher-Actioner (HMA). Like Jigsaw’s project, it has been designed to prevent the spread of hateful content, and it builds on the platform’s existing video and photo moderation tools.