Content moderation is stressful and can even be harmful. Supporting humans with AI offers a new approach
Data creation has exploded in the 21st century. There's a camera and recording device in every pocket, making it increasingly difficult for human moderators to stay on top of the flood of user-generated content - especially when some people purposefully post illegal or harmful images and video.
"When we consider that it's been estimated that it would take someone 950 years to check all of the Snaps uploaded to Snapchat every 24 hours, it's obvious that companies cannot moderate this volume of images using human power alone," says Cris Pikes, CEO and co-founder of Image Analyzer.
Even massive social media firms like Facebook, which outsource content moderation, struggle to keep up with the growth in harmful, extremist and false content. It has reached the point where moderators are starting to sue over burnout and even post-traumatic stress.
Organisations also face increasing pressure from draft legislation, like the UK's Online Safety Bill and the EU's Digital Services Act. These require firms to take down illegal content quickly, with large financial penalties for failure - up to 10 per cent of global annual turnover.
However, there may be a solution, in the form of artificial intelligence. Pikes - whose company won the Best Emerging Technology in AI Award at this year's AI & Machine Learning Awards - says, "Artificial intelligence can be used at the right point in a digital platform's workstream to remove the majority of harmful content before it reaches the platform. This aids compliance with impending legislation and leaves only the more nuanced content for human moderators to review."
Pikes explains, "IAVIS [Image Analyzer Visual Intelligence System] gives each piece of content a risk probability score, speeds the review of posts, and reduces the moderation queue by 90 per cent or more… IAVIS can scale to moderate increasing volumes of visual content, without impacting performance or user experience."
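The workflow Pikes describes, where each item gets a risk probability score, the clearly harmful majority is removed automatically, and only nuanced cases reach human reviewers, amounts to threshold-based triage. The sketch below illustrates the idea only; the thresholds, class, and function names are assumptions for illustration, not Image Analyzer's actual API:

```python
# Illustrative sketch of risk-score triage. The thresholds and names here
# are assumptions, not IAVIS internals.
from dataclasses import dataclass

BLOCK_THRESHOLD = 0.9   # assumed cut-off: auto-remove clearly harmful content
REVIEW_THRESHOLD = 0.4  # assumed cut-off: queue nuanced cases for humans


@dataclass
class Post:
    post_id: str
    risk_score: float  # probability-style score in [0, 1]


def triage(posts):
    """Split posts into auto-blocked, human-review and auto-approved queues."""
    blocked, review, approved = [], [], []
    for post in posts:
        if post.risk_score >= BLOCK_THRESHOLD:
            blocked.append(post)       # removed before it reaches the platform
        elif post.risk_score >= REVIEW_THRESHOLD:
            review.append(post)        # the smaller queue human moderators see
        else:
            approved.append(post)      # published without human review
    return blocked, review, approved


if __name__ == "__main__":
    posts = [Post("a", 0.97), Post("b", 0.55), Post("c", 0.05)]
    blocked, review, approved = triage(posts)
    print(len(blocked), len(review), len(approved))  # 1 1 1
```

Because only the middle band goes to people, the human queue shrinks in proportion to how much content the model can score confidently at either extreme, which is how a "90 per cent or more" queue reduction becomes plausible.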
Social media moderation is one key area for the technology, but the Awards judges called IAVIS "A great use of AI to resolve a problem that affects all sectors and all organisations," which shows in its adoption. As well as social media, Image Analyzer's partners are using the solution in online communities and gaming platforms to protect children; in corporations to maintain safe workplaces; and in digital forensics teams, to identify digital evidence hidden inside thousands of images, messages and unstructured data stored on networks and electronic devices.
The team was "absolutely delighted" when they heard they had won, and Pikes describes it as "a huge endorsement" of the technology.
He adds, "Being described as an award-winner is a real morale booster for our team, as it shows that their work is recognised within their industry. Being known as a Computing AI and Machine Learning Award winner provides our customers, partners and prospects with solid third-party endorsement of our technology, because they know that this has been judged by a team of experienced IT professionals who know what they're talking about."
While Image Analyzer is very happy with its recent performance, it realises the work isn't finished, especially as new legislation looms. Many organisations outside the UK and EU don't realise these laws will apply to them, and Pikes says education is the company's upcoming focus. Image Analyzer has commissioned two whitepapers, written by lawyers at Bird & Bird, which clarify which organisations will need to comply with the impending laws and which tier of compliance they will fall into, so they can understand the steps they may need to take before the new laws come into force.
"Our priorities will be assisting existing and future customers in understanding the impending legal landscape in the UK and Europe and working with our customers and OEM partners to help them to put the systems in place to be able to maintain a safe online environment for their users and employees."