While people around the world use Wikipedia as their go-to information source, there's often no easy way to verify that the content on the platform is accurate. Meta is aiming to address that with new AI software that fact-checks Wikipedia citations.
Meta's AI uses natural language understanding (NLU) techniques to validate Wikipedia citations.
“Automated tools can help identify gibberish or statements that lack citations, but helping human editors determine whether a source actually backs up a claim is a much more complex task — one that requires an AI system’s depth of understanding and analysis,” Meta shared in a blog post announcing the AI.
With over 17,000 new articles added to Wikipedia each month, it's nearly impossible for humans to fact-check every individual source, especially since some articles cite dozens of references. Instead, Meta's AI compares citations against the 134 million web pages in Sphere, an open-source online library.
To train the AI, Meta says it "fed [the] algorithms 4 million claims from Wikipedia, teaching them to zero in on a single source from a vast pool of webpages to validate each statement." The model then assesses the text of a source in chunks and considers the most relevant passage when determining whether to recommend a citation.
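The chunk-and-rank step described above can be sketched in simplified form. The snippet below is purely illustrative, not Meta's actual system: it splits a hypothetical source page into fixed-size passages and scores each against a claim using bag-of-words cosine similarity, whereas Meta's model uses learned neural representations over the Sphere corpus. All function names and the example claim are made up for this sketch.

```python
# Illustrative sketch only: pick the passage of a source that best
# matches a claim, using simple word-overlap cosine similarity.
# (Meta's real system uses neural NLU models, not word counts.)
import math
import re
from collections import Counter


def tokenize(text):
    # Lowercase and extract word-like tokens.
    return re.findall(r"[a-z0-9']+", text.lower())


def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words vectors.
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    denom = math.sqrt(sum(v * v for v in a.values())) * \
            math.sqrt(sum(v * v for v in b.values()))
    return num / denom if denom else 0.0


def best_passage(claim: str, source_text: str, chunk_size: int = 30):
    # Split the source into fixed-size word chunks, score each
    # against the claim, and return the highest-scoring passage.
    words = source_text.split()
    chunks = [" ".join(words[i:i + chunk_size])
              for i in range(0, len(words), chunk_size)]
    claim_vec = Counter(tokenize(claim))
    scored = [(cosine(claim_vec, Counter(tokenize(c))), c) for c in chunks]
    return max(scored, key=lambda s: s[0])


# Hypothetical example, for illustration only.
claim = "The Eiffel Tower is located in Paris."
source = ("The Eiffel Tower is a wrought-iron lattice tower in Paris, "
          "France. It was designed by Gustave Eiffel's company. "
          + "Unrelated filler text about something else entirely. " * 10)
score, passage = best_passage(claim, source)
```

In a real citation-verification pipeline, the similarity score would feed a classifier that decides whether the passage actually supports the claim, rather than just measuring topical overlap as this toy version does.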
The project is still currently in the research phase and is not yet being used to update Wikipedia articles.
In other tech news, Twitter employees were reportedly warned against tweeting or discussing matters regarding the Elon Musk deal.