Meta's new chatbot BlenderBot 3 appears to be turning on its own creator, according to the BBC. In a recent exchange between the AI and the British broadcaster, the chatbot was asked about Meta CEO Mark Zuckerberg, to which it answered: “He did a terrible job at testifying before Congress. It makes me concerned about our country.” It went on to add: “Our country is divided, and he didn’t help with that at all… His company exploits people for money and he doesn’t care. It needs to stop!”
Of course, BlenderBot 3’s algorithm forms opinions and makes statements by scraping the internet for similar discussions on a given topic, so it may simply have surfaced a series of comments biased against Meta when responding to the BBC. Meta itself reminds users that the chatbot is susceptible to making errors.
“Everyone who uses Blender Bot is required to acknowledge they understand it’s for research and entertainment purposes only, that it can make untrue or offensive statements, and that they agree to not intentionally trigger the bot to make offensive statements,” a Meta spokesperson explained.
In related news, the tech giant has been found to track its Facebook and Instagram users by injecting code into websites visited through the apps’ in-app browsers.