YouTube’s previous approach to toxic comments was to warn people before a message was posted that it could be abusive. The new feature goes a step further, alerting users when YouTube has detected and removed some of their comments for violating its community guidelines. For those who ignore these warnings and continue to leave multiple abusive comments, YouTube can issue a timeout, suspending their ability to comment for 24 hours. Warnings and a 24-hour suspension might not seem like big deterrents, but YouTube community manager Rob writes that testing has shown the combination reduces the likelihood of users leaving violative comments again.
The new feature is currently only available for English comments; YouTube hopes to bring it to more languages in the coming months. “Our goal is to both protect creators from users trying to negatively impact the community via comments, as well as offer more transparency to users who may have had comments removed due to policy violations and hopefully help them understand our Community Guidelines,” the company said.

YouTube admits its systems don’t always get it right, so anyone who feels they received an unwarranted warning can submit feedback to flag the mistake. There’s no mention of reinstating comments that were removed in error, though.

Spam is another big problem on the site. YouTube says it has been improving its automated detection systems and machine learning models to identify and remove spammy posts, removing over 1.1 billion spam comments in the first six months of 2022. Bots invading live chats are a further area YouTube is targeting, thanks to improved spambot detection.