Instagram is celebrating its milestone of 800 million users on the platform, 500 million of them daily, with new comment moderation tools that make it easier to silence potentially offensive commenters. If you have a public account on Instagram, you can now limit comments to specific groups of people, such as your followers or the people you follow. Both public and private account holders will also be able to block specific accounts from commenting on their posts.
The update is part of an ongoing effort to curb abuse on the platform. Instagram is also expanding its automated filter for especially offensive comments, which now covers Arabic, French, German and Portuguese-language content (it was English-only at launch). Because the filter is powered by machine learning, the company says it will continue to improve with time and use.
The company also expanded a feature it launched in May that connects users struggling with mental illness to support resources. If you witness someone in distress during a live broadcast on Instagram, you can now report it anonymously, and Instagram will send the broadcaster a message offering support, including access to a helpline and other resources. The program originally focused on standard posts, but it now extends to live video, too.