Getty Images / Justin Sullivan

Instagram Will Soon Allow Users To Filter Out Hateful Comments

The new security feature is an effort to decrease harassment online.

By Lindsey Pulse | July 31, 2016

Instagram will soon launch a new filter for its estimated 500 million monthly active users to combat harassment online.

The new security feature will reportedly allow users to screen comments that appear on their photos and videos and even let them disable comments completely.

Of course, Instagram already has policies in place to flag certain words and phrases, but the new feature will give users the ability to control what appears on individual accounts.


An Instagram exec told The Washington Post, "Our goal is to make Instagram a friendly, fun and, most importantly, safe place for self expression."

SEE MORE: Beyhive Swarms Rachel Roy After Cryptic 'Good Hair' Instagram

Unfortunately, the new feature isn't available for everyone yet. The Post reports Instagram is rolling out the security measure for "high volume comment threads." Eventually, Instagram hopes to introduce the feature to the broader public.

The Pew Research Center estimates that 73 percent of adult internet users have witnessed online harassment, and 40 percent have experienced it personally.

This video includes clips from Instagram and images from Getty Images. Music provided courtesy of APM Music.
