
Instagram will now warn users who violate its community guidelines when their account is close to being disabled. The Facebook-owned company has also updated its policy for removing accounts that post violating content. “These changes will help us quickly detect and remove accounts that repeatedly violate our policies,” Instagram said in a blog post.
Under the new policy, Instagram will also remove accounts that accumulate a certain number of violations within a set window of time. Until now, Instagram has disabled accounts only when a certain percentage of their content violates its rules. The company says the new policy is aimed at holding people “accountable for what they post on Instagram”.
Instagram has also rolled out a new notification process to warn users whose account is at risk of being disabled. The notification displays a list of the user’s content, including posts, comments, and Stories, that was removed for violating Instagram’s Community Guidelines, along with a warning that the account may be deleted.
More importantly, it gives users a chance to appeal the removal of content they posted in the past in violation of the company’s policies on nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism. Posts found to have been removed in error will be restored, Instagram says.
Earlier this month, Instagram rolled out a feature to curb online bullying on its platform that uses artificial intelligence (AI) to notify people that their comment may be offensive before it is posted, giving them a chance to undo it. The company will also start testing a feature called Restrict, which lets users limit interactions from a bully without blocking them.