Instagram is reportedly strengthening its moderation policies. The app is adding a new alert that will warn users who violate its rules when their account is close to being deleted.
The alert will show users a history of the posts, comments, and stories the app has had to remove from their account, along with the reason for each removal. “If you post something that goes against our guidelines again, your account may be deleted,” the page reads.
Users will be able to appeal content removed for violations of policies covering nudity and pornography, bullying and harassment, hate speech, drug sales, and counter-terrorism. And while users have long been able to appeal disabled accounts through the Help Centre, they will soon be able to make those appeals directly in the app.
The change should help clarify for users why they are in trouble and remove the shock of suddenly discovering that an account has vanished. While it’s likely that a great many banned accounts are removed for obvious rule violations, Instagram, like its parent company Facebook, has regularly had moderation problems around nudity and sexuality, with users having photos of breastfeeding or period blood removed.
This update won’t stop those mistakes (those kinds of photos are supposed to be allowed), but it should make appealing the decisions easier.

In addition to the new alert, the app is also giving its moderation team more leeway to ban bad actors. Instagram’s policy has been to ban users who post a certain percentage of violating content; it will now also ban people who repeatedly violate its policies within a set window of time.
The specifics here are as vague as ever, since Instagram doesn’t want to offer details that would let bad actors game the system, but the change sounds like it could lead to fewer problematic accounts slipping through on a technicality.