Instagram censorship continues: new tools make it easier than ever to hide sensitive content

This week, Instagram announced a new “sensitive content” policy — a censorship tool — under the guise of “fostering kindness”.

“Soon you may notice a screen over sensitive photos and videos when you scroll through your feed or visit a profile. While these posts don’t violate our guidelines, someone in the community has reported them and our review team has confirmed they are sensitive.”

This “screen” appears as a blurred photo with a text message about “sensitive content” overlaid on top. This is a massive problem.

Like all of Instagram’s best ideas, this one was stolen from other social networks. Twitter has had essentially the same system in place for years, displaying a warning about sensitive content based on whether the user has self-flagged their account. Other image-sharing platforms like Flickr and 500px also let users self-manage whether their accounts display sensitive or “adult” content and whether they wish to view it. To me, this is the right approach, as it places the burden on the user instead of on a team of Instagram employees who have proven time after time that they are not competent to differentiate types of content.

That raises a question: what are the chances that Instagram will flag these images correctly? Zero. As we’ve seen countless times in the past, Instagram has shamefully removed images that didn’t even violate its terms of service. What kind of images? Images containing period blood, mastectomy scars, breastfeeding, and more. Photos of normal, everyday things that everyday people live with… every day.

How does Instagram define sensitive content?

One main difference between Instagram and Twitter, 500px, and Flickr is that the latter three platforms all explicitly allow nudity, whereas Instagram does not. On those three platforms, defining and flagging sensitive content is easy: it’s a simple yes-or-no proposition. By contrast, Instagram has put itself in a strange in-between place with three categories of content: safe, sensitive, and forbidden. How does Instagram define “sensitive”? We don’t know; they won’t tell us. Instagram even created a new site dedicated to safety, but it makes no mention of what constitutes “sensitive content”. The Instagram Help section is equally useless. It points to the Community Guidelines for further information, though those contain nothing new regarding the “sensitive” designation. There is this, however:

“You may find content you don’t like, but doesn’t violate the Community Guidelines. If that happens, you can unfollow or block the person who posted it.”

So just to recap:

  • Instagram has said that “sensitive content” photos are ones that “don’t violate our guidelines”
  • Instagram encourages unfollowing or blocking — not reporting — when discovering content you don’t like
  • Instagram will censor “sensitive” images when “someone in the community has reported them”

These statements from Instagram are clear as mud. If sensitive photos don’t violate guidelines, why would they be reported? If unfollowing and blocking are the preferred resolutions, why does Instagram specifically call out the ability to report a “sensitive” image? If they now want to encourage reporting instead of unfollowing and blocking, why not change the Community Guidelines to reflect that? As with most of Instagram’s new features, there are more questions than answers. Good luck contacting anyone at Instagram to get those answers.

What avenues for appeal will there be if they get it wrong? We don’t know yet. It’s more likely than not that Instagram’s decisions will be final, with no ability to appeal, but we’ll have to wait for this new tool to be more widely rolled out to see what options, if any, are available to those impacted by it.

What’s painful for many Instagram users is that Instagram appears to be focusing more time, effort, money, and resources on introducing new features of questionable utility while many existing features don’t work as intended, are outright broken, or are riddled with bugs. As I’ve detailed here before, Instagram has a massive issue on its hands with hashtags and shadow bans, not to mention a complete lack of technical support and countless user reports of problems with drafts, two-factor authentication, spontaneous log-outs, and hacked accounts, among others. Instagram shocked many people when they stated outright on their Instagram for Business Facebook Page that they don’t have the resources to fix existing problems. Now is a good time to remind you that their parent company, Facebook, brought in $27 billion last year. Sounds like a lot of resources to me. This isn’t some fly-by-night operation from an indie development studio. And yet Instagram acts like their app is still in beta testing.

Sensitive content is great for bullies

So getting back to sensitive content — you might ask what the big deal is. After all, users can just tap on the blurred photos to reveal them — no harm, no foul, right? But it won’t take long until this new tool is used for body shaming and body policing, directed mostly at women.

The body positive movement has made great strides over the last few years, spreading its message of inclusivity on platforms like Instagram to much success… and a little controversy. For every curvy girl who posts a photo in a crop top or a bikini, there’s another user commenting that it glorifies or promotes obesity. Regardless of your view on this, it’s not hard to imagine a scenario where someone offended by an (otherwise innocuous) image “glorifying obesity” reports that image, and one of Instagram’s censorship police agrees (intentionally or mistakenly) with the report, censoring it from view. If you’ve ever wanted to watch the #bodypositive movement go nuclear, this will be their moment. Accidentally censoring images of fat people is just the start — it won’t be long until this extends to images of disabled people, amputees, eating disorder sufferers, and other images that don’t show a person with perfect skin, ideal weight, all four limbs, and no scars. As seen with the previous examples of period blood, mastectomy scars, and breastfeeding, Instagram has an awful track record and an itchy trigger finger.

Counting down the days until this blows up in Instagram’s face…
