Have You Heard of Instagram's Growing 'Instasex' Problem?


"Whenever you have a critical mass on the internet, the sex shows up."

Whether it be of landscapes, a group of friends or your latest meal, taking snapshots and applying a vintage-style filter has become exceedingly popular with Instagram. But these G and PG images aren't all that's being posted by the more than 80 million iPhone and Android owners using the photo-sharing app -- some are rated X.

The app, which Facebook purchased earlier this year for $1 billion, has already come under fire for hosting "thinspiration" photos, which promote eating disorders and unhealthy body image. In April, the site banned such images, saying it wanted to provide a "positive and healthy community."

But more nefarious tags are still in use, their photos treated with Instagram's trademark filters that "transform [a photo's] look and feel." The Huffington Post reports that some photos on the site are labeled "sextagram," "instaporn," "handbra" and "instasex." In fact, while "latte" may have 135,000 photos associated with it, "instasex" has 201,000.

From pornographic photos posted through the app to actual hookups facilitated by it, the problem has grown with the service. The Huffington Post quotes New York University professor Terri Senft as saying the site's recent growth -- it has added more than 70 million users in the last year -- indicates a changing community:

“Instagram has moved from a niche thing to something people have heard about, and that means it has a critical mass. Whenever you have a critical mass on the internet, the sex shows up,” said Senft, a professor specializing in global media at New York University’s Department of Liberal Studies. “If something bills itself as non-pornographic then becomes that way, to me it’s a sign that it’s reached the public knowledge-base, and now it’s solidly there.”

Part of the problem is not that Instagram condones these images -- nude and suggestive photos are banned by its usage policy -- but how the company goes about policing them. In a recent blog post, Instagram said "we rely on the community to bring to our attention activity that violates our Community Guidelines." If the community doesn't report the photos, they remain visible.

Sarah Perez at TechCrunch noted earlier this year that because Instagram is sometimes considered an "artistic" venue, flagging inappropriate images becomes even more problematic:

When is a photo art, versus something encouraging a disease?


But I find it funny that the services are taking the time to worry about the sad, disturbed kids cutting and starving themselves, and yet, aren’t worried all that much about the fact that they’re hosting teens’ posts and photos alongside some very, very adult content. At least some porn sites have the decency to make kids do a little “what year were you born” math before seeing this kind of stuff.

Just as the social news site reddit had to crack down on pornography and child pornography shared on its site last year, the Huffington Post noted that parents are concerned about Instagram's seedier side as well:

Parents, in particular, could have cause for concern: Users as young as 13 years old can create Instagram accounts, and an Instagram username is all that is necessary to tap into the app’s “sextagram” underbelly. Michael Sheehan, author of the HighTechDad blog, recounted how a friend of his elementary school-aged daughter was contacted by an Instagram user who asked her to chat with her on Kik. Once they were chatting, the individual “asked to see this child’s privates,” Sheehan wrote.

Even with age restrictions and other measures taken by sites like Instagram, reddit, Facebook and Tumblr, Perez says parents need to realize the sites are "not that concerned about what kids see on their site."
