Apple has removed the 500px app for allowing porn

Is Apple overreacting by pulling 500px?

Almost everyone is familiar with Instagram, but another photo app, 500px, was removed from the App Store because it was allegedly too easy to view nude images through it. The app had been available for 16 months before Apple pulled it on Monday night, following a review. Many users are stunned that Apple would do this, even though Apple claimed there had been reports of possible child pornography being shared through the app.


While it was possible to view nude images, the developers had enabled a default search setting that blocked them, though users could disable it. They also relied on the community to flag images deemed inappropriate. Apple, however, judged these measures insufficient and pulled the app, despite 500px having nearly 1,000,000 downloads.

Personally, I believe Apple made the wrong decision. It is much easier to simply search for nude images in a web browser than to use an app. Apple's own browser can display nude images, and other apps can view this kind of content as well. Moreover, the accusation of child pornography surfaced only after the app had already been pulled.

The app was probably pulled because it was perceived to have an improper content rating. Apple assigns content ratings to give users an idea of what kind of content an app may contain. A rating of 4+ indicates there is no inappropriate content. A 9+ rating indicates there may be cartoon violence, and 12+ means there might be mild sexual or violent content. A 17+ rating means the app may contain heavy violence and mature themes.

However, this rating system is unclear. Tumblr, another popular app, has a 4+ rating even though pornographic images can be found on it. Other apps, such as Chrome, carry a 17+ rating simply because they can view any website (as long as it doesn't use Flash). 500px had a 4+ rating.

Clarifying the guidelines:

This system should be clarified. There is a difference between an app's intended uses and its unintended ones. It is Apple that should be held responsible for ensuring not only that apps are fairly judged but also that its own systems are free of these kinds of images.

In response, the developers of 500px have reworked the app and resubmitted it in hopes of returning to the App Store. Hopefully for the developers, the app will be reinstated. Apple may want to reconsider how it goes about removing apps if it wants to retain developer loyalty.

Matthew Mallicoat

I am a technical writer, deeply involved in the world of technology and a lover of all things electronic. I love everything from PCs and video gaming to mobile devices, jailbreaking, and more.

  • Val

There are TONS of nudes on Instagram. I accidentally discovered them when I called a friend a “#Pussy” and tapped the hashtag out of curiosity. There are probably 100 times more nudes on Instagram than 500px had, just due to the sheer volume of photos posted there. How come we don’t hear about Apple pulling Instagram?

    *Edit: I just searched some obvious hashtags on IG & it seems they’re now blocked. They’re probably still being posted, just harder to find.