Apple has suspended Parler from the App Store after the app failed to take “adequate measures to address the proliferation” of “threats to people’s safety.” The company said in a statement:
“We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity. Parler has not taken adequate measures to address the proliferation of these threats to people’s safety. We have suspended Parler from the App Store until they resolve these issues.”
According to MacRumors, Apple first sent Parler a letter on Friday morning, stating that it had received “numerous complaints regarding objectionable content” and accusing the app of being “used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021.”
Apple said there is “no place on our platform for threats of violence and illegal activity,” and unless the issue is resolved the app will remain unavailable on the App Store.
Here is the letter from Apple to Parler:
To the developers of the Parler app,
Thank you for your response regarding dangerous and harmful content on Parler. We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app.
Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines.
In your response, you referenced that Parler has been taking this content “very seriously for weeks.” However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 – Safety – Objectionable Content.
Your response also references a moderation plan “for the time being,” which does not meet the ongoing requirements in Guideline 1.2 – Safety – User Generated content. While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary “task force” is not a sufficient response given the widespread proliferation of harmful content.
For these reasons, your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.
App Review Board