Parler Is Set for a Triumphant Return to Apple's App Store


Parler is back.

On the Apple App Store, that is.

Or, it will be.

You’ll no doubt recall a Big Tech move against the social media platform following January 6th’s chaos at the Capitol.


On the 9th, Apple removed the app from its store — despite it being the #1 download.

Google yanked it from Google Play, and it was axed from Amazon’s servers.

Parler secured new servers in February, with prominent shareholder Dan Bongino celebrating its survival:

“Cancel culture came for us, and hit us with all they had. Yet we couldn’t be kept down. We’re back, and we’re ready to resume the struggle for freedom of expression, data sovereignty, and civil discourse. We thank our users for their loyalty during this incredibly challenging time.”

Along the path of that return, last month, congressional Republicans issued a letter.

Penned by Sen. Mike Lee (UT) and Rep. Ken Buck (CO), the March 31st missive to Google, Apple, and Amazon read (in part) thusly:

In just three days, Apple and Google effectively cut off Parler’s primary distribution channel, and Amazon cut off Parler’s access to critical computing services, leaving the company completely unable to serve its 15 million users. These actions were against a company that is not alleged to have violated any law. In fact, information provided by Parler to the House Oversight Committee revealed that Parler was assisting law enforcement even in advance of January 6th.


Thirty-three grilling questions followed.

Among them:

  • Please provide the specific provisions of your policies resulting, where applicable, in suspension or expulsion from distribution channels (Apple App Store and Google Play Store) or termination of cloud service.
  • Provide a complete history of all changes to policies that govern requirements for content moderation, including changes to definitions of what is acceptable and prohibited speech or conduct.
  • How many businesses were reviewed in 2020? Of the businesses reviewed in 2020, how many were reviewed because of content moderation practices?
  • How many were terminated?
  • What triggers the review process? Are outside inputs such as news reports used in decision making?
  • Who is involved during the review process? Is the process independent, or are all individuals participating in the review employees of the company?
  • Is there an appeal process for businesses notified of suspension? If yes, please describe.
  • List all businesses terminated/removed since 2017 as a result of content moderation policy violations, the date of their first notice and final termination/removal. How many of these businesses were in social media?
  • Was Parler given notice of the potential violation? Was the same amount of time offered to Parler to cure any potential policy violations as is given to other potential violators?
  • Who determined the amount of time, if any, provided for Parler to take remediation measures?
  • What was the basis for suspension or removal given to Parler in the initial notice?
  • Were there any contacts between any of your companies prior to the action against Parler? If so, with whom?

On Monday, Apple responded.

The tech giant asserted it “wants to provide a safe experience for users to get apps and a great opportunity for all developers to be successful.”

“Apple does this in part by curating the App Store,” it said, “including by reviewing apps to ensure compliance with all App Store Review Guidelines, which among other things set forth standards for privacy, safety, security, and performance of apps in the App Store.”

Apple claimed it reviews over “100,000 submissions per week, and [the company] rejects about 40% of them due to various Guidelines compliance issues.”

Before removing the app, Apple declared, it had communicated with Parler “regarding failures in [the site’s] content moderation efforts, as well as its desire stated at various times to not moderate content at all.”

Apps, its guidelines state, “should not include offensive or discriminatory content, including that which is likely to humiliate, intimidate, or harm a targeted individual or group.”


Apple specified it requires apps “with user-generated content” to provide the following:

  • a method for filtering objectionable material from being posted to the app;
  • a mechanism to report offensive content and timely responses to concerns;
  • the ability to block abusive users from the service; and
  • published contact information so users can easily reach the developer.

Apple relayed it had determined Parler was in violation of some of its rules. Hence, on January 8th, it told the app to 86 certain content.

Per the explainer, Parler needed to follow up within 24 hours. But it “did not communicate a sufficient plan to improve its moderation of user-generated content in the app.”

Hence the booting.

Following several reported conversations between the two companies, Apple approved Parler's return to the App Store on April 14th.

That means the app will be available as soon as Parler re-releases it.

As for coordinating with Amazon or Google, Apple denies any such collusion.

In January, some surely thought Parler had met its demise. Perhaps others assumed it would need to find some Apple-alternative path back.

In an age of swelling censorship, either of those may have easily turned out to be true.

But for now, it looks like Parler is on the rise.

It’s set to bear fruit — and with the help, or at least compliance, of Apple.


-ALEX

 

See more pieces from me:

Governor Matthew McConaughey? A New Poll Shines a Favorable Light

Advocates Speak out in Favor of Legalizing Consensual Incest

Christian College Sues the Biden Administration Over Joe’s Gender Identity Executive Order

Find all my RedState work here.

Thank you for reading! Please sound off in the Comments section below. 


