The Tim Cook-X CSAM Brotherhood
Caroline Haskins, writing for Wired:
Elon Musk’s AI chatbot Grok is being used to flood X with thousands of sexualized images of adults and apparent minors wearing minimal clothing. Some of this content appears to not only violate X’s own policies, which prohibit sharing illegal content such as child sexual abuse material (CSAM), but may also violate the guidelines of Apple’s App Store and the Google Play store.
Over the past two years, Apple and Google removed a number of “nudify” and AI image-generation apps after investigations by the BBC and 404 Media found they were being advertised or used to effectively turn ordinary photos into explicit images of women without their consent.
It’s just preposterous to assume that allowing CSAM on the App Store or Google Play Store is a “concession” to the Trump regime in return for favorable regulatory treatment. This isn’t a golden gift display at the Oval Office; in fact, it has nothing to do with the president at all. Any suggestion that Apple and Google are keeping a CSAM generator (Grok) and a CSAM publication platform (X) on the App Store and Google Play Store for regulatory benefit is complete nonsense.
I don’t know, nor do I particularly care, about the Google Play Store, since Android is known to be a cesspool of pornographic, and probably illegal, apps, many of which can be sideloaded. But I do know that there’s been an affair between Apple and X since 2022, when Musk took over the company formerly known as Twitter. When Musk complained about the 30 percent In-App Purchase fee on the App Store, Tim Cook, Apple’s chief executive, personally invited Musk to Apple Park to sort things out. The result was that Apple continued advertising on X even as practically every other major advertiser left the platform over Musk’s lax approach to content moderation. It took Musk publicly agreeing with an antisemite for Apple to pause its advertising on X at all, and even that pause ended last year.
My takeaway is that CSAM remains on the App Store today because Apple’s leadership clearly wants X’s business, irrespective of where that business comes from. When people sign up for X Premium to get access to the newly paywalled CSAM generation feature, Apple wants 30 percent of that revenue. The only logical conclusion is that Apple doesn’t care if that revenue comes from child abusers and pedophiles. It’s quid pro quo: Cook promised Musk that X would be welcome on the App Store as long as it keeps paying the 30 percent commission, and it doesn’t matter that Apple’s advertisements for the new iPhone show up alongside CSAM. Ultimately, protecting children and punishing rapists is bad for Apple’s bottom line, because every purchase those rapists make carries a 30 percent commission for Apple.
Elizabeth Lopatto at The Verge never minces words, but I think her headline calling Cook and Sundar Pichai, Google’s chief executive, “cowards” is insufficient. They aren’t cowards. They’re not doing this because they fear retaliation; it’s child sexual abuse material, and who could credibly retaliate against a company for removing it? Cook and Pichai keep X and Grok on their app stores because doing business with child rapists is profitable. They’re not keeping CSAM on their platforms because they’re scared of something; they’re proud that it makes them money. Prove me wrong.