Scammers are abusing Apple's App Store, with some flooding the digital marketplace with bogus and buggy ChatGPT apps.
Apple's App Store is intended as a digital storefront that vets apps before they are provided to Mac users as a download. While Apple's policies and checks do catch most bad actors, it seems that the influx of AI chat apps has surfaced some who are abusing the system.
A weekend report by Alex Kleber on the "Dark Side of the Mac App Store" reveals the results of a month-long investigation into a group of apps claiming to provide ChatGPT-style services.
Searching with keywords such as "OpenAI" and "ChatGPT" brings up many apps that use identical or very similar titles and logos to those of OpenAI, the company behind ChatGPT. By reusing similar colors, icons, and OpenAI logos, they give the appearance of being legitimate, yet are not officially linked to OpenAI at all.
"Most of these apps are nothing but cheap imitations or outright scams that fail to deliver on their promises," writes Kleber. "These scams not only deceive users but also tarnish the reputation of legitimate developers and hinder the growth of the app ecosystem on the macOS platform."
Through misleading marketing tricks and deftly chosen keywords, some of these apps have become among the most-downloaded apps in the entire Mac App Store.
Kleber insists Apple should take a "stricter stance" against the apps, which are seemingly passing through the App Review process without issue. In some cases, Kleber spotted approvals by the App Review team as recently as Friday, while still using the OpenAI icon and colors.
Many apps, same creator
In some cases, the apps are extremely similar, with only minor changes made so that each is just different enough not to be a direct clone.
Apps by Pixelsbay and ParallelWorld developer accounts were both found to be running from the same registered address in Pakistan, with the apps sharing 99% of the same code with "slight modifications." Furthermore, the developers even used the same paywall style for both apps, and without a close button in sight.
"This behavior of not providing a close button to the paywalls is highly unethical and can be considered a scam," Kleber asserts. "It puts the users in a frustrating situation where they are forced to either subscribe or forcibly quit the application to regain control of their device."
Digging deeper, it is believed the two apps are linked to another company called Katco, again based in Pakistan and at the same address. References to an email address also indicate the two apps are connected to Katco.
Kleber believes the apps are part of a "larger operation aimed at exploiting the popularity" of AI chatbot apps. "It's alarming to think that such sophisticated and well-coordinated scams can be perpetuated on the MacOS App Store with little to no oversight," they added.
In another discovery, Kleber found one individual who was using eight different developer accounts on the Mac App Store, again for the purposes of spamming the storefront with extremely similar apps.
Scam review tactics
It is also believed that the apps are using abusive tactics to garner positive reviews, which help push an app up the App Store rankings. In the ParallelWorld app's case, it received more than 175 reviews in a 24-hour period, with 63 from the US Mac App Store.
In this instance, the app prompted users to leave a review immediately after subscribing, and again every time the user made an OpenAI request, without giving them time to actually use the app.
This technique is one that is actually prohibited in the App Review Guidelines and SKStoreReview documentation.
Kleber insists that the activity of scamming developers "creates an unfair and competitive environment for legitimate developers who follow the App Store guidelines, and it goes against the principles of fair competition that Apple strives to uphold."
"By abusing the system, the individuals were creating confusion and clutter on the App Store, making it difficult for users to identify legitimate apps and eroding the trust that users have in the platform."
Apple has a "responsibility" to maintain standards for apps, and to keep a level playing field for all developers, Kleber concludes.
A continuing problem
Scam apps remain an ongoing issue for Apple, with unsavory apps managing to slip through the company's checks as their developers come up with new strategies to thwart the system.
In February, con artists were using a so-called "pig butchering" scam to sneak apps into the App Store. First, the app poses as legitimate to reviewers, but after approval and appearing in the App Store, a domain change delivers a fake interface, as part of a sophisticated long-con fraud.
On Apple's side, it is working to stop fraud apps from slipping through. In 2022, it said safety mechanisms stopped nearly $1.5 billion in potentially fraudulent transactions and kept 1.6 million "problematic apps" away from users.
Comments
In the absence of a fully bulletproof app review process (as that is not possible), mitigation must be the goal.
In these particular cases more can be done, that's for sure, and some of the checks needed to reduce the flow of bad apps are quite easy to implement. I'm sure Apple will tighten a few nuts and bolts.
The wider problem of coercion via non-closeable screens etc would be greatly helped via legislation. I wonder if the latest EU consumer protections cover this kind of behaviour. In the absence of that though, I see no reason why it can't be reason enough for the app to be pulled from the App Store even if there is no formal legal complaint filed as a result.
Yet we are always hearing about legitimate developers who try over and over to get their app approved, rejected for seemingly nitpicking features or functions. There was an article not long ago about a reputable developer who finally gave up getting their product approved for the App Store because of the constant rejections. But we have scammers who are apparently gifted in getting their crap approved.
What does that say about Apple’s ‘review’ process? It says it’s bullshit from top to bottom.
Using the term "AI" is already a scam. There's no intelligence involved whatsoever. "ESCP" (extremely sophisticated copy and paste) would be far less scammy.
As soon as I started hearing people who know very little about technology talking about ChatGPT, I knew the scammers were already hard at work to exploit them. I have no doubt there are already investment scams out there using it as a way to get people to invest in "the next big thing". It's the new cryptocurrency.