
It looks like Samsung is cheating on 'space zoom' moon photos

Samsung S23 Ultra


Moon photos taken with the "Space Zoom" of Samsung's flagship smartphone models appear to be more a feat of AI trickery than anything else, a Reddit user's investigation into the feature claims.

Samsung's flagship Galaxy smartphone lineup, including the Galaxy S23 Ultra, has an extremely high level of zoom for the rear cameras. With a 100x zoom level, created by augmenting 3x and 10x telephoto cameras with a digital zoom aided by Samsung's AI Super Resolution technology, it can capture shots of things very far away.

That so-called Space Zoom could allow users to photograph the moon, and many do. However, the level of detail in those moon shots may owe more to software shenanigans than to optics.

In Friday's post to the Android subreddit, "u/ibreakphotos" declared that Samsung's Space Zoom "moon shots are fake," and that they had proof. The lengthy post then lays out the evidence in a fairly convincing way.

Referring to previous reporting that moon photographs from the S20 Ultra and later models are real and not faked, the Redditor points out that no one had managed to prove them real or fake until their post.

The user tested the effect by downloading a high-resolution image of the moon, downsizing it to 170 by 170 pixels, then applying a Gaussian blur to obliterate any remaining surface detail.
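The preparation step described above can be sketched in a few lines. This is a minimal illustration, assuming the moon photo is loaded as a grayscale NumPy array; the function names and the nearest-neighbor downscale are illustrative choices, not the Redditor's actual tooling.

```python
import numpy as np

def downscale(img, size):
    """Nearest-neighbor downscale of a 2D grayscale array to size x size."""
    h, w = img.shape
    ys = np.arange(size) * h // size  # sampled row indices
    xs = np.arange(size) * w // size  # sampled column indices
    return img[np.ix_(ys, xs)]

def gaussian_blur(img, sigma=4.0):
    """Separable Gaussian blur: convolve rows, then columns, with a 1D kernel."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    return out

# Simulate the test: a stand-in "high-res moon" reduced to a 170x170 blur.
high_res = np.random.rand(1024, 1024)
doctored = gaussian_blur(downscale(high_res, 170))
```

The point of the blur is that any crater detail the camera later "recovers" cannot have come from the source image, since it was destroyed before being displayed on the monitor.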

They then displayed the low-res, blurry moon full-screen on their monitor, walked to the other end of the room, zoomed in on the fake celestial body, and took a photograph. After some processing, the smartphone produced an image of the moon whose surface showed considerably more detail than the doctored source.

The low-res and blurry source image of the moon (left), and what Samsung's smartphone processed it as (right) [Reddit u/ibreakphotos]

The user reckons Samsung "is leveraging an AI model to put craters and other details on places which were just a blurry mess." They go further to stress that while super resolution processing uses multiple images to recover otherwise-lost detail, this seems to be something different.

It is proposed that this is a case "where you have a specific AI model trained on a set of moon images, in order to recognize the moon and slap on the moon texture on it."

"This is not the same kind of processing that is done when you're zooming into something else, when those multiple exposures and different data from each frame account to something," they propose. "This is specific to the moon."

It is reckoned that since the moon is tidally locked to Earth, "it's very easy to train your model on other moon images and just slap that texture when a moon-like thing is detected," and that the AI is "doing most of the work, not the optics."

In response to an earlier, failed attempt to debunk Space Zoom's quality, Samsung said the feature captures up to 20 pictures, then processes them into a composite with AI. That AI identifies the content of the scene, and then performs a "detail enhancing function" on the subject.

At the time of a previous investigation in 2021, attempts to trigger an overlay or AI processing on a clove of garlic on a black background or a table tennis ball failed to trick the smartphone. The 2023 test using a 170-by-170 resolution image of the real moon may have given the AI processing just enough basic detail to make it think it was looking at the actual moon.

The new test also eliminates any sort of multi-frame sharpening from being used, since it's a shot of the same low-resolution moon for every frame.

It remains to be seen if this brief investigation will trigger closer scrutiny of the use of AI in photography, but the concept is one that has been employed across the entire mobile industry. Even Apple leans on computational photography to improve the quality of images from its cameras.

While the public may generally regard AI processing of smartphone camera images as a good thing, oddly specific instances such as this may give pause to people who care about photography as an art form.



43 Comments

Xed 2896 comments · 4 Years

You may as well just use Google Images instead of taking your own photos.

DAalseth 3066 comments · 6 Years

The general rule of thumb in Astronomy is an optical telescope is limited to 50x per inch of aperture. (Not counting adaptive optics etc., which is not a factor here). The lens on this thing is what, a quarter inch? (Being generous). That means that any optical zoom over 12 is not going to do anything but fuzz. It’s like those department store telescopes that promise 300-400 power out of a one inch lens. Not gonna happen. You can pile on the lenses, but it won’t do you any good. 

So Samsung decided to sweeten the image with file photos, and claim absurd magnification numbers.

They put the scum in Samscum.

chasm 3620 comments · 10 Years

Wait, WHAT? Samsung CHEATING to make their products look like they perform better than they do??

WELL I NEVER!!  :D

bohuj 1 comment · 1 Year

Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as a moon shot; you'll see all the details inside the lamp, like the bulb or LED array, wires, etc.

lkrupp 10521 comments · 19 Years

bohuj said:
Bullshit from Apple fans. It's easy to confirm it's not cheating: take an S22 or S23 Ultra and, using 100x zoom, take a photo of any lamppost. It works the same as a moon shot; you'll see all the details inside the lamp, like the bulb or LED array, wires, etc.

Ooo, a Samsung bot on patrol.