Many of us have witnessed the breathtaking moon photos taken with the latest zoom lenses, starting with the S20 Ultra. Nevertheless, I've always had doubts about their authenticity, as they appear almost too perfect. While these images are not necessarily outright fabrications, neither are they entirely genuine.

There have been many threads on this, and many people believe that the moon photos are real (inputmag) - even MKBHD has claimed in this popular youtube short that the moon is not an overlay, like Huawei has been accused of in the past. So, while many have tried to prove that Samsung fakes the moon shots, I think nobody succeeded - until now.

1) I downloaded a high-res image of the moon from the internet.

2) I downsized it to 170x170 pixels and applied a gaussian blur, so that all the detail is GONE. This means it's not recoverable - the information is just not there, it's digitally blurred. (And a 4x upscaled version, so that you can better appreciate the blur.)

3) I full-screened the image on my monitor (showing it at 170x170 pixels, blurred), moved to the other end of the room, and turned off all the lights.

To put it into perspective, here is a side by side. In the side-by-side, I hope you can appreciate that Samsung is leveraging an AI model to put craters and other details on places which were just a blurry mess.

To further drive home my point, I blurred the moon even further and clipped the highlights, which means the area which is above 216 in brightness gets clipped to pure white - there's no detail there, just a white blob. I zoomed in on the monitor showing that image and, guess what, again you see slapped-on detail, even in the parts I explicitly clipped (made completely 100% white).

If you turn off "scene optimizer", you get the actual picture of the moon, which is a blurry mess (as it should be, given the optics and sensor that are used).

And I have to stress this: there's a difference between additional processing a la super-resolution, where multiple frames are combined to recover detail which would otherwise be lost, and this, where you have a specific AI model trained on a set of moon images in order to recognize the moon and slap the moon texture onto it (when there is no detail to recover in the first place, as in this experiment). This is not the same kind of processing that is done when you're zooming into something else, where those multiple exposures and the different data from each frame amount to something. Here, it is adding detail where there is none (in this experiment, it was intentionally removed).

Now, Samsung does say "No image overlaying or texture effects are applied when taking a photo, because that would cause similar objects to share the same texture patterns if an object detection were to be confused by the Scene Optimizer.", which might be technically true - you're not applying any texture if you have an AI model that applies the texture as part of the process - but in reality, and without all the tech jargon, that's what's happening. Since the moon is tidally locked to the Earth, it's very easy to train your model on other moon images and just slap that texture on when a moon-like thing is detected. In this article, they mention multi-frames and multi-exposures, but the reality is that it's the AI doing most of the work, not the optics; the optics aren't capable of resolving the detail that you see.

TL;DR: Samsung is using AI/ML (a neural network trained on hundreds of images of the moon) to recover/add the texture of the moon in your moon pictures, and while some think that's your camera's capability, it's actually not.

This post has been updated with several additional experiments in newer posts, which address most comments and clarify what exactly is going on.
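If you want to reproduce the degradation yourself, the steps above (downsize to 170x170, gaussian blur, clip everything above 216 to pure white) can be sketched in numpy. This is my own minimal sketch, not the exact tooling used in the post - the 170x170 size and the 216 threshold come from the post, while the blur sigma and the nearest-neighbour downsize are assumptions:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    """1-D gaussian kernel, normalized to sum to 1."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def gaussian_blur(img, sigma=3.0):
    """Separable gaussian blur with edge padding (assumed sigma)."""
    r = int(3 * sigma)
    k = gaussian_kernel(sigma, r)
    padded = np.pad(img, r, mode="edge")
    # convolve rows, then columns
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, padded)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, tmp)

def degrade(moon, size=170, sigma=3.0, clip_at=216):
    """Downsize, blur, and clip highlights - detail above clip_at is destroyed."""
    h, w = moon.shape
    # naive nearest-neighbour downsize to size x size (assumption; any resampler works)
    ys = np.arange(size) * h // size
    xs = np.arange(size) * w // size
    small = moon[np.ix_(ys, xs)].astype(float)
    blurred = gaussian_blur(small, sigma)
    blurred[blurred > clip_at] = 255.0  # clip to pure white: no detail survives here
    return blurred
```

Anything the phone "recovers" inside the clipped, pure-white region after this pipeline cannot have come from the displayed image - that information no longer exists in it.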
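To make the super-resolution distinction concrete, here is a toy 1-D illustration (my own, not Samsung's pipeline): stacking many exposures averages away sensor noise, but it can only converge to what the optics actually delivered - a blurred scene stays blurred no matter how many frames you combine.

```python
import numpy as np

rng = np.random.default_rng(0)

# ground-truth "scene": a sharp edge
scene = np.zeros(64)
scene[32:] = 100.0

def box_blur(x, k=9):
    """Stand-in for the optics: the same blur affects every frame."""
    return np.convolve(x, np.ones(k) / k, mode="same")

blurred = box_blur(scene)

# multi-frame stacking: each exposure adds independent sensor noise
frames = [blurred + rng.normal(0, 5, scene.shape) for _ in range(64)]
stacked = np.mean(frames, axis=0)

# stacking recovers detail the *noise* hid (~8x noise reduction from 64 frames)...
noise_single = np.abs(frames[0] - blurred).mean()
noise_stacked = np.abs(stacked - blurred).mean()
assert noise_stacked < noise_single / 4

# ...but the result converges to the blurred scene, not the sharp one:
# the edge is still smeared over ~9 pixels. Painting a sharp edge back in
# would require a model that already knows what the scene "should" look like.
```

That last step - a model filling in detail it has memorized rather than measured - is what the experiment shows happening to the moon.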