Who cares?
Sure I care!
If they had any intelligence in that AI, then the improved Moon would show as the Big Cheese.
Never too late to let the model relearn all the goodies.
By now most in the Anglosphere must have seen that Samsung ad for the Galaxy S23 Ultra where a woman snaps a detailed photo of the Moon – craters and all – moving her telescope-toting neighbor to ask: "Mia, can you send me that?" Well, here comes Reddit to spoil the fun. A post last week on the r/Android …
BFD. If you don't want AI processing, just turn it off[1]:
[Overview of moon photography]
Since the Galaxy S10, AI technology has been applied to the camera so that users can take the best photos regardless of time and place.
To this end, we have developed the Scene Optimizer function, which helps AI to recognize the subject to be photographed and derive the optimal result.
From the Galaxy S21 onward, even when you take a picture of the moon, the AI recognizes the target as the moon through learned data, and applies multi-frame synthesis and deep-learning-based AI technology when shooting. A detail improvement engine that makes the picture clearer has also been applied.
Users who want photos as they are, without AI technology, can disable the Scene Optimizer function.
[1] https://r1-community-samsung-com.translate.goog/t5/camcyclopedia/%EB%8B%AC-%EC%B4%AC%EC%98%81/ba-p/19202094?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en-GB
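For what it's worth, the pipeline that blurb describes is easy enough to sketch. The following is a hypothetical reconstruction in Python; every function name is an illustrative stand-in, not Samsung's actual code or API:

```python
import numpy as np

def multi_frame_merge(frames: list[np.ndarray]) -> np.ndarray:
    """Multi-frame synthesis: averaging a burst to cut sensor noise."""
    return np.mean(np.stack(frames), axis=0)

def classify_scene(image: np.ndarray) -> str:
    """Stand-in classifier: a bright disc on a mostly-dark frame reads as 'moon'."""
    mostly_dark = np.mean(image < 16) > 0.95
    return "moon" if mostly_dark and image.max() > 200 else "other"

def moon_detail_engine(image: np.ndarray) -> np.ndarray:
    """Placeholder for the learned 'detail improvement engine'."""
    return image  # the real engine is where the controversy lives

def scene_optimizer(frames: list[np.ndarray], enabled: bool = True) -> np.ndarray:
    base = multi_frame_merge(frames)
    if enabled and classify_scene(base) == "moon":
        return moon_detail_engine(base)
    return base  # "photos as they are", with the optimizer off
```

The argument is entirely about what moon_detail_engine is allowed to invent, not about the gating around it.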
It's one of the most obvious targets for this tech... it's reasonably easy to check that the thing looks like the moon, and since the same face is always pointing towards us... you know what it's going to look like. The phone knows the time, the location, the orientation of the phone... and the available high-res imagery of the moon at all phases is pretty good.
Got to be one of the easiest AI tasks ever... but it is pretty deceitful in many ways, and it risks the next dataset being trained on what an AI thought the moon should look like...
I'm sure it's not just a 'test' for a wider rollout of this kind of thing.
I mean, who'd ever need AI to identify common objects in photographs? What use could that be?
Then one day it just so happens that every photo you take that includes (say) a soft drink can now has the logo facing the camera, or the competitor's drink replaced...
But no one would ever think of that, would they?
It is the continuous feeding in of AI enhanced images mixed with real ones to the training set of future AI image systems that could get interesting.
Sort of like when someone doesn't mute on Teams while they're using a loudspeaker and you get the feedback echo that goes on and on but alters its frequency distribution over time.
"I am sitting in a room" indeed.
Once this becomes standard our lizard overlords will be able to move freely on our side of the moon once more, safe from any intrepid photographer trying to capture their nefarious schemes. Think you've spotted something fishy on the moon? Take a photo to prove it to everyone... nope, just a normal picture of the moon!
Not necessarily. Do you use a tripod to photograph a flower, a dog or a building? Usually not. People forget that the Moon is just another well-illuminated sunlit object that doesn't require any remarkably different treatment from anything else similarly lit, and if hand-held (+IBIS/OIS if available) is enough for a sharp image, there's no reason for extra clobber. The reason most newbie Moon photos end up hopelessly overexposed is that they use auto-exposure based on a frame that's 99% dark. Spot metering is all it takes, and yes, bracketing provides a useful insurance policy!
With 125 ASA film, the Moon's correct exposure is of course 1/125th at f/16.
Call me old fashioned, but I expect a camera to produce an image of the light that falls on its image sensor, not some statistical average of what you photographed ought to look like!
"Wait, that's not Mavis in the wedding photo!"
The 'Sunny 16 rule' of photography is that for a subject in direct sunlight the correct exposure is f/16 at 1/(film speed in ASA or ISO).
So for the Moon and planets in the solar system, an accurate exposure is actually quite short, as Mr Barnes has stated above.
https://expertphotography.com/photography-101-sunny-16-rule/
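For anyone who wants to check the arithmetic, the rule reduces to a couple of lines of Python (the function name is just illustrative), and the 1/125th at f/16 figure above drops straight out:

```python
# Sunny 16: for a sunlit subject, shutter = 1/ISO second at f/16.
# Other apertures keep the same exposure by scaling with (N/16)^2.

def sunny16_shutter(iso: int, aperture: float = 16.0) -> float:
    """Shutter time in seconds for a sunlit subject per the Sunny 16 rule."""
    return (aperture / 16.0) ** 2 / iso

print(sunny16_shutter(125))       # 0.008 s, i.e. 1/125 at f/16
print(sunny16_shutter(125, 8.0))  # 0.002 s, i.e. 1/500 at f/8 (two stops wider)
```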
> Spot metering is all it takes
FWIW even using selective spot-metering I find that it helps to dial in -1 or even -2 underexposure as well, as you (well, I) don't really want even the brightest area to be saturated (and bracket around that exposure).
> Do you use a tripod to photograph a flower, a dog or a building?
Buildings, yes. With a decent mechanical head on the tripod to get it lined up nicely (Manfrotto 410 for ease of carrying around)
If the flower was about 0.5 degrees across and filling the frame, then again, yes, since that would need (quickly working from a picture of the Sun, which is - surprise - the same angular size) about 1400mm of focal length on my Four Thirds camera, or 2800mm equivalent for a 35mm ("full") frame. To shoot that hand-held, even with IBIS and OIS, is tricky! I'm willing to give it a shot, though, if you'll get me the M.Zuiko 150-400mm plus the 2x teleconverter, which has that sort of reach (at the cost of a max aperture of, um, f/8, or even f/11 with both teleconverters in). Rule of thumb: minimum handheld shutter speed is 1 over the 35mm-equivalent focal length times 1.5 (worse as you get older), but claw back, say, 3 stops from the combined OIS and IBIS on my E-M1 (they promise 5 stops but, yeah), so say 1/500th of a second. That is just in the realms of doable, but dang, that lens is heavy; I'm using a tripod for that flower.
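Since those sums went by quickly, here is a minimal sketch of them, assuming the small-angle approximation (image size on sensor ≈ focal length × angle in radians) and the 13 mm height of a Four Thirds frame; filling the full frame height comes out a touch longer than the 1400 mm quoted above, which presumably allowed slightly looser framing:

```python
import math

def focal_to_fill(frame_mm: float, subject_deg: float) -> float:
    """Focal length (mm) at which a subject of the given angular size fills frame_mm."""
    return frame_mm / math.radians(subject_deg)

f = focal_to_fill(13.0, 0.5)  # fill the 13 mm height of a Four Thirds sensor
print(round(f))               # ~1490 mm native, so ~2980 mm in 35 mm-equivalent terms

# Rule of thumb quoted above: minimum handheld shutter is
# 1 / (35 mm-equivalent focal length x 1.5), clawing back ~3 stops from IBIS+OIS.
min_shutter = 1 / (f * 2 * 1.5)           # 2x crop factor for Four Thirds
print(round(1 / (min_shutter * 2 ** 3)))  # ~1/560 s stabilised, close to the 1/500 above
```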
I've been an amateur photographer since the age of 10 and an amateur astronomer since the age of 10.
I know that tiny lenses and focal lengths won't beat a telescope with a camera attached. This advert aims to suggest they will, which is simply laughable.
Other people won't know it's crazy, and will think this is real, look to buy these devices, and then get buyer's remorse when it doesn't do what it's supposed to do.
Anyone who asks "Who cares" is hopefully just trying the same trick as on Reddit, or is totally oblivious to the thing that does care: the LAW.
We have trading standards laws explicitly created to prevent this kind of false advertising, and the way Samsung get around it is by putting up tiny, barely legible text that everyone ignores, saying "image is simulated".
It reminds me of that other phone that claims to have more megapixels than my old DSLR, but it turns out it's only got half as many and the other pixels are interpolated. Again, there is a statement, a much clearer one in that case, that tells you as much.
Basically, if it's too good to be true, it is. Read the small print, save yourself being a mug.
> I know that tiny lenses and focal lengths won't beat a telescope with a camera attached. This advert aims to suggest they will, which is simply laughable.
Oh, come off it - the ad shows an image of the entire Moon that can fill a 'phone screen - this is not a major achievement in the world of photography.
Using a camera and telescope, an amateur astronomer would be embarrassed if the image only scaled up to a few inches across: our January AstroSoc speaker, a local amateur astrophotographer, presented us with an image of Mars being occulted by the limb of the Moon, with Mars quite recognisable as such. He also has a whole-Moon shot taken at the same scale.
I can get a moon picture that fills the screen. But it requires attaching my phone to the side of my telescope (a little 4" jobby) and lining up everything just right.
You're not going to get a full-screen native shot out of a phone because it lacks hardware zoom; the best you'll get is digital zoom, and that'll look progressively crappier as you zoom in.
Actually... my real camera has 12x optical zoom and some more digital (making 64x in total at max) with the lens sticking far out the front, and I don't think even that's enough to make the moon fill the shot (but it's an old camera: great shots in the day, sucks at night).
> has 12x optical zoom ... I don't think even that's enough to make the moon fill the shot
Unfortunately, a spec like "12x" really doesn't mean anything on its own, without knowing at least the field of view at one end of the range, so sadly it isn't enough information to judge against anything.
FWIW, I just saw the ad on telly again - when she zooms in, the image is actually only half the width of the phone screen, held vertically - so that is about 1080/2, or 540 pixels. Presumably taken with the 10 MP telephoto rear camera - I can never get the proper specs for these cameras, so let's assume it is the same aspect ratio as the screen and therefore approximately 2240 x 4480, so the 540 pixels is one quarter of the field of view of that camera. The camera is stated to have a 10x optical zoom.
Now, that is not enough information to decide if the camera can make the Moon a whole 540 pixels across - you'd need to know the FoV, or calculate it from the sensor size and lens focal length - but judging by the increasing number of downvotes I'm getting here, surely at least one of you lot can manage to dig out the relevant information to show how absurd the whole idea is. Yes, that is a challenge - show your working or STFU.
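To get the ball rolling on that challenge, here is the sum under loudly stated assumptions: the 10x telephoto is widely reported to be roughly a 230 mm (35 mm-equivalent) lens, and the ~2240-pixel short side is the guess from the paragraph above. Neither figure is a confirmed spec:

```python
import math

EQUIV_FOCAL_MM = 230.0  # assumed 35 mm-equivalent focal length of the 10x tele
FRAME_SHORT_MM = 24.0   # short side of a 35 mm frame, matching the equivalence
SHORT_SIDE_PX = 2240    # assumed sensor pixels across the short side
MOON_DEG = 0.52         # apparent diameter of the Moon

fov_deg = 2 * math.degrees(math.atan(FRAME_SHORT_MM / (2 * EQUIV_FOCAL_MM)))
moon_px = SHORT_SIDE_PX * MOON_DEG / fov_deg
print(round(fov_deg, 1), round(moon_px))  # ~6.0 degrees, ~195 pixels
```

On those assumptions the native optics put the Moon at roughly 195 pixels across, well short of the ~540 pixels in the ad, which is rather the point.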
> You're not going to get a full-screen native shot out of a phone because it lacks hardware zoom
Optical ("hardware") zoom has been a thing in 'phone cameras for a while, hasn't it? As just said, this Samsung 'phone is being quoted as having a 10x optical on one lens and a 3x optical on one of the other lenses.
Optical ("hardware") zoom has been a thing in 'phone cameras for a while, hasn't it? As just said, this Samsung 'phone is being quoted as having a 10x optical on one lens and a 3x optical on one of the other lenses.Modern mobile phones aren't thick enough to fit the moving lens groups required for zoom lenses (whether in-line or folded), so if they claim an optical zoom range they're more likely just two or three different sensors with two or three different lenses in front of them. They might even transition smoothly between them (and you can tell when the perspective changes slightly as you "zoom"), but everything in between those two or three focal lengths is just plain old digital zoom.
My phone doesn't even bother with that. It simply has three field of view settings named "0.6x", "1x" and "2x". The first and second are individual units, but the third is just a crop of the second.
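That crop is worth spelling out, because it is all that digital zoom is: throw pixels away, then interpolate them back. A minimal sketch (nearest-neighbour upscaling for brevity; real pipelines use fancier resampling, but no resampler adds detail):

```python
import numpy as np

def digital_zoom(image: np.ndarray, factor: float) -> np.ndarray:
    """Centre-crop by `factor`, then upscale back to the original size."""
    h, w = image.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = image[top:top + ch, left:left + cw]
    rows = np.arange(h) * ch // h  # nearest-neighbour row indices
    cols = np.arange(w) * cw // w  # nearest-neighbour column indices
    return crop[rows][:, cols]     # no new detail, just bigger pixels
```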
Sadly true - the 2016 Asus Zenfone Zoom could have been a harbinger of good times: if 'phones weren't all so thin we could have mechanics for the camera plus other niceties, such as big, removable batteries; something to actually hold onto comfortably when it isn't in a fold-over case; thick, robust connectors with water-resistant flaps, even for the headphone socket; a headphone socket!
https://www.hardwarezone.com.sg/tech-news-asus-zenfone-zoom-has-3x-optical-zoom-10-element-hoya-lens-13mp-rear-camera
Not that they aren't trying:
https://www.dpreview.com/news/3043887854/o-film-demonstrates-smartphone-camera-module-with-85-170mm-equivalent-optical-zoom
https://www.dpreview.com/news/9056841066/samsung-starts-mass-production-of-5x-tele-smartphone-camera-module
For planetary work, stacking is an absolute necessity. Deep space you can get away with plain old long exposures - but go too long and stacking is needed purely to get the exposure times required (one night is simply too short).
But for the Moon, you don't really need stacking. You can fire off the multiple frames, at full resolution, and choose the one with the best seeing, but unless you are doing something extreme, stacking is a lot of work for not much payoff.
Now, taking advantage of the size of the Moon and the detail available, doing massive grid shots across the whole face (prepare to spend months on this one) is a great thing to do - and you will then run some of the same software used for stacking to do the de-rotation, scaling and alignment to put the whole super-res image together. This is what we were shown the results of in our January AstroSoc meeting and, wow, socks were liberally blown off. I could only ever hope to have that much patience!
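That "choose the one with the best seeing" step is easy to sketch: score each frame of a burst for sharpness and keep the winner, which is lucky imaging in miniature. The gradient-variance metric below is a common generic choice, not what any particular stacking package uses, and the alignment and de-rotation that the real software does is glossed over entirely:

```python
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    """Higher when fine detail (craters) survives the atmospheric seeing."""
    gy, gx = np.gradient(frame.astype(float))
    return float(np.var(gx) + np.var(gy))

def best_frame(frames: list[np.ndarray]) -> np.ndarray:
    """Pick the sharpest frame of a burst - lucky imaging at its simplest."""
    return max(frames, key=sharpness)
```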
Samsung are using this one example, the Moon photos, to imply their camera/phone is amazing for all photos, and this is being used to get a lot of people to spend a lot of money on Samsung phones.
In fact, the amazing photos of the Moon work for, and only for, the Moon.
This, to my eye, has crossed the line between marketing and lies, in part because the claim is so far from reality, and in part because the actual consequences, helping to deceive a lot of people into making an expensive purchase, are of considerable magnitude.
Also, there is a difference between "taking a photograph", "enhancing the image", and "making stuff up".
If you want to use a photo in court, you really want to just take a photo. Minor automatic enhancements, such as brightness and contrast, are fine.
But this is an AI that is inventing details that are just not there. That is not reliable evidence to use in court.
Got a photo of a hit and run? Now, how do you know if the camera actually caught the correct registration number, or if the registration was an unreadable blur so the AI just made something up? How do you know if the driver's face was captured correctly, or if the AI decided to "fix" the glare on the windshield by inserting a random AI-generated face? Did the AI misinterpret a smudge of dirt on the car as paintwork damage, and hence invent details of paintwork damage that are just not there, and can be proven not to match the car the police impounded?
The camera never lies, they say. That may be true, but what comes out of the camera and what people see afterwards are completely different things.
> Samsung's marketing is deceptive.
Stage 1. Pretty much ALL "professional" food photography is faked. The "photographers" use everything from paint to shaving cream to make food look better than it is. So the camera has not captured the food; it has captured what the photographer wanted it to capture. That ugly-looking hamburger the photographer starts with as a source prop, if indeed it is a hamburger at all, suddenly looks a lot more tasty.
Stage 2. The photos go into Photoshop. Adjust this, adjust that, do something with the curves. It isn't just done to create a look or to "correct" errors, it is done to make the product seem better than it is.
I've picked on food photography as it is so easy to find examples on YouTube showing what I have listed above.
Marketing is deceptive, eh? Who would have thought it. "At a steady 56mph this car does x to the gallon". Remember those adverts? They were egregious in their disinformation.
My Xiaomi phone's camera app has a mode where it can recognise people and "tweak" boob size, butt size, and a bunch of other parameters.
Part of me thinks it's cool that cameras are powerful enough to seamlessly manipulate the image in real time, but a larger part of me thinks it's shitty that some people are so insecure in themselves that they'd feel the need to use this.
Go into your locally owned restaurant of choice that has pictures of menu items on the wall / in the menu / on their web site / on Facebook and you will get pretty much exactly what is pictured. If they don't have good QC on presentation it may not look as tidy, but none of those guys are using non food materials to make their pictures look better or using Photoshop to "improve" it.
They are just asking their chef to prepare an example menu item during post-lunch downtime, placing it on a table or bartop where there's halfway decent lighting, and whichever manager is the social media maven snaps a picture with whatever smartphone they happen to own. Sure, the chef will probably take a bit more care getting it looking its best, especially the plating, than you'll get with what you order during a rush. But it is not even remotely like Scamsung's moon picture fraud.
Once the photo is taken a server who has been without a break since clocking in at 10am will wolf it down as their break meal before the happy hour crowd starts showing up.
The camera has been lying since day one.
Ever since images have been captured, people have been looking for ways to influence the result to their need or desire.
And with digital, it has simply exploded.
Including some cloud manipulation is par for the course these days.
Samsung is lying, bears in the woods, call me when something new is actually happening . . .
The difference here, mate, is that in "stage 1" the object being photographed is already in its completed state. Thus the camera recorded the truth of the state of the prop.
To compare against this Samsung thing, the prop would have to be replaced with an AI-generated CGI version at the point the photo was taken. That CGI version would add new features: ingredients that were not part of the original prop.
Stage 2 is nothing new; the same was done in the darkroom.
The question is the same as with any photo competition. Do you take a photo of a prop/person/scene, with or without manual modification, or do you submit a COMPOSITION of a photo plus generated imagery?
The Samsung is creating compositions. It is not enhancing an image but inventing an image made to appear as an enhancement. This is a very different type of photography.
The camera does not lie, that is true. The developed final print, however, does. Cameras shoot a scene, capturing light and recording that fact. The original image is always the truth, even when suffering from unintended optical effects; it is what the camera saw, what the lens saw. After the shot anything can happen, from enhancement to the creation of a composition, a lie, as that was not the shot you took. Here, however, the phone, the camera itself, is actually lying.
Your point is flawed, as I don't think you are aware of exactly what is happening here.
There is no single “moon,” so the ML-assembled photo is misrepresenting the fact that the moon displays libration, up to 8° in longitude (from ellipticity in the moon’s orbit) and nearly 7° in latitude (from the inclination of the moon’s axis of rotation to the plane of that orbit). Except, of course, the evil geniuses at Samsung don’t know anything about the moon other than that there are lots of bona fide images of it. Prats.
A quick Google will confirm there are many photography advice columns that say, if you want a foreground scene with a good moon in the background, you'll have to do a composite shot (in other words, fake it by mixing a different moon picture into the one you took of the foreground). All this is doing is automating that trick.
"a foreground scene with a good moon in the background"
Are we watching the same telly ad? Or are we just wandering along with the crowd, crowing it is all lies and deceit?
The ad you're supposed to be screaming about only has the Moon, nothing in the foreground!
I wasn't talking about the advert, I was talking about the function it was advertising. I assume that a function to enhance the moon in a picture was not intended for pictures of just the moon by itself, but for night scenes that have a moon in the background. The fact that their sales department messed this up should not alter our reasoning about a "blurred moon to fake moon" function that is intended to be used for night scenes, not pure moon shots. I believe it's also possible to turn it off when you don't want it.
Maybe, but what if you want to use the pictures your phone takes as evidence? That might be a little tricky, in a sufficiently hostile environment, if everything you photo is silently auto-furtled in the background, allowing it to be dismissed as "tampered with"...
You are talking about exposure bracketing, essentially combining the light from two shots to compensate for the relative lack of dynamic range of cameras.
Not taking a shot and pasting on a fake moon afterwards, as Samsung are doing.
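For the record, the bracketing being described only ever combines light the camera actually captured. A crude sketch of exposure fusion, weighting each pixel by how well-exposed it is (a stand-in for proper algorithms such as Mertens fusion; assumes pre-aligned frames scaled to [0, 1]):

```python
import numpy as np

def fuse_exposures(frames: list[np.ndarray]) -> np.ndarray:
    """Blend bracketed exposures, favouring well-exposed (mid-grey) pixels."""
    stack = np.stack([f.astype(float) for f in frames])
    # weight peaks at 0.5 and falls off for blown-out or crushed pixels
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-6
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)
```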
You need to set up the shot in a sensible way - read up on what Randall Munroe wrote in (I think) "How To". Basically, to have the "super moon" behind the skyline you need to be a long way away and use a tele lens to zoom in close. That is the proper way.
This ad basically implies that the camera is so good it can really zoom in a hundredfold - which is not the case. What the algorithm behind it can do is fill in the details for a single specific object (the moon), which helps zilch with zooming in on, say, a rare bird you happened to spot on a twig on a branch on a tree in a bog down in the valley-oh.
I confess to getting a giggle from all the photography nerds discussing the impossibility of such photos as if the 5-stop range of film applied to digital cameras as well. It doesn't. The most spectacular shots I have taken in the last few years were not on my Nikon (which has been effectively retired) but first on a Pixel 4 and then a Pixel 6. What made them spectacular was the phones' ability to work in a range of lighting at dawn, dusk or night. Detail in the shadows. Getting a good shot of the moon is not hard if you switch to manual. Getting a good shot of the moon and whatever was going on that made it interesting is a real challenge for even expert photographers, but is something that these phones can do quite well.
Which is fine. But it's still quite different from your camera effectively saying, "It looks like you're taking a picture of the moon. That's been done a lot before, and with better cameras: let me download a better photo and give you that instead (without even telling you)."
Back in the day we had to do our image manipulation in the darkroom. For the Moon it was routine to dodge in a larger one, otherwise it would look "unnaturally" small in the print. I read an explanation that the human brain somehow "magnifies" the visible Moon and when shown a photo of a "real" size Moon is disappointed.
Agree Samsung's ad crosses a line. Even the disclaimer is deceptive! It could refer to the image in the ad or the image in the phone, depending on which one suits the lawyers.
As for how many images were needed to train this thing, the lower limit is one. Has anybody checked to see it isn't returning the identical shot every time, simply adjusted for size and phase? Of course, that wouldn't be the "correct" way to cheat. :)
I can imagine a lot of edge cases where this will fall on its face: the Moon partially behind a cloud, someone shining a torch/flashlight. Can it handle two Moons, e.g. one reflected on water?
Exactly. Yes, part of the problem here is that Samsung is being deceptive in its marketing; but the bigger part is the idea that an altered photograph is inherently a better photograph.
I can find all the nice photos, real and faked, of the moon that I like online. I don't need a phone that lies to me in order to get one. And the same goes for any other pictures I take with any device.
There's an obnoxious advertisement for the Google Pixel Whatever that plays on some streaming service we have, where they show it "removing distractions", i.e. other people, from photos. What the everlovin' fuck is wrong with these developers? What sort of vile, narcissistic relationship do you have to have with the world to believe that other people are mere "distractions" from the fantasy you want to construct with your faked photography?
I don't find Richard's argument convincing in the slightest. I don't believe anyone needs "better" photographs achieved through fakery. Grow up and learn to live in the real goddamned world.
There's always the idea that this kind of image-replacement tech could be used for other purposes. Say you go on holiday and the background of one of your holiday snaps includes a famous landmark building. A dialog pops up saying that if you don't pay the fee to use the image of that building, the camera will remove it from the image. That's going to happen. Take a selfie with your favourite celeb: pay the fee to capture their image, or you're on your own in the shot. Governments don't want you to capture evidence of wrongdoing? Well, you're not going to be able to photograph that with your phone.
Or even worse: Future AI-infused CCTVs showing political enemies do things they didn't do, for Zersetzung purposes (https://en.wikipedia.org/wiki/Zersetzung)
But hey, at least AI in future phone cameras might be able to embiggen dick pics! So overall a net win, isn't it?
So now if someone takes a picture with such a camera that winds up as evidence in a court case, all the defendant has to do is say "Hey that camera has AI that MIGHT have modified that picture. So it can't be trusted as probative evidence!"
A killer who's caught in a background shot can get off: All they have to do is argue that the cellphone was designed to let the AI mess with the image. Similarly, any authorities caught on cellphone video doing something wrong can similarly argue that the cellphone evidence can't be trusted and must be thrown out.
In recent years the ubiquity of cellphones and cellphone video has caused massive advances in exposing abuse of power: The effect of this new cellphone technology will be to seriously impair these advances going forwards.
To be fair, photographic evidence is already highly suspect. I don't think Samsung are contributing much to that problem.
(A wider problem is that all evidence is suspect, or should be. Can't trust still photography, video, audio recordings. There's a ton of research showing eyewitness accounts are crap. Some forensic science is complete rubbish, such as bite identification and facial reconstruction; the good stuff is highly susceptible to abuse, as various instances of tampering by labs, placing of evidence, and other cases have shown. But this has always been a problem.)
False marketing is illegal: not only does it mislead buyers, but more importantly, people who invested after seeing your camera technology have been lied to. It's obvious a trillion-dollar corporation that has committed such a huge crime is paying people to say "who cares". Every small tech reviewer in the past week has said "who cares". They are taking advantage of language barriers to the full degree; all of Indian YouTube is covered with "oh, it must be Apple fans who are sad about it".
This is a huge crime
This seemed like a trivial enhancement at first. But selling the sizzle that you will enjoy arcsecond resolution with the camera by showing enhanced pictures of really only one of a few objects that will benefit from the algorithm seems on reflection like a con (but I comment as someone who has never owned any kind of mobile phone, so I'm probably wrong (but at least unbiased)).
"We're not sure anyone with even the slightest technical nous would honestly believe that a smartphone camera sensor is capable of capturing the Moon the same way our eyes do – and if they did, would it matter?"
Depends on how much they know about photography, doesn't it? And, yes, it matters. It's not an actual photograph if various things are pasted in as substitutes for what you are trying to capture.
"In the ad mentioned above, the woman takes a photo of the Moon and it looks good, not the bright white blob you'd normally expect on a phone camera. It doesn't make claims about the power of the lens or bajillions of megapixels or AI/ML trickery. All it suggests is that it takes nice photos – so much so that world+dog will be asking you to send them that."
Well, it's not a "nice photo". It's just fake. It's pointless as well.
If a camera makes me look like some movie star, it's fake. No matter how much the world+dog asks me to send it to them.