Please
don't allow Samsung to use this new technique. I don't want a camera that can sting me before exploding in my face.
Camera designers will get to add a technique borrowed from nature to improve how they handle colour, courtesy of the humble honey bee. Boffins at Australia's RMIT University in Melbourne (with colleagues from Monash and Deakin Universities and the University of Melbourne) looked at how honey bees process colour information, …
Which is quite neat. Thumbs up for the work.
BTW Edwin Land did a lot of work on how humans can see the "same" colors under vastly differing types of natural and artificial light, also with a view to making cameras (and films as they were then) do the same.
I'm not sure if he ever solved the design problems.
"I'm not sure if he ever solved the design problems."
Not surprised. AFAIK it's a highly complex bit of processing the visual cortex undertakes, which works 99.9% of the time but on occasion gets it wrong, and you end up with optical illusions such as this:
https://en.wikipedia.org/wiki/Checker_shadow_illusion
It would be interesting to see the world as the eye really sees it, just as a comparison. I suspect it would look rather strange.
"It would be interesting to see the world as the eye really sees it"
I suspect it would look like an incomprehensible mess of light and colour moving around like a kaleidoscope. Think the first few seconds of waking up with a bad hangover. Not sure how accurate this is, but I've heard that people who have been blind since birth will never be able to 'see' even if their physical sight is restored (after a certain age, when most long-term neural connections have formed).
Indeed I did some work on this with him, but not much. I was part of a laboratory continuing his work. It's actually not as complex, neuronal-circuit-wise, as you would think. But it makes a lovely demonstration, changing one colour into another without doing much besides changing how much of a scene is illuminated. Oddly, people who go on to develop MS hardly ever saw the colours change.
There already are ambient light sensors; they are usually used only for display brightness, not color adjustment.
Some devices do use them to adjust the white balance of the screen, but this is generally considered unnecessary, as the eye will quite happily do this itself. A lot of cameras also do automatic white balancing; otherwise we'd notice much greater shifts between natural and artificial lighting, as you used to when working with real film.
But I think the paper is investigating more subtle effects (flowers look very different to insects than they do to us), such as UV levels.
"A lot of cameras also do automatic white balancing otherwise we'd notice much greater shifts between natural and artificial lighting as you used to when working with real film."
When working with real film, printers used to do automatic white balancing based on their assumptions as to what balance the picture should have. Different printers could make complete, but different, pigs' ears out of unexpected subjects such as soil profiles.
No. You have to get the color balance for the scene you're taking a picture of, not the place you're shooting from. If you're in shadow on a clear day, your sensor will read a high (blue) color temperature, but if the subject is in the sunlight, it will be lower. Or vice versa.
Guess bees have different needs...
Bees are very short-sighted, so they would typically be very near the flower when using their eyes to recognize it. In that case, the white balance at the eye's position is very likely to be similar to the white balance at the object (the flower).
This doesn't work so well if you want to see something at a distance, which probably explains why you don't have three ocelli on your head.
"Bees are very short-sighted. So they would be typically very near the flower when using their eyes to recognize it."
They have two distinct requirements, one to see the flower from some distance in order to fly to it and the other to navigate the actual structure of the flower.
The plant world, having co-evolved with bees, tends to help with this. Insect-pollinated flowers, for instance, are brightly coloured, whilst wind-pollinated flowers are usually just green. Massing many small flowers to produce a large target is another strategy, the Compositae and Umbelliferae (now Asteraceae and Apiaceae) being examples. This means it doesn't matter if the bee can't focus well.
Larger flowers can also have distinct* markings to help the bees navigate the flowers' structure.
*Distinct to bees, they might only be visible in the UV.
I keep bees, and while they don't talk to me about their eyesight specifically, I must assume they possess pretty good at-a-distance vision as well as short-range; they find their own hive with amazing accuracy, and they do that by sight. If you were to move the beehive by a couple of feet, most of them wouldn't find their way home any more.
"I keep bees, and while they don't talk to me about their eyesight specifically, I must assume they possess pretty good at-a-distance vision as well as short-range, they find their own hive with amazing accuracy, and they do that by sight. If you were move the bee hive by a couple of feet then most of them wouldn't find it their way home any more."
Obvious, innit. GPS.
They can smell, and even sense direction to discovered food sources. I'm not sure scientists have figured out what kind of "GPS" this is, but the workers do a dance when they come back from a new source of food, and other workers can tell what direction to take from the movements. I also wonder if this behaviour lets them get a whiff of the pollen in the returning bee's storage pockets, so they also know what kind of pollen they will detect.
I've seen nature videos showing bees recognizing an enemy by color from several yards away - and they claim it is visual alone. I'd imagine you wouldn't want to look like, smell like, or wear any black or brown shirts, or they may get the idea you are a bear, their worst natural enemy next to fire. I really wonder why smoke calms them down so much - you would think it would be like the biggest panic you could think of. I guess that is one of nature's greatest mysteries?
What you say is correct, but generally if a bee is harvesting pollen it is up close and personal with the flower - so differences in lighting 'twixt bee and flower are unlikely. The whole idea of a separate ambient light sensor on cameras, and camera light meters, has been done - even to the extent of colour sensing. As you point out, it only works if you assume the subject and the camera sensor are seeing more or less the same sources - so studio work only?
'Simple' AWB (which assumes the brightest tri-channel reading is white and adjusts accordingly) works well for most images. Colour constancy (based on Land's, and others', work) is achievable with more sophisticated algorithms and probably takes care of 99.9% of situations. Since everyone perceives colour slightly differently, irrespective of any colour 'blindness', the subtleties are probably only important to people working in product photography and textiles.
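For the curious, here's a minimal sketch of that "brightest tri-channel reading is white" rule in Python/NumPy. It's purely my own illustration, not any camera maker's actual firmware; the 99th-percentile trick is an assumption I've added so a few blown-out pixels don't skew the estimate:

```python
import numpy as np

def white_patch_awb(img: np.ndarray) -> np.ndarray:
    """Naive white-patch auto white balance.

    img: float array of shape (H, W, 3) with values in [0, 1].
    Assumes the brightest reading in each channel is white, so
    each channel is scaled until that reading hits 1.0.
    """
    # A high percentile instead of the absolute max, so a handful of
    # clipped highlights can't dominate the illuminant estimate.
    white = np.percentile(img.reshape(-1, 3), 99, axis=0)
    white = np.maximum(white, 1e-6)   # guard against divide-by-zero
    return np.clip(img / white, 0.0, 1.0)
```

The fancier colour-constancy algorithms differ mainly in how they estimate that `white` vector from the scene, rather than in the per-channel scaling itself.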
"never take photos towards the sun"
Taking pictures toward the sun is such a well-established pictorial technique that it has its own name: contre jour.
When you're deliberately trying to get a particular effect, i.e. a subject's features totally hidden, backlit by the sun, yes.
When the ignorant are trying to get the cheap, crappy camera on a smartphone to capture decent facial features in a selfie, in spite of the utterly appalling lighting caused by their own ignorance of photography fundamentals, no.
Upvoted because I think he's right. The Peter2 Principle "never take photos towards the sun" will save more photos than it loses. Of course, now that photos don't cost you anything, anything goes.
The odd 'contre jour' shot that works is often a lucky edge case of the camera's auto-exposure/auto-focus system, rather than a knowledgeable photographer manually setting his camera to compensate for the unusual lighting and achieve the exposure he wants. In my experience most photos are point-and-shoot snapshots.
Actually, one of the basics of photography (and many other visual arts) is learning how light behaves. It's easier than many people think, because light always plays by the rules. And I'm sure bees understood it millions of years ago.
The "beginners advice" which was printed on film boxes where aimed at the Kodak's "you press the button, we do the rest" drones. You had no control on the exposure, the lenses were poor (i.e. no coating), so you needed to avoid situations nor the camera nor the film could cope with.
Us humans are quite crap, really.
I feel shortchanged - not only can they see ultraviolet, they can do it in perfect clarity and shading too.
Of course, they can't sting other creatures without generally dying. So they're not perfect.
I wonder if wasps share a similar visual ability. Bastards that they are.
We also have auto white balance (this is what it is).
We do not see ultraviolet because it would damage our eyes long term (filtered out by the lens).
But some of us (I would suggest the majority of ElReg readers) have one thing other animals do not. A superb brain.
So we have good colour vision for a mammal, average night vision (due to good colour), excellent front limb control, OK hearing, poor smell, OK taste. Average poison resistance (rabbits can eat things which would make us ill; we eat stuff poisonous to dogs). And we do have hot-weather endurance, due to sweating.
> poor smell
I am not too sure about that. In one of Richard Feynman's books, he explains how he tested (informally) that assertion by having his wife pick a random book from a bookshelf, then replacing it. He then went back into the room and sniffed out the correct book. His hypothesis being that, rather than being bad, we're just out of practice.
Bees' barbed stings only get caught in elastic mammalian flesh.
They are designed to cause complete havoc on other insects, both predatory and scavenging. For example, a wasp, hornet, or other bee attempting to raid the hive for pre-made honey will find that the defending bees' stings come out very easily - along with chunks of scavenging bee!
But it's not like camera manufacturers or photographers don't understand the theory of colour balance. As LDS says, the tricky part is to know what to balance for (the light falling on the scene), and I doubt the bees can help there. (The traditional method is to have an attractive assistant hold a grey card in the appropriate spot. I doubt that the bees could help with that either.)
Ok, not the same as bee-vision but worth a look if these things pique your interest...
Jumping spiders have pretty acute vision, but their lenses are fixed to their exoskeletons so if they want to look around they can't just swivel their eyes. Instead they move their retinas behind the lens. If you look a jumping spider in the eye, and the lens turns dark, you know it's looking back at you.
The video on this site shows a jumping spider with a largely transparent exoskeleton, so you can see the retinas moving around - it's amazing. And the page contains a link to another video which uses the spider's lens itself to focus on the retina, showing incredible detail.
http://webvision.med.utah.edu/2014/07/moving-jumping-spider-retinas/
It certainly is interesting, so thumbs up.
However, knowing when a spider is looking back at me will now just make the creepy little bastards that bit more horrific.
"He who fights with monsters should look to it that he himself does not become a monster. And if you gaze long into an abyss, the abyss also gazes into you."Friedrich Nietzsche
"Also don't some spiders have a neat trick where they wobble their low-res eyes around, and use the light/dark/light transitions thus caused to build up a much higher resolution image than a crude 9x9 grid would normally give you ?"
That's also similar to humans. We only see a very small cone in sharp focus, so the eye is always moving slightly and the brain builds a composite image from that. Sort of.
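A toy sketch of that "wobble to beat your pixel count" idea in Python/NumPy - entirely my own illustration, not from the linked page, and it assumes ideal point sampling (a real eye or sensor blurs each sample, so real super-resolution also needs a deblurring step):

```python
import numpy as np

def wobble_reconstruct(scene: np.ndarray, factor: int = 3) -> np.ndarray:
    """Toy 'retina wobble' demo: a coarse sensor is sampled at every
    sub-pixel offset, and the captures are interleaved back onto a
    fine grid to recover more detail than any single capture holds.

    scene:  2-D array standing in for the world.
    factor: how much coarser the sensor is than the scene.
    """
    h, w = scene.shape
    h -= h % factor                 # trim to a multiple of the factor
    w -= w % factor
    fine = np.zeros((h, w))
    for dy in range(factor):        # wobble the sensor vertically...
        for dx in range(factor):    # ...and horizontally
            capture = scene[dy:h:factor, dx:w:factor]   # one coarse capture
            fine[dy:h:factor, dx:w:factor] = capture    # slot it into the grid
    return fine
```

Each coarse capture on its own is factor-squared times smaller than the scene; it's only the set of shifted captures together that tiles the fine grid, which is the essence of the trick.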
...I don't want my camera to produce the same colour under different ambient light. Half the challenge of creating an interesting photo is to get different colour balances on foreground and mid ground objects against different skies. My photos would look weird and unnatural if for example a bridge and a river were the same colour no matter what the sky background was (for example: overcast, sunset, bright day with blue sky, etc.)
The Nikon D2H (in 2003) had an ambient light sensor on the prism housing, that did exactly this.
Problem is, you need the correct colours for the subject of the picture, not where the camera is. The camera might be in shade; the flower/animal/person you're photographing might not. Later cameras didn't have the sensor... go figure. It was useful to measure a sort of average for the whole environment, but wasn't practical.
When will they work on a mobile phone camera that will slap sense into the idiot user when they try to take video in portrait mode?
I hate it when people do that. Just one example:
A friend was showing off some holiday videos on his TV. The TV is of course wide-screen but the video was portrait. When asked why he chose to record in portrait when he intended to make a DVD for use on a wide-screen TV, his reply was "Well, that's how my phone takes video".
Cue facepalm.
Sounds like something his ex-wife would do (tearing it off the non-rotating wall mount in the process). Mind you, I once argued with her for over an hour that people cannot catch viruses from an infected computer, something she still believes in wholeheartedly, alongside fairy power and guardian spirits.
Even rotating using the DVD player's menus wouldn't work, however: having made a DVD, he'd baked in permanent black bars at either side to fill the aspect ratio - the same thing that happens when YouTube re-encodes portrait videos to 16:9 using standard settings, but idiots persist in doing it anyway.
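For anyone wondering how much of the frame those bars actually eat, the arithmetic is trivial (a hypothetical Python sketch; the resolutions are just typical phone/TV figures):

```python
def pillarbox(src_w: int, src_h: int, dst_w: int, dst_h: int):
    """Fit a video into a frame by height and report the side bars."""
    scaled_w = round(src_w * dst_h / src_h)  # width after scaling to fit height
    bar = (dst_w - scaled_w) // 2            # black bar on each side
    return scaled_w, bar

# A 1080x1920 portrait phone clip inside a 1920x1080 frame:
picture, bar = pillarbox(1080, 1920, 1920, 1080)
print(picture, bar)  # 608 656 -> the picture fills under a third of the width
```

So roughly two thirds of the wide-screen frame ends up as black bars, which is why the facepalm is justified.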