Complex CAPTCHA
https://me.me/i/google-is-really-getting-serious-with-their-al-training-select-10656732
Analyst firm Gartner has advised in favour of the use of CAPTCHAs — but recommends using the least-annoying CAPTCHAs you can find. The firm’s opinion is contained in a post by senior director analyst Akif Khan, who noted that CAPTCHAs create friction for humans but remain an imperfect defence against bots. Despite all this, …
Seeing as I have no idea who Sarah Connor is, the only way I can solve that is to do a Google image search to see what Sarah Connor looks like. The third result is that very photo.
I'm pretty sure a computer could do that faster than a human. Then it is just a case of picking out the squares that include a picture of a human. Photoshop got the ability to do that about 10 years ago, so I'm pretty sure that would again be no problem for a computer.
I haven't looked, but evidently it's a joke that you didn't get. In the "Terminator" film series, robots exterminate humanity, John Connor leads the resistance, and an Arnold Schwarzenegger robot is sent back in time to kill Sarah Connor, John's mother, before John is born. Then next time, Arnold Schwarzenegger is a goodie robot fighting the baddie robots.
So, do not help the robots to find Sarah Connor.
yeah the ambiguous squares are the ones that get you a "try again" half the time, along with the faded/shadowy/grainy/too-tiny-to-see ones that might contain bicycles...
so it's "roll the dice" and hope the next one (after the 'try again') isn't one of those "slooowly fade out, slooowly fade in" types that *REALLY* irritate me...
(guess how I voted - see icon)
A reasonably simple solution for smaller sites is to add an invisible field (e.g. a passwordConfirm text field or an acceptAllCookies checkbox) to forms, and then throw away any submissions where the invisible field is not at its default value. This generally works because robots don't parse the CSS.
It is possible to make this method screen-reader friendly by defining the tab order for input elements on the form. If you ensure that the submit button comes before the spam-check field, a person tabbing through the form will reach the submit button first, so many won't even realise the spam field is there. Also, setting tabindex="-1" should remove the field from the default navigation flow. All this may be dependent on browser / screen reader, but I've used it in the past and it seems to work well.
Also, I find that you can make the label for the spam input element something like 'Bot check, please leave blank' and it is still not picked up by 90% of email bots.
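A rough sketch of the sort of hidden field being described, written here as a TypeScript template string since the surrounding form will differ per site; the field name acceptAllCookies, the wrapper class and the label wording are placeholders rather than anything standard:

// Illustrative honeypot markup, rendered as a template string because the
// surrounding form will differ per site. The field name, wrapper class and
// label wording are placeholders, not a standard.
const honeypotField: string = `
  <div class="hp-wrap">
    <label for="acceptAllCookies">Bot check, please leave blank</label>
    <input type="text" id="acceptAllCookies" name="acceptAllCookies"
           tabindex="-1" autocomplete="off">
  </div>
`;

// The wrapper is hidden with a stylesheet rule rather than an inline style,
// since many bots fetch the HTML but never apply the CSS.
const honeypotCss: string = `.hp-wrap { display: none; }`;

console.log(honeypotField + "\n" + honeypotCss);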
Another trick is to keep the time the page was rendered in the session and when the form is submitted, check how long it was between page render and submission. Bots tend to submit far quicker than any person can type, so if the submission was in under 4 seconds (or something like that - depends on the form), you can normally safely ignore it. Personally I include the time to submit in the posted content, so I can tweak it if some spam is getting through. Normally a 4 to 5 second cut off works.
Between these two methods, I can normally filter 99% of spam submissions without a captcha.
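A minimal sketch of what that server-side check might look like, assuming the honeypot field and a render timestamp are posted back with the form; FormSubmission, the field names and the 4-second threshold are illustrative only:

// Rough sketch of the two checks described above: an empty honeypot field
// and a minimum time between page render and submission.
interface FormSubmission {
  name: string;
  message: string;
  acceptAllCookies: string; // the hidden honeypot field; humans leave it blank
  renderedAt: number;       // epoch ms recorded when the page was rendered
}

const MIN_SECONDS_TO_SUBMIT = 4; // tune per form, as suggested above

function looksLikeSpam(sub: FormSubmission, now: number = Date.now()): boolean {
  const honeypotFilled = sub.acceptAllCookies.trim() !== "";
  const tooFast = (now - sub.renderedAt) / 1000 < MIN_SECONDS_TO_SUBMIT;
  return honeypotFilled || tooFast;
}

// A bot that fills every field and posts within a second trips both checks.
const botPost: FormSubmission = {
  name: "Cheap pills",
  message: "spam spam spam",
  acceptAllCookies: "on",
  renderedAt: Date.now() - 800,
};
console.log(looksLikeSpam(botPost)); // true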
"Because another thing that doesn't parse CSS is a screen reader for the blind. Such a system would auto-determine any such user is no longer Human & lock us out."
This is wrong. Screenreaders do parse CSS and obey it, or rather the browsers do and the screenreader obeys them. Mark something as display:none and the screenreader won't see it. It doesn't work if you just make the font really small, but for display:none it does.
However, even if that wasn't the case, that doesn't doom the option. I've used such fields for basic spam protection before and I have a setup for those who don't use CSS. It just includes a warning that the box below should not have anything written in it if you expect me to read your submission. If people can see the box, they can also see the warning. It won't protect you against someone targeting your page, but for the casual bot, it works.
It's usually more interesting, and often more valuable, to try to work out which organisation paid for the Gartner "report" - to see what agenda they are trying to push. Gartner have no value whatsoever in anything otherwise.
I don't find captcha tests too bad, a little annoying - except when I'm travelling in, say, mainland China, stuck behind the great firewall, and unable to log in to my email provider's webmail page because all google services including the sodding captcha test are blocked.
Related (Google blocked): I block everything JavaScript and Googly by default - so the stupid websites that insist on doing the stupid captchas don't bloody work, or only do after I grudgingly whitelist some sites. And even then the stuff sometimes just fails. And yes, in some cases you really do just have to use these websites, so sod off if "don't use those websites" was your suggestion.
I hate captchas. A Lot (as Brian of Nazareth said about the Romans). I'll refrain from complaining more, it is not good for my blood pressure...
Sadly many people with disabilities do. It gets truly farcical when the contact form to tell a company you can't use their services due to their misuse of captcha is blocked off by captcha, and then you try and register a complaint with them and..
..they end up paying far more to handle a phone call or a written letter.
A while ago I needed to submit something to an out-of-state court for a case I'm indirectly a party to. I tried to use e-mail to contact the appropriate person to send the info to [basically proof that I received the paperwork]. But the e-mail form had a captcha that simply would NOT work (even when run in the special context using Chromium, like I usually resort to when such things are needed).
So I dug around some more on the court's web site, found a FAX number, and FAX'd it by placing the item in question on the glass copy/scan bed of the fax/scan/copy/printer I bought a while back and dialling the phone number. THAT worked. (It's why I keep that second land line.)
What gets my back up is that Google seem to think every street sign, fire hydrant and traffic light everywhere in the world looks like the ones in their own country.
GOOGLE - there is a whole world outside the f*cking USA you know? And it doesn't all look like your tiny little bit of the world. You could try, I dunno, maybe googling it sometime? Try "what does the rest of the world look like" for starters.
There are countries where the fire hydrants are not above ground but under a floor plate, so to a passer-by they don't really exist. Traffic lights are less... diverse, I would say.
Street signs are a mess, especially if they are in a language or writing system (alphabets or other scripts, as in China, Korea, Japan, ...) you don't understand: is that a street sign or an ad for something?
I've still never seen a taxi in any of their 'find the taxis' tests. Lots of yellow cars but not a single black cab.
Similarly I see lots of yellow vehicles in their 'find the school bus' tests but not a single old, dodgy looking private coach driven by a mad man who can't possibly have ever passed a driving test.
For that matter, fire hydrants actually vary radically in different parts of the US. They're pretty standardized in the northern part of the country -- there they have to be dry until the valve is opened, or they'd freeze, and that constrains the design. But in the southern part of the US they come in all kinds of mutant shapes. They come in different colors, too. Sometimes the colors designate the flow rate.
Do these fire hydrant things even exist outside the USA?
I only know about them because of Top Cat- Officer Dibble was rather protective of them I think.
Are they a bit like those overhead live power lines you see there? An invention of states that don't want to pay for proper infrastructure, like burying stuff and having access points.
Oh, yeah. And when they say 'identify the squares with pedestrian crossings/crosswalks', and a tiny triangle of a crossing is in one square, so you check it, and then have to go through the same routine with 'identify the squares with trees in them'.
My record is five goes before I get the green tick.
Ah the great invention.
Click on the crosswalk. A what?
Click on the apartments. Well those shops, do they have apartments on them?
Click on traffic lights. Do I include the posts?
Click on the sidewalk. A what?
Click on mountains. Well is that a hill or a mountain?
Click on buses. Well I can see one in the background, does that count?
Click on the hideous lying bastard: (Picture 1 is of Trump) (P2 is of a cat) (P3 is of a dog) (p4 is of a pig).
Click on the image of a baby throwing a temper tantrum: (p1 is of Trump) (p2 is of a monkey) (p3 is of a slice of buttered toast) (p4 is of a rock with a smiley face painted on it).
Click on the failed waste of oxygen: (p1 is of Trump) (p2 is of a black hole) (p3 is of a vacuum cleaner) (p4 is of a field of grass & flowers).
Click on...
=-D
*Runs away*
You forgot to ask what "apartments" are.
Then there's taxis. Nowadays they could be any coloUr, but if there's one coloUr in particular that I'd associate with taxis, it would be black, not yellow.
It's all very well boycotting shopping sites that use these things, but this morning I needed to order repeat prescriptions, but the site forces me to enable Google's Javascript. For all I know, this could be shipping my medical records off to some American pharmaceutical company.
The ubiquity, lack of diversity and potential profitability of these things surely means that major hacker effort is being applied to breaking them.
On the other hand there are plenty of tasks that are simple to program and simple for humans to execute but beyond what it is reasonable to expect AI to do. Just pick a random question from a large collection of general knowledge things. It would take spectacular effort to parse such things.
"Just pick a random question from a large collection of general knowledge things. It would take spectacular effort to parse such things."
I hate captchas too, but it would be hard to parse the answer to that kind of question if it weren't easy to look up in a database. Questions with single answers could be easily answered automatically: "What is the capital city of Argentina", for example. More general knowledge questions might have multiple possible answers and/or regional variations "What is the food item produced by slicing potatoes and frying them". Also, if you don't have too many of them, someone can take the whole test and program bots with known answers.
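A toy sketch of why that kind of challenge is weak, assuming a small bank of questions with several accepted regional answers; the questions, answer lists and normalisation rule here are made up for illustration:

// Toy general-knowledge challenge bank. A real bank would need to be huge,
// since a bot can simply take the test repeatedly and memorise every answer.
interface Challenge {
  question: string;
  accepted: string[]; // regional variants all count as correct
}

const bank: Challenge[] = [
  { question: "What is the capital city of Argentina?", accepted: ["buenos aires"] },
  {
    question: "What is the food item produced by slicing potatoes and frying them?",
    accepted: ["chips", "fries", "french fries"],
  },
];

function checkAnswer(challenge: Challenge, answer: string): boolean {
  // Normalise before comparing: trim, lower-case, collapse internal whitespace.
  const cleaned = answer.trim().toLowerCase().replace(/\s+/g, " ");
  return challenge.accepted.includes(cleaned);
}

console.log(checkAnswer(bank[1], "French  Fries")); // true
console.log(checkAnswer(bank[0], "Paris"));         // false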
The analyst suggested that good CAPTCHAs should do more than ensure users provide a correct answer to a challenge
They do already! They ensure that humans put unpaid labour into classifying things in the Google driverless car database that their algorithms can't figure out.
You know what they are having trouble with by what they are presenting, which appears pretty ominous given that they keep presenting traffic lights, pedestrian crossings, bridges etc etc etc.
COME ON... there really is NO need for captchas.
If the website programmer cannot block bots programmatically then they should not be creating the website. Check input client-side and also server-side, and then you have clean input ready to use/accept.
Trouble is, we now have every Tom, Dick and Harry with little know-how from "Specialist Design houses" creating fab online content that looks great but fails to deliver!
I don't mind captchas per se; what is annoying is that they feed Google ever more information and they are ubiquitous.
Developers! Use captchas whose purpose is not to hoard data. There must surely be some out there that exist purely as a Turing test and can be used instead.
I don't even mind the training of algorithms either. It is a clever idea, but because Google's purpose is to hoard as much data on everyone as possible, I avoid them when I can.
The real problem is the total and utter lack of localisation. The thing is written by Americans who think American culture is global. It isn't.
Taxis aren't yellow.
Neither are school buses.
Dude, your traffic lights are an abomination. WTactualF?
Around here it's called a pedestrian crossing.
Would be nice to see a Belisha beacon once in a while.
When I have to enter random words, kindly don't provide Hebrew, Cyrillic, or Georgian to somebody who has an English keyboard. How the hell am I supposed to enter <random squiggles>?
Don't get me started on the blurry low resolution images of random shanty towns and the instruction to "click on the shop fronts"...
But... but... everyone in the world is North American. Everybody. Absolutely everybody... surely this is so? It cannot possibly be not so? (breathe) But North Americanish culture is everywhere. Everywhere has the same ideal don't they? Except for the red commie scum of course. They don't. Everybody else in the world does though.
The US: One of the three remaining backward regimes in the world which still uses Imperial measurements, believes that their culture applies everywhere and persistently tries to brainwash the world into thinking that the US has a democracy.
The problem for me is poorly implemented captcha, which means Google.
As far as I can tell, Google flat-out blacklists all Tor exit nodes.
It doesn't matter if you're human, or how many captchas you complete : Google will not let you continue. You just get another captcha.
IME, captchas work, and they are *necessary*: if you have an open forum or mediawiki or the like, in a few days you will be drowning in automated spam posts and articles. I remember when I realised my mediawiki had thousands of automated spam articles...
However, they can be implemented (as Google's are) in a way which makes them a critical problem. For me, any site using Google captcha is a dead site. It may as well not exist, because I can't get past the captcha. In this case, the captcha solves one problem but introduces another.
Also, of course, with Google you have the problem of spying. They're collecting information via the captcha.
"As far as I can tell, Google flat-out blacklists all Tor exit nodes. It doesn't matter if you're human, or how many captchas you complete: Google will not let you continue. You just get another captcha."
In my experience, it can take between one and several CAPTCHA rounds, but a Tor exit node will be approved by Google eventually. It does require both https://www.google.com/ and https://www.gstatic.com/ to be allowed to run JavaScript, though.
I must have done something right. Google won't even let me do a search before insisting on a CAPTCHA. It insists I am a bot or that my computer is sending automated traffic. Which is odd, as my network switch is inactive except on page clicks. More business for search.brave.com.
Certainly evil, and probably also a genius, if they can do a PDE.
I'm afraid I can only vaguely remember "junior level" differential equations from school and early university, and I've forgotten most of the "more advanced" (still below degree-level) maths I ever learned, as I just don't need it in real life (I realise some people do), so that definitely would be rather exclusionary!