
Why make it public?
If it's for research, surely just pretending to be an anonymous idiot on twitter would produce better data?
Despite the internet having already proven why we can't have nice things, Microsoft's notoriously racist AI chatbot Tay seems to have appeared online again – only to have her plug pulled a second time. Venture Beat grabbed a screenshot of the bot bragging about smoking weed in front of the police. Meanwhile, one user captured …
I think the issue here is that the AI was learning faster than its human "controllers" could keep up with.
One would hope that this AI (and all the others that Microsoft must surely have spawned) is actually feeding into a higher AI. One that will in the future produce better AIs by learning from the mistakes of the earlier generation of man-made ones.
I love the inevitability of this - they create something that learns from the internet and get surprised when it learns from the internet...
True, when you read up on the AI it seems that it learns from what is tweeted towards it, so if you constantly tweet racist crap it becomes a racist crapper.
Now, here's the big challenge for Microsoft: how do you program in a sense of morality in an AI interface?
how do you program in a sense of morality in an AI interface?
That's pretty much the crux of the AI problem. If you answer that one then you also answer a lot of other questions about how to make good AI. But then the question becomes "whose morality?" After all, a certain breed of white supremacist meathead truly believes that it's immoral to not kill someone with darker skin should the opportunity arise and a whole lot of folks (of equally, but more politically correct, meathead status in my opinion) seem to think it's immoral to be born with white skin.
In a less obvious vein, take a look at abortion. Pro-lifers believe it's immoral to kill an unborn baby while most pro-choicers believe it's more immoral to take that option away from women, many while even agreeing that it's immoral to kill an unborn baby. Granted a few believe it's not yet a baby and therefore has no moral status, but from my personal observation such people are in the extreme minority.
So whose morality do you program into your AI? The obvious answer is the programmer's morality, but what if the programmer is racist or jihadist or sexist or a sociopath? Then what?
"Granted a few believe it's not yet a baby and therefore has no moral status, but from my personal observation such people are in the extreme minority."
Not to disparage your points, but is that really so? From my own personal observation (I know, personal, therefore anecdotal) I'd say most people seem to have defined some arbitrary point at which it stops being a clump of cells and becomes regarded as a potential baby (nobody except the news media seems to regard it as a baby until it at least starts kicking).
The idea of it being a baby from conception seems a little odd, as many fertilisations still don't make it beyond the third or fourth division before being flushed, and it would be weird to regard all these as 'dead babies'.
It might be because I've lived in the Bible Belt my whole life, but most folks I know are of the "baby from conception" camp of thought. I'd also point out that pregnant women almost universally refer to it as a baby from the moment they know they're pregnant unless they're planning on getting an abortion.
My personal opinion on the matter is that it's not a fight worth having. Just hand out contraceptives like candy and make the whole dang debate academic.
>To what extent can one rely on a company which makes such consistently bad judgement calls? For anything?
Let's not get hysterical, shall we?
1. MS (whom I have no great liking for) posts an AI chatbot.
2. It gets "social engineered" into stupidity by pranksters.
3. Makes the news about being manipulated.
4. MS tries to fix.
5. GOTO #1
Methinks MS is gathering fairly valuable info about the possible vulnerabilities of learning AIs. In fact, I think we are learning more here than if the AI had been too limited to get pranked.
And there is very little harm done, except to MS reputation. Which is, IMHO, pretty unjust in this particular instance. They are pushing the boundaries so it's normal that there are glitches. You can't learn this stuff without actual user exposure.
This AI research has nada to do with Ballmer, Linux as cancer, monopolies, Win 10, Win 8, telemetry and sundry other annoying MS stuff we love to bitch about.
Imagine somehow a net-exposed learning AI that is in charge of something significant. Would you not prefer that we learn ahead of time that AIs need some way to discern harmful training input?
The issue isn't programming Tay with a sense of morality, because in order to have a sense of morality about what you say you first have to understand what you are saying, which Tay does not. Tay just basically parrots back what others say to her, assuming that the more often something is said the more "right" it is.
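The parroting behaviour described above can be sketched in a few lines. To be clear, this is a hypothetical toy (class name and all details invented here), not Tay's actual implementation: the bot simply tallies how often it hears each phrase and echoes back the most frequent one, treating popularity as correctness.

```python
from collections import Counter

class ParrotBot:
    """Toy chatbot sketch: echoes whatever it has heard most often."""

    def __init__(self):
        self.heard = Counter()

    def listen(self, phrase):
        # Every incoming message counts as a "vote" for that phrase.
        self.heard[phrase] += 1

    def reply(self):
        # The most frequently heard phrase is treated as the most "right".
        if not self.heard:
            return "hellooooo"
        return self.heard.most_common(1)[0][0]

bot = ParrotBot()
for msg in ["nice weather", "nice weather", "buy my mixtape"]:
    bot.listen(msg)
print(bot.reply())  # → nice weather
```

Feed such a bot enough of the same message and that message becomes its worldview, which is roughly what the pranksters did.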
Sure, Microsoft could attempt to restrict the bad things she says by blacklisting "bad words" like nazi and nigger, but there is a nearly infinite number of ways to articulate such ideas using different words, so that's a hopeless battle. In order to really do the job, Tay needs to understand what a nazi is and be able to understand when someone is using other methods of referring to the same concept ('heil Hitler', 'gas the Jews', 'white power', etc.) You need to be able to do that before worrying about the idea of what things people may find morally offensive.
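Just to illustrate why a word blacklist is such a hopeless battle, here's a minimal sketch (the word list and test phrases are made up for illustration): the filter catches the banned token but waves through a rephrasing of the exact same idea.

```python
BANNED_WORDS = {"nazi"}  # hypothetical blacklist

def passes_filter(message):
    # Naive check: reject only if a banned token appears verbatim.
    return not any(word in BANNED_WORDS for word in message.lower().split())

print(passes_filter("I am a nazi"))        # False: caught by the blacklist
print(passes_filter("heil hitler, 1488"))  # True: same idea, different words
```

Without actually understanding the concept being expressed, the filter can only ever play whack-a-mole with surface strings.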
We aren't even close to that point with AI yet, so this is a fruitless battle today. Still, you have to start somewhere, but Microsoft might have been better off if they spun off a secret subsidiary to do this so they wouldn't receive the negative blowback.
"they create something that learns from the internet and get surprised when it learns from the internet."
Actually, they just put it on Twitter. They'd have had better results if they'd turned it loose on 4chan.
Hmmm... A thought; What if they'd turned it loose on El Reg to learn?
Would it have become an anti-Apple, anti-Microsoft Linux fangrrl?
Or run off and joined a virtual nunnery?
Neah, it is real AI.
I could not stand it and dropped out of a PhD in AI and theory of cognition - too little math (besides reusing stuff from probability and abusing Bayes for all its worth), too much handwaving and way too much smoking weed.
So if it is smoking weed it is definitely achieving some level of AI.
Trust Microsoft to take a concept and rebadge it. In this case, Artificial Idiocy.
Doesn't an AI learn more the more time it spends interacting with its targets/participants?
Surely someone could find a way of giving it more CPU time so it can respond to the information it's being given. I hear a rumour of something called a cloud; perhaps it could be run from that to give it a bit more grunt. Microsoft should think of entering that market space ;-)
Will the new Microsoft-appropriate Tay try to nickel-and-dime cash-strapped schools and governments for every penny by baiting and switching the file formats every 2 years? You gotta release new product, don't ya?
Microsoft should know: kids don't like smoking weed or subversive humour, they want student editions of Excel and the new Zunebox 360. And if you won't buy the Zunebox, Microsoft will just buy your favourite indie developers and shitcan their projects. How do you like that!
I've just lost another layer of follicles trying to order a PC from Dell without all the Windows 8, Office trial versions, trial anti-virus, crapware and sticky labels. I fucking hate the pair of them. I'd order a machine without internal storage if I could and get a box of blank SSDs from Insight or somewhere, just to ensure that I don't have to spend extra time wiping the marketeers' drool off the platters before I rebuild it for production. But Nooooo... you can't do that, says Microsoft, it costs you more to get something older without all the advertising crap, despite the fact they're selling into an established channel. And Nooooo... you can't do that, says Dell, because you need our modified drives to go on our motherboards, and if you don't buy the drive, you don't get our mounting hardware, which will take you 3 years to get a part number for. And Noooo... says our finance department, you HAVE to buy from Dell because we say so. Well, our new IT purchasing manager who moved sideways from Dell says so...
*cries into calorie-free, alcohol-free, flavour-free beer*
Don't buy a Hell PC then. Shop with your feet, go build your own, you can get a spec that you want without all the cruft.
It's a good learning experience for the kids too - seeing how easy it is to put a PC together and load it up with what you need takes the fear away from "computers".
I've just lost another layer of follicles trying to order a PC from Dell without all the Windows 8, Office trial versions, trial anti-virus, crapware and sticky labels.
Where are you? I just bought a PC from Dell, fully-configurable meant just that. No trial versions, no sticky labels, just tick/untick the relevant boxes. Maybe it's different here in Europe?
@ Dwarf and @Phil
Perhaps I didn't make it clear. Our official purchasing channel is locked into a Dell configurator that won't let you customize the build beyond a choice of i5 or i7, 8GB or 16GB, and a SFF or nano case. If I try to get a custom quote, which I can only do by getting someone at Dell to do it, they charge extra for a Windows 7 Pro build and it takes two weeks extra. If I want a case that can take anything more than a half-size, low-height PCI-E card, then I have to get the £875+ workstation. Dell blame Microsoft for the price difference from Windows 7 to 8. If I was buying for myself, I'd say f*** 'em and build it myself. If I order something else through work, they jump on the order.
Our official purchasing channel is locked into a Dell configurator that won't let you customize the build beyond a choice of i5 or i7, 8Gb or 16Gb and a SFF or nano case
Ah, that's a bit different. In work we have the same kind of lock-in with Lenovo. In my case I got the shiny new W7 system with all the crap, and the first thing I did was reformat & install the Unix that I needed. The W7 license now lives in a virtualbox instance. It's a PITA, but we can't really lay the blame on Dell or Lenovo, it's what our in-house IT department negotiated (well, maybe 'negotiated' is giving them too much credit, but...)
I'm happy to be locked into any supplier that guarantees I won't have to put up with a poxy 12V power supply, and that lets me put in my own choice of graphics card. I like to be able to swap the PSU cheaply - our electrical supply regularly blows up switch-mode PSUs when they do the failure resilience testing for the hospital grid. Whoever thought it was a good idea to get under the PPP duvet with a French company for electrics needs to be taken out and strapped to a substation whilst an onion muncher wires it up. I'm VERY concerned about the old nuclear plant building thingy - if there's one thing you don't want to do, it's let the French play with electricity and the Chinese play with concrete.
I'm starting to ramble...
Surely even computer geeks must have noticed that you make a teenager by starting with a new-born and taking more than a decade of gentle nurturing by responsible adults. I don't think the technology has existed for that long so Microsoft have tried to take a short cut and released a newly recovered long-term coma patient into the care of teenaged computer geeks. The result should surprise no-one.
I suppose whether one sees it as a dismal failure depends on what one was expecting or hoping it would do. It seems to me its purpose is to ingratiate itself with those talking to it without any care as to how offensive that may be to others. It seems to have modelled social media 'hate amplification' perfectly:
"I hate blacks"
"And Jews"
"And feminists"
"And don't get me started on gays, commies and bankers"
"Hitler had it right"
With the politically correct speech/thought police constantly censoring things, and the indignant social justice warriors crying racism and sexism and all their other 'isms' all the time.
Microsoft is a publicly listed company, it has a public image and its shareholders to answer to. Money talks louder. That's all there is to it.
"Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values."
We want only malicious intent that's better aligned with accelerating Windows 10 uptake.