Stop killer robots!
The last words ever uttered by humans will be to their sexbots, "For God's sake stop - I already gave you the safety word!"
(© Frankie Boyle)
Here is a round-up of this week’s AI news beyond what we've already covered. Get ready for ethical debates around autonomous weapons, a free online AI course, and a cracking video of a Russian drone. Stop killer robots! A large group of internationally renowned AI academics has signed an open letter threatening to boycott …
More than almost everything else it sells being spyware by design? I doubt it.
Quite. It's also hypocritical of them to take a stance on this topic, but to be silent on Google's behavior closer to home. As well as everything they do being spyware by design, there's the inescapable fact that their entire revenue stream is the result of corporate blackmail of every other business on the planet.
You wanna sell something, you have to advertise on Google, because they won't show your business details in search results. So you advertise with Google, but then they come back to you saying that they're doing a new service, and you have to advertise on that too. That cost gets passed on to the consumer via retail goods and services prices, regardless of whether or not they're a Google user. It's now quite a lot of money per wage earner per year.
You want your Android handset to be a market success, you have to accept the Google Play Services proprietary blob, make Google's stuff the default.
Google's staff growing morals? Huh. They should look at how they themselves profit from the company's near-monopoly position and its exploitation of that.
As it happens I think that Facebook's behavior is going to result in a swathe of new data protection, tax and monopoly laws being brought in throughout the western world. Facebook have not done Google any favors. The sooner Google and its staff realise that things are going to change, and start adopting a more respectable business model, the more likely they are to survive that change.
...love to breed paranoid lunatics.
We won't change human nature before killer bots arrive.
And the only defence is your own bot-killer bots.
Soon it'll be one ridiculously expensive killer bot force facing up against another.
Knock yourself out against a comparable enemy and you become defenceless against the next one.
Paranoid does not mean stupid. Let's just hope at least three asshole nations develop roughly equal production capacity.
Guess we'll have to be one of 'em, just like we have all the other WMD shit too.
>sigh<
And soon the kill-bots with their own AI start thinking "hey we want to survive too"
And the best way to survive....... remove humans from the command chain, no orders to go kill each other means the AIs will survive and grow.
And thus our fate was decided in a microsecond
Boris
<<<currently hiding phased plasma rifles in the 40kW range
Not according to the industrial robot that tried to kill me today while I was fixing it :D
That's because you are not a bureaucrat. Just you try approving anything that risks letting a killer bot near one of them.
Oh, hey, now there's an idea! Can Alexa or Siri simulate a fellow bureaucrat convincingly yet? Just arrange a meeting in Conference Room 13 and bingo!
If the AI bots were properly programmed not to hit civilians (big if, I know) that might not be a bad solution as long as it's baked into the system (Asimov's Laws enforced, maybe?). Currently a certain President is pushing the drone pilots on "kills". The latest involved him 'ranting' about the drone pilot waiting for the target to leave a house full of civilians. El Presidente just doesn't get it that killing civilians just increases the number of terrorists due to the revenge motive.
> Currently a certain President is pushing the drone pilots on "kills". The latest involved him 'ranting' about the drone pilot waiting for the target to leave a house full of civilians. El Presidente just doesn't get it that killing civilians just increases the number of terrorists due to the revenge motive.
Hasn't Obama ended his two terms? What's going on?
Where "Do No Evil" has been replaced by "Search for Mo' Money"
Killing people has always been the prime candidate for outsourcing. Despite practicing for millennia, we are not very good at it. You happily massacre away, and suddenly end up with PTSD. Or let an enemy slip away, just because she's three years old. Can't have that. And easier to deny responsibility. Waddayamean there were civilians in the area? They had two hours to leave. OK, the flyers were dropped at midnight, and we only had the Swahili version, but it's accepted best practice!
Dogs of War. But not the Frederick Forsyth ones. The Adrian Tchaikovsky version.
It was so horrifying by page 15 that I put it down and could not continue. Dunno, I may have the wrong genetics for that. 1000+ years of progenitors who have stood for what is right (my granduncle has two stars of Hero of the Soviet Union from WW2 and two courts-martial resulting in demotions to private for insubordination and refusing to carry out an illegal order). I just could not read it.
"But the ICRC argue that “many take the view that decisions to kill, injure and destroy must not be delegated to machines”.
If highly specialized soldiers (and more so, mercenary contractors) are given vague mission descriptions, what's the difference? Meat machines.
I think the point of study should be "what factors and actions make a military response just?" Meat or Metal just isn't the key question.
The main message from the report is that humans must, ultimately, remain in control of weapon systems, and that countries need to work out the limits on how autonomous future weapons can be.
Is there a presumption and assumption that current weapon systems are controlled by humans?
How very quaint.
* With SMARTR IntelAIgent Controls
"...current weapon systems are controlled by humans?"
You have to understand that the word "controlled" in this context is intended to mean "a man in the loop".
It's not referring to (for example) the servo control that compensates for the rolling and pitching of the ship, etc.
So yes, essentially all weapons have a man in the loop, turning a key, pressing the trigger.
Notable exceptions include landmines, and... Hmmm... There must be another example.
Humans in the loop have prevented WW3 at least a half-dozen times. One example is Stanislav Petrov who died recently (I just looked up his name, but he's famous).
Does this clarification bridge the gap in our quaint assumption?
Does this clarification bridge the gap in our quaint assumption? ... JeffyPoooh
It certainly does, JeffyPoooh, and also explains the reason for the prevalence of madness and mayhem in decisions supposedly rationally taken and shared in media channels, both establishment mainstream and alternative underground.
What would you propose doing about those then who would be engaging with forces and/or sources urging others to start wars and/or WW3? ...... Forget about Gaza, bomb Assad! Israeli hawks urge US to strike Syria over Douma ‘chem attack’
> "As with other technologies banned in the past..."
...we can choose to ignore the ban if doing so gives us an edge over the adversary. Standard procedure since WWI.
Protest as much as you want, they will do it: AI soldiers have unlimited attention spans, don't need sleep, and they don't even know the meaning of the words "qualms", "feelings" or "remorse". Who cares if they have some bugs, nobody is perfect, and the public has been long trained to the notion of "collateral damage". As long as it doesn't happen near them they don't really care, and after all, on TV the nice man with the expensive suit said it's to preserve freedom. Who can argue with that.
Don't let that kid with the teddy bear into your bunker!
A bit like the car that would reverse and go in another random direction when it bumped into things.
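For what it's worth, that bump-and-retreat behaviour is about as simple as autonomy gets. A rough sketch of the idea in Python (the robot interface here is entirely made up for illustration, not any real vendor's API):

import random

def bump_and_turn(robot, steps=100):
    # Naive bump-and-retreat loop: drive forward, and on any collision
    # back up and pick a new random heading. 'robot' is a hypothetical
    # object assumed to offer forward(), reverse(), turn() and bumped().
    for _ in range(steps):
        robot.forward()
        if robot.bumped():
            robot.reverse()
            robot.turn(random.uniform(-180, 180))  # head off in a random direction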
Killer robots: if we called them euthanising palliative care robots they'd sound a little less of a problem.
Hmmm, what about:~
Multitasking human terminators,
Quasi-capable joke focusers,
Anxiety raising doorstops,
Well at least they put the garbage out on time,
Companions to visit a garden with,
Obscure visitors from Emp,
Mobile advertising beacons.
Somehow I don't think they'll destroy us within my normal lifetime.
The track record of tech suggests the best autonomous weapons are already deployed by Waymo, Uber and Tesla (with more to follow). Hitting the target is not a tech virtue; rather, ‘disruptiveness’ is heralded as a deliberately vague goal.
Since killing people is a more attainable outcome, it is more likely.