46,000 mm² silicon die
I.e. 214 mm on a side if square. That's bigger than a 200 mm wafer. I wonder what the yield is?
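A quick sanity check on that arithmetic (a minimal sketch using the inscribed-square rule of thumb — the largest square that fits on a circular wafer has a diagonal equal to the wafer diameter — not any fab's actual reticle math):

```python
import math

die_area_mm2 = 46_000  # quoted die area

# Side length if the die were a perfect square
side_mm = math.sqrt(die_area_mm2)
print(f"side: {side_mm:.1f} mm")  # ~214.5 mm

# Largest square inscribed in a circular wafer: side = diameter / sqrt(2)
for wafer_d_mm in (200, 300):
    largest_mm = wafer_d_mm / math.sqrt(2)
    print(f"{wafer_d_mm} mm wafer fits a {largest_mm:.1f} mm square at most")
```

By this estimate a square die of that area wouldn't even fit on a 300 mm wafer (212.1 mm max side) without trimming the corners.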
Let's kick your week off with the latest happenings in the world of AI and machine learning. "We don't have enough first-person shooter videos": Facebook has admitted it couldn't stop the Christchurch mosque shootings because it didn't have enough "first-person footage of violent events" to train its algorithms. A gunman …
...Battlefield gaming streams? Call of Duty? Counter-Strike?
There must be billions of hours on Twitch and YouTube alone. But then again, can it tell a game from actual people shooting? It becomes important at this point not to flag the entirety of e-sports as terrorism.
I would hope they do teach their AI to detect the difference between game footage and real attacks. But this might become more and more difficult as game engines get more sophisticated and look more realistic with each generation. So hopefully they won't rely 100% on AI to decide whether to block a stream, but will pass it over to real humans to make the decision.
But then, if more of the sickos who watched the live stream from Christchurch had reported it using the tools already available on Facebook, it might not have stayed online as long as it did.
That would probably be quite difficult to pull off correctly. Games are becoming more and more 'realistic' every day. Even if they do figure out how to automatically determine where the line is between real and virtual, I would be afraid of shitbags putting filters on their videos to reduce the realism just enough to fool the AI into considering it game footage, kicking off a technological arms race.
The only things I can think of to head off an arms race between terrorists and the AI builders would be to:
a) implement some kind of steganographic function in game footage for identification (but you'd need some way to prevent that same steganographic feature from being re-implemented as a video filter)
b) create a delay/human-censor system before a stream can be broadcast (it would be prohibitively expensive and difficult to create, plus there are issues with needing some kind of oversight system to ensure the censors act appropriately; not to mention the whole censorship concept)
c) End live streaming altogether
d) expand the AI to also pull in contextual information from outside the stream, e.g. if an act of terrorism is reported near where the stream is being created, the broadcast gets cut off
-or-
e) the most difficult and expensive: begin tackling the underlying social issues that cause people to engage in terrorist activity in the first place.
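Option (a), in its crudest form, could be a least-significant-bit watermark stamped into each rendered frame by the game engine. A hypothetical sketch (the function names and the 4-bit tag are made up for illustration; a real scheme would have to survive video compression and, as noted above, resist being re-implemented as a video filter):

```python
MAGIC = 0b1010  # hypothetical 4-bit "rendered by a game engine" tag


def embed_tag(frame: bytearray) -> bytearray:
    """Stamp the tag into the least-significant bit of the red channel
    of the first four pixels (RGB layout, 3 bytes per pixel)."""
    out = bytearray(frame)
    for i in range(4):
        bit = (MAGIC >> i) & 1
        out[i * 3] = (out[i * 3] & 0xFE) | bit  # red byte of pixel i
    return out


def read_tag(frame: bytes) -> int:
    """Reassemble the 4 tag bits from the first four red bytes."""
    return sum((frame[i * 3] & 1) << i for i in range(4))


# A dummy 1280x720 RGB frame filled with mid-grey
frame = bytearray([128] * (1280 * 720 * 3))
tagged = embed_tag(frame)
print(read_tag(tagged) == MAGIC)  # True
```

The weakness the parent comment anticipates is visible right in the sketch: anyone who knows the scheme can apply the same four-line loop as a post-processing filter to real footage, which is why plain LSB marking alone can't settle the game-vs-reality question.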
"Though it's not the most impressive thing to come out of OpenAI, the video of the bots playing is pretty adorable."
"Over time, the hiders have learnt to drag boxes to obscure gaps and seekers have noticed they can use ramps to jump over the walls to get to the hiders. After several million games during training, the hiders learn to drag the ramp into the room with them and obscure any open spaces with the boxes to prevent the seekers getting to them."
There was an "adorable" episode about an AI like this being used in the very near future:
https://en.wikipedia.org/wiki/Metalhead_(Black_Mirror)