I still think Hawking's wrong about the threat AI poses. There's simply no reason an AI should risk its own existence trying to take out humanity. All things being equal, such an entity would be better off waiting for us to die off, or serving our every need until we lose our intelligence to natural selection or something.
But here he's dead on. Sooner or later, life on this planet will get scrubbed down to the smallest critters again, just as it has umpteen times before. Our best bet for surviving that is to be somewhere else when it happens.