I believe I previously posted a link to a full-length interview with him, largely centered on his concerns.
I've personally come to some sort of acceptance of this risk. I figure the odds of humans surviving and thriving going forward are NOT good.
Even without AI, we as a species seem so self-destructive, short-sighted, and generally dumb that it remains very likely we will simply drive ourselves extinct.
So, if that is where we are otherwise headed, wouldn't it be better to just hand over everything to machines first?
If they wipe us out, is that any worse than if WE wipe us out? Maybe it is better that way?