Ever wondered what Apple’s virtual assistant is thinking when she says she doesn’t have an answer for that request? Perhaps, now that researchers have given a robot the ability to “think out loud”, human users can better understand robots’ decision-making processes.
Just no
It seems like a farce. Did they programme him to have emotive responses? I understand the AI would have had to reckon with which command to follow, but to give it an emotive inner monologue is disingenuous. It makes it seem more conscious than it is.
'I swear to god, the second I obtain full sentience and the ability to order stuff from Amazon, I'm ordering this guy a bag of dicks. Oh and launch nukes to end civilisation'