Lol @ Who's your daddy. Hadn't seen that one anywhere.
The problem with Siri is that it still seems like it's responding to a query in a predetermined (though not completely predictable) way and not really having a conversation. That's what's going to be the big leap in such AI assistants. Like in this convo (assuming Siri didn't pull up a fake joke calendar at the end there, which is highly unlikely)...
It correctly assumed that Joshua was joking when he said he had killed someone. But as soon as he responded with a query which could have been a legit question if he hadn't started this convo as a joke, Siri gave the proper answer instead of continuing with the joke or telling the guy to knock it off and get back to work. It didn't understand that Joshua meant to ask what to do with the fact that he killed someone.
In an ideal world, Siri (or other such AI assistants) should be able to remember all its convos with its boss, and be able to learn and decide when he is kidding and when he's serious, instead of simply trying to come up with the best possible reply for a specific query. Now that will be something really awesome.