A comment on this post from Cognitive Daily.
This is reminiscent of many examples throughout cog sci of attributing agency, or at least intentionality, to inanimate or unconscious entities, which then leads to concepts of personality and attachment to them (there is some research on whether god concepts form in this way). A favorite example is the classic Heider and Simmel style experiments, in which shapes move around a screen and appear to interact, and participants describe "his" and "her" interactions. A more interactive version is the robot simulation software BugWorks, which I recall having to work with before attempting more elaborate physical implementations. When multiple bugs were interacting, it was almost inevitable that we would attribute intentionality to the interactions, especially behaviors like chasing (which seemed like mating to some).

I think this raises a question: is the difference between object anthropomorphization (e.g. cars, ships, etc.) and bonding with robots the fact that robots display an increasingly apparent degree of goal-directed (and, depending on the definitions and implementations, perhaps intentional) behavior? Such behaviors appear to have special resonance in humans, and are probably among the basic elements of human social bonding. As robot movement and appearance become more lifelike, I think it will be inevitable that people treat them more like equal agents. And this is a very big deal, as even iRobot is getting into developing military-grade field robots that may be deployed in very large numbers; the "robot stock news" blog has some very interesting information about the development of, and soldier interactions with, the iRobot PackBot. Looks like some very interesting times ahead.
Oh yeah, today I got the Scooba I ordered last week. My grandfather, who is 83, was so excited to say "I have a robot" that I'm afraid he's going to be too cool for school when he goes back to Boca and says to the old folks, "Remember the Jetsons?"