Insanity.

https://www.forbes.com/sites/kevinkruse/2018/11/28/your-boss-will-be-a-robot-and-you-will-love-her/#566b82533264

3. You and your robot boss will care about each other

Well to be more precise, you will care about your robot boss and you will feel that your robot boss cares about you.

Does your current boss care about you? When is the last time they asked you about your weekend? Do they know the names of your kids? Do they even know if you have kids?

Don’t worry, in just a few short years your new robot boss will know about your family, friends, and hobbies. When you throw a party on Saturday for your son’s birthday, your robot boss will ask on Monday, “How was Owen’s birthday?” After the Dancing With The Stars season finale (yes, that show will still be on), RoboBoss will say, “Did you watch it? I told you Hillary would win!”

And by running continuous sentiment analysis on your language, and scanning your facial microexpressions, RoboBoss will know if you are happy, angry or just a little bit stressed.

And you will care back. We already do. It’s called the Tamagotchi effect: the tendency of humans to attach emotionally to inanimate objects. Dr. Julie Carpenter has documented how US soldiers feel loss when their field robots are destroyed. Researchers at the Georgia Institute of Technology have studied the “intimate relationship” that people experience for their Roomba vacuuming robot. And if that isn’t weird enough, men in Japan have begun to marry holograms.

That excerpt is from the Forbes article mentioned in the previous blog post by my colleague. So much is wrong with accepting robots as caretakers of our emotional states based on an illusion of care.

This could be the most dangerous thing humans have done since the creation of mythic skygods.

This part truly irks me:

Does your current boss care about you? When is the last time they asked you about your weekend? Do they know the names of your kids? Do they even know if you have kids?

Basically, this is saying that because someone doesn’t fall within someone else’s Dunbar radius of caring, we should replace that person with a machine that has no care at all? Controlled by another company? How could this ever work out to our benefit?

Not only is this a blithely immoral thing to say, it is also far, far more difficult than the author supposes. Not the part about creating a machine that simulates caring, but creating one that actually knows what to do to accomplish a goal and to respond to change.
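And the simulation really is the easy part. Purely as an illustration (my own sketch, not anything from the Forbes piece), here is roughly what the “continuous sentiment analysis on your language” amounts to today, assuming Python and NLTK’s off-the-shelf VADER lexicon, with made-up example messages:

```python
# A toy version of the "continuous sentiment analysis" the Forbes piece imagines.
# Assumes: pip install nltk, plus a one-time download of the VADER lexicon.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

# Hypothetical snippets of an employee's messages over a morning.
messages = [
    "Happy to pick this ticket up, looks straightforward.",
    "Still blocked on the deploy, this is getting frustrating.",
    "Fine. Whatever the robot wants.",
]

for text in messages:
    scores = sia.polarity_scores(text)  # returns neg/neu/pos/compound scores
    mood = "stressed" if scores["compound"] < -0.2 else "okay"
    print(f"{mood:>8}: {text}  (compound={scores['compound']:+.2f})")
```

A few lines of lexicon lookup, and RoboBoss “knows” you’re stressed. That is exactly the problem: the illusion of care is cheap, while the competence the job actually requires is not.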
