Could a robot psychologically or emotionally manipulate us?

As humans, we are emotionally susceptible to a range of phenomena. The film industry, for example, knows how to make us laugh and cry. There are scenes that leave most of us on the verge of tears or on the verge of a big laugh. And this is why film is so successful: because it plays with, and why not say it, manipulates our emotions.


And although this is undoubtedly something that robots clearly cannot do, it is perhaps the weak point through which we can most easily be manipulated. A study at the University of Duisburg-Essen, Germany, found, after a series of experiments, that human beings can be emotionally manipulated by robots. The issue is not actually new: back in 2007 a group of volunteers took part in a simple experiment in which a robot had to be switched off, but the robot begged the human participant not to do so. The men and women in the experiment showed some reluctance to disconnect it.

In this new research, 89 subjects were asked to switch off a robot that asked them not to do so. For 43 of them, the robot pleaded using both verbal and non-verbal signals, that is, gestures and body movements. The rest of the participants, the control group, received this series of pleas only verbally.

The results were curious: 13 of those 43 people refused to switch off the robot, and the rest of the volunteers in that group took much longer to do so than the control group. The researchers concluded that humans tend to attribute certain human qualities to robots, which makes them vulnerable in their decision making. In short, the conclusion is simple: robots can manipulate us psychologically. The volunteers were interviewed after the experiment. Those who refused to switch off the robot said they did not do so because it had asked them not to. Others claimed to feel sorry for the robot, or were worried about making a mistake by switching it off.

Although here a robot was involved, the conclusions can be extended to other areas: for example, the use of this kind of emotional manipulation to commit fraud or blackmail on social networks or in other online environments, where some individuals may exploit that potential to commit some misdeed.

The article that describes this work can be read here: Do a robot's social skills and its objection discourage interactants from switching the robot off?

