When someone “likes” an online comment, other people become much more likely to give the comment a thumbs up, too, suggesting that our opinions and decisions may be at the mercy of what others seem to think, at least when reviews are positive.
Negative sentiments don’t have the same influence, a new study found. In fact, when people saw a thumbs down, they became more likely to correct it with a thumbs up, particularly when the topic concerned politics or other weighty subjects.
Besides offering a window into the subtle interplay of human nature and online behavior, the new findings show how herd mentality can have ripple effects on everything from what people buy to how they vote.
By showing how our opinions are vulnerable to the arbitrary votes of others, the study may help people make better decisions.
“I think it cautions the online user or consumer to be skeptical of ratings and to consider that a rating might be the result of some social process and is potentially fraudulent or manipulated, rather than putting so much weight on the idea that, ‘Well, if the crowd says it’s a good product, it must be a good product,’” said Sinan Aral, a managerial economist at the Massachusetts Institute of Technology in Cambridge, Mass.
“A popular product may have been rated highly today because it is a good product -- or because it was rated highly yesterday.”
In today’s digital world, people frequently turn to online ratings when making decisions about hotels, movies, news reports, even presidential candidates. And according to recent research, Aral said, two-thirds of online shoppers say they trust reviews that are posted on the web.
To test whether online ratings deserve the weight that people put on them, Aral and colleagues designed an experiment using a news aggregation website, much like Digg or Reddit, where users post articles and add comments. They can also like or dislike comments left by others.
For the experiment, some comments were chosen at random to receive either a positive or negative rating. Over the course of five months, the study randomly rated more than 101,000 comments, which were viewed more than 10 million times and given more than 300,000 subsequent ratings by users.