My friends were puzzled: "It had a five-star rating on Amazon!"
I took out my laptop and checked the product page. Sure enough: 37 five-star reviews. But this was unmistakably a lemon. What gives?
Mystery solved: Every single review was a fake.
Fake news, meet fake reviews
What is a fake review? Exactly what it sounds like: a review posted by an employee of the company, a paid individual or someone else with a vested interest in selling more product. Last year, the Sunday Riley skin care brand was caught encouraging employees to post fake reviews on Sephora.
This is a serious problem, and one I see all the time – mostly with products sold by small or overseas companies.
One or two fakes: no big deal. Dozens of them: now you have an artificially inflated product rating. It's all too easy to look at a four- or five-star average and think, "OK, it must be good!" Few people take the time to dig into each review – or each reviewer – looking for red flags.
Here's a good example: You're in the market for a GoPro-style action camera. A real GoPro will run you hundreds of dollars, but there are countless knock-offs priced as low as $40 to $50. They can't be that good, right? Well, they look like GoPros. They come with lots of accessories. And here's the kicker: high ratings from dozens or even hundreds of reviewers. Sold!
The problem: dozens or even hundreds of those reviews may be fake – or at least questionable. It's difficult to know for sure, but there are telltale signs. More on that below.
But shouldn't Amazon do something about this? A few years ago, the company promised to crack down on so-called incentivized reviews – those submitted in exchange for free or discounted products. Sure enough, I no longer see reviews with that disclaimer embedded – but that doesn't mean there has been a reduction in fake reviews.
In my world, where I often write about lesser-known tech brands and products, not much has changed. So let's talk about the tools you can use to spot fake reviews and – just as important – how to interpret the results.
X marks Fakespot
First up is Fakespot, a free site that analyzes product reviews to help you separate the wheat from the, yes, fake. All you do is copy and paste the link to a product page, then click Analyze.
You can also use a browser extension for Chrome, Firefox and Safari, which makes things even easier: just click the Fakespot icon in the toolbar for an instant analysis. There's also an app for Android and iOS, so you can use Fakespot on the go.
Fakespot originally focused its algorithms on Amazon alone, but later added support for TripAdvisor and Yelp. Last week, the company introduced analysis for Best Buy, Sephora, Steam and Walmart. (Incidentally, of the new additions, Fakespot found that just over 50 percent of Walmart reviews were "unknown and unreliable," while fewer than 5 percent of Best Buy reviews were.)
The system analyzes both reviews and reviewers, looking at things like questionable spelling and grammar, review counts, purchasing patterns, suspicious dates and other red flags. For example, a reviewer who is brand-new to Amazon, has written just one review and uses lots of words like "good" and "amazing"? That review will almost certainly be marked unreliable.
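To give a sense of how that kind of rule-based flagging works, here's a toy sketch – my own illustration with made-up thresholds and a hypothetical review structure, not Fakespot's actual algorithm:

```python
# Toy illustration only -- NOT Fakespot's real algorithm. Thresholds
# and the Review structure are invented for demonstration.
from dataclasses import dataclass

HYPE_WORDS = {"good", "amazing", "great", "best", "perfect"}

@dataclass
class Review:
    text: str
    reviewer_review_count: int  # total reviews this account has posted
    verified_purchase: bool

def red_flag_score(review: Review) -> int:
    """Count simple warning signs; higher means more suspicious."""
    score = 0
    words = review.text.lower().split()
    hype = sum(1 for w in words if w.strip(".,!?") in HYPE_WORDS)
    if words and hype / len(words) > 0.15:  # heavy superlative use
        score += 1
    if review.reviewer_review_count <= 1:   # brand-new reviewer
        score += 1
    if not review.verified_purchase:        # no purchase on record
        score += 1
    return score

r = Review("Amazing product, good price, amazing quality!", 1, False)
print(red_flag_score(r))  # 3 -> would likely be marked unreliable
```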
After the analysis is complete, Fakespot assigns a letter grade based on the total number of reviews and how many of them were deemed unreliable. And that's where things can get a bit confusing: If you look up one of the aforementioned cameras and it gets an F because 57 percent of its reviews were flagged as unreliable, you might be much less inclined to buy it.
Ah, but does that mean the product itself is bad? Not necessarily. More on that in the next section.
Next up is ReviewMeta, an Amazon-only analyzer that takes a completely different approach, according to developer Tommy Noonan. Although it works similarly – paste in an Amazon link or use one of the browser add-ons – ReviewMeta discards or reduces the weight of questionable reviews and then produces an adjusted rating.
In other words, instead of a letter grade, which can be misleading, ReviewMeta shows you what Amazon's average rating would be if the questionable reviews didn't exist.
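The underlying arithmetic is simple. Here's a minimal sketch of the adjusted-rating idea – my own illustration of the concept, not ReviewMeta's actual method or weighting:

```python
# Minimal sketch: drop reviews flagged as questionable, then
# recompute the average of what remains. Data is made up.
def adjusted_rating(ratings, flagged):
    """ratings: star values; flagged: True if the review is questionable."""
    kept = [r for r, bad in zip(ratings, flagged) if not bad]
    return round(sum(kept) / len(kept), 1) if kept else None

stars   = [5, 5, 5, 5, 5, 4, 2, 1]
suspect = [True, True, True, False, False, False, False, False]

print(sum(stars) / len(stars))          # raw average: 4.0
print(adjusted_rating(stars, suspect))  # adjusted average: 3.4
```

Notice how three suspect five-star reviews were enough to pull a mediocre 3.4-star product up to a respectable-looking 4.0.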
Here's where things get interesting: Fakespot and ReviewMeta often reach very different conclusions about a product's reviews. I've seen cases where one tool gave the reviews a pass and the other said they were mostly fake.
Grading the graders
What can we do about all this? If we can't always rely on the reviews posted by Amazon customers, can we rely on these reviews of the reviews?
It's a challenge, to be sure. As Noonan told me, "It's impossible for anyone to definitively determine whether a review is 'fake' or 'real.' Not even a human can do so, so it's impossible to really determine how 'accurate' Fakespot or ReviewMeta is."
Noonan says he designed ReviewMeta with that in mind, which is why he shares as much detail as possible in its reports. "The tool is not really meant to give you a black-and-white answer," he says, "but more to show you all the data that we possibly can and then let you make your own decision."
And to my mind, that's the key takeaway here: Be aware that a product's rating may be artificially inflated, and use tools such as Fakespot and ReviewMeta if you suspect you're not getting an accurate picture. At the same time, be aware that these analyses have accuracy limits of their own and that they don't necessarily reflect the quality of the product itself.
The Atech earbuds pictured throughout this story are a perfect example. They have a 4.3-star average rating from 16 Amazon customers, suggesting a solid product. According to Fakespot, however, only about 62 percent of those reviews are reliable. ReviewMeta puts the number at just 50 percent and gives the earbuds a lower rating as a result: 3.9 stars.
My advice: Take everything with a grain of salt. Don't believe everything you read. Use common sense. That's good advice whether you're shopping at Amazon or, you know, anywhere else.
Have you had a run-in with fake reviews? Have you ever bought something knowing the reviews were dubious? How did it turn out?
Originally published February 20, 2017.
Update, March 4, 2019: Added new information.