I've always enjoyed reading what others have to say with regard to local ranking factors. I've especially enjoyed the survey results most recently published by Whitespark (2020 Local Search Ranking Factors | Whitespark).

My question is, where do I look for the most recent survey results? If there hasn't been anything updated since then, are there other resources, reports, or studies that can confirm what was last reported is still somewhat accurate?

Thanks in advance.
 
Here is my post.

Agreed with Claudia, great read and good info.

However, I'm not sold on the assertion that there's a cap on the ranking impact of review count, though I wouldn't be surprised at all if the effect gradually diminishes as counts climb. I pretty regularly see businesses with high review counts compete in relatively distant locations against closer businesses with lower counts. Could user behavior and on-page SEO be contributing? Of course, but I'd love to see a detailed study to push me over the edge, because this is a pattern I see very regularly in my day-to-day work.
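To be concrete about what would convince me, here's a rough sketch of the kind of first-pass check I have in mind, run against whatever rank-tracking data you already collect. Everything in it is hypothetical: the file name, the columns, and the choice of a Spearman correlation as a crude starting point. It's a sketch, not a finished methodology.

# Hypothetical sketch: does local pack rank track proximity or review count more tightly?
# local_pack_observations.csv is made up -- export one row per (query, listing)
# from your own rank tracker, with the listing's rank, its distance from the
# search point, and its review count at the time.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("local_pack_observations.csv")  # columns: rank, distance_km, review_count

rho_dist, p_dist = spearmanr(df["rank"], df["distance_km"])
rho_rev, p_rev = spearmanr(df["rank"], df["review_count"])

# If proximity dominated, rank should correlate with distance far more strongly
# than with review count (and review count should correlate negatively with rank).
print(f"rank vs. distance:     rho={rho_dist:.2f} (p={p_dist:.3f})")
print(f"rank vs. review count: rho={rho_rev:.2f} (p={p_rev:.3f})")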

All the respect in the world for @JoyHawkins and her team, but I have some reservations about that review keyword study as well. (This is all complete speculation, having not seen the site or its GBP firsthand.) Semantically, Google had probably already identified this business as a Christmas tree farm, and we know that keyword density is no longer a ranking factor, even for on-page SEO, so I'm not surprised that adding keywords around Christmas trees didn't move anything, especially if older reviews already mentioned Christmas trees or if someone had already said they were fresh.

I'd love to see data on a business adding brand-new keywords (maybe the Christmas tree farm starts selling Christmas ornaments, or wreaths, or trees for landscaping), or featuring reviews with location-based keywords (maybe reviewers mention that even though they moved to [city an hour away], they still drive all the way back to [business location city] for this business).

The thought also comes to mind that maybe, hidden away somewhere in Google's algorithm, there's a weighting system for seasonal businesses that adjusts how much a review counts depending on whether it was left on-season or off-season. Perhaps Google is automatically suspicious of reviews left in the off-season, but not enough to hide them. Or maybe Google only re-evaluates ranking for seasonal businesses when the season opens again. Truth be told, I have zero experience with seasonal businesses, so that last bit could all just be nonsense ramblings.

Anyway, please don't take any of this confrontationally! I'm just not sold, but I respect you both and I really want to be sold lol.
 

I worked for Joy for 8 months and I know the level of detail she puts into her studies. If you have ever seen her present at a virtual LocalU event and seen how she figured out how to beat Dave in Mario Kart, that is the same level of intensity she puts into her data studies.

I too have been focusing on online reviews since December 2016, and I have invested a lot of time and energy into learning as much about them as I can. I have monitored countless businesses that I reported for fake reviews, and they didn't lose any rankings after losing those reviews. I have a love/hate relationship with reviews: I love reviews, but I hate how people have perverted them. I wrote my post after hearing a "guru" tell people to reply to their reviews calling out the services and then upvoting the review. When I challenged it, I was told Google may make it a ranking factor in the future. If reviews didn't have any ranking factors, how would your perception change?

I wish the business that lost 600 reviews hadn't lost them. They have several reviews that were posted for the wrong business mentioning payment advances, and I would have loved to see whether a dog groomer could rank for payment services. You are entitled to be a skeptic; that's fine. This is a case where we can agree to disagree. I take your post with no ill will.
 
However, I'm not sold on the assertion that there's a cap on the ranking impact of review count
So just to be clear, I never said there was a cap. All that we concluded in the study I believe you're referring to (Does the Number of Google Reviews Impact Ranking? [Case Study] - Sterling Sky Inc) was that there seemed to be a bump at 10 and that that bump didn't happen again when they got to 30. I've heard people say there might be another bump at 100, so there could be other thresholds.
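If anyone wants to poke at the threshold idea themselves, this is roughly the shape of the comparison; the file and columns are hypothetical placeholders for your own rank-tracking history, so treat it as a sketch rather than how we ran the study.

# Hypothetical sketch: compare average rank just below vs. just above a review-count threshold.
# rank_history.csv is made up -- one row per listing per week from your own tracking.
import pandas as pd

df = pd.read_csv("rank_history.csv")  # columns: listing_id, week, review_count, rank

for threshold in (10, 30, 100):
    below = df.loc[df["review_count"].between(threshold - 5, threshold - 1), "rank"]
    above = df.loc[df["review_count"].between(threshold, threshold + 4), "rank"]
    if len(below) and len(above):
        # Lower rank numbers are better, so a drop in the mean suggests a bump.
        print(f"around {threshold}: mean rank {below.mean():.1f} -> {above.mean():.1f}")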
I have some reservations about that review keyword study as well.
The keyword study we did was the best test I could possibly think of: a business with no viable SEO, during a season when we were the only ones leaving reviews. It is very hard to test this in isolation without tons of fake listings. However, if you're skeptical, I would just say do your own studies and publish them; there is a big lack of that in our industry. Lots of opinions, very few things published :) This is part of why I publish lots of what we discover.
I'd love to see data on a business adding brand-new keywords
We actually did a test before the Christmas tree farm on a fake listing I had set up, using a keyword that didn't actually exist. We had people leave reviews talking about a "foliette restaurant," which is not a real thing lol. It had no immediate impact either, but I never published it because the listing got suspended shortly after the test was done, so I didn't get to see how it looked months later.

The issue I have with using brand-new keywords is the usefulness of it. If review text (not the review itself) didn't have enough of an impact to move the needle on something you already rank for, would it even be worth chasing for something new? Practically speaking, what business is going to actually benefit from that, even if it did have an impact? I'm sure there are cases where businesses fail to ever put something on their website or anywhere else on the listing, but I can't say I run across that often.
 

Hey Joy, on the first one, that was mostly just responding to one of Keyser's points where he referenced another big name, Mike Blumenthal. But thanks for commenting on it as well.

I have to get us to a point where we're comfortable posting publicly, but I'm working on it, and I'm running tests and compiling case studies all the time! We've always kept our heads down, but I'll keep nudging us in that direction.

I completely agree on your point about how hard it is to test singular variables. I've been leaning more toward aggregate testing against a control group to try to weed out those X factors we can't control, and it seems to be working pretty well; see the sketch below for the shape of the comparison. Then again, a test like this stretched across a bunch of profiles would be time-intensive and risky. I'm sad that listing got suspended, but also a little reassured?

The practicality argument is valid, but what about thematically relevant keywords that aren't explicitly referenced in site content? If a client was unwilling to pay for content on X longtail keyword, I wonder if getting it referenced in reviews could help. They could coach their customers on which keyword to mention, and that new longtail keyword could be measured. Obviously, in a perfect world, they'd just pay us to write the content they need or roll up their sleeves and send it in, but that can be an uphill battle. Whereas sending them a review template with the right coaching in it could get the keyword to show up in reviews for free with minimal effort. A relevant example I can think of here is tree varieties: if their site doesn't include any specific tree types, then reviews referencing the specific species of trees they sell could be an interesting test. Anywho, I'll likely remain a skeptical curmudgeon for now, but I definitely want to run some testing, or at least observational studies, on it at some point. I will most definitely test location keywords in this realm sometime.
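For what it's worth, here's a minimal sketch of the aggregate-vs.-control comparison I mean. Everything in it is an assumption on my part: the export file, the column names, and Welch's t-test as a first-pass significance check on the rank movement.

# Hypothetical sketch: compare rank movement for test listings against a control group.
# rank_deltas.csv is made up -- one row per listing with its rank change over the test window.
import pandas as pd
from scipy.stats import ttest_ind

df = pd.read_csv("rank_deltas.csv")  # columns: listing_id, group ("test"/"control"), rank_delta

test = df.loc[df["group"] == "test", "rank_delta"]
control = df.loc[df["group"] == "control", "rank_delta"]

# Welch's t-test (unequal variances), since markets and verticals vary a lot.
stat, p = ttest_ind(test, control, equal_var=False)
print(f"test mean delta={test.mean():.2f}, control mean delta={control.mean():.2f}, p={p:.3f}")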
Side note, how long were most of those reviews? I'm curious as to whether there could be a certain character count threshold for getting review text to carry meaningful weight.
 

Both Joy and I referenced the same presentation from Mike B; she and I were at the same event. I feel that getting into review length loses sight of her study and gets too far into the weeds. I have never seen review length matter, except for when a business got a bunch of star-only ratings, and even then I couldn't isolate whether it was the text-less reviews or Google shifting its ranking results. I am still sold on the fact that businesses and marketers have completely missed the point and significance of online reviews.
 
Side note, how long were most of those reviews? I'm curious as to whether there could be a certain character count threshold for getting review text to carry meaningful weight.

They were a few sentences long. Nothing crazy (some examples below). I have another article coming out shortly addressing review length. I spent about 18 months doing various tests and analyzing stuff so there are about 5-6 more articles coming out on reviews in the coming weeks.

[Screenshots of two example reviews]
 

Ooh! I'll keep an eye out for them!
 
Hey there! I've run into an unfortunate timing pattern with the LSRF where I publish it at the END of the year in November/December, and then January rolls around and it already feels outdated because it's "last year's" version.

I am currently working on getting the updated survey prepared and out to participants in the next couple of weeks. I want to get all the results in, the data crunched, and everything ready to publish in early January. That way the 2023 LSRF covers the whole year, not just the end of it. I'm skipping 2022 so that I can adjust this cadence.

Bump on this? Is it out yet? Any hints on when, if not?
 
The survey is now out to all the participants. It's a time-consuming survey, so I give them a couple of weeks to complete it. I aim to have all the data analyzed and published before the end of February.
 

How do you decide who the participants are?
 

When I notice people around the local search community who are publishing, speaking, and sharing their knowledge of local search, and it's clear that they really know their stuff, I add them to the invite list.
 
