More threads by mikepetersonwi
Joined: Nov 7, 2018 · Messages: 77 · Reaction score: 11
I've always enjoyed reading what others have to say with regard to local search ranking factors. I've especially enjoyed the survey results most recently published by Whitespark (2020 Local Search Ranking Factors | Whitespark).

My question is, where do I look for the most recent survey results? If there hasn't been anything updated since then, are there other resources, reports, or studies that can confirm what was last reported is still somewhat accurate?

Thanks in advance.
 
Solution
The survey is now out to all the participants. It's a time-consuming survey, so I give them a couple of weeks to complete it. I aim to have all the data analyzed and published before the end of February.
Hey there! The LSRF has ended up on unfortunate timing: I publish it at the END of the year in November/December, and then January rolls around and it already feels outdated because it's "last year's" version.

I am currently working on getting the updated survey prepared and out to participants in the next couple of weeks. I want to get all the results in, the data crunched, and have it ready to publish in early January. That way the 2023 LSRF covers the whole year, not just the end of it. I'm skipping 2022 so that I can adjust this cadence.

Good stuff. So for the skipped year, couldn't you average the two and call it an estimate?

Keen to see what the feedback is on keywords in reviews. I've had experience to the contrary of a recent post.
 
Reviews and NAP don’t hold the same weight as they used to. If it’s the review study I think you’re referring to, then you can take that data to the bank. If you’d like to refute it, I’d be open to reading your study on it. I want to set the standards for how the study needs to be conducted. You can’t use a GBP that’s an exact match for the terms you want to target. If you are targeting a geographic term, it can’t be part of the business name, nor can the industry or similar keywords be used. The business has to be established, with a minimum of 20 reviews prior to the new reviews with the target keywords being added. I want to see a Brightlocal and PlacesScout before-and-after study. No other work can be done; it must be reviews only. Do you accept my challenge?
 

"Reviews and NAP don’t hold the same weight as they used to." That could now be the case, but it hasn't been my experience for most of this year. And with only a few reviews a month, it will be a while before my perception changes.

I'd also prefer the business name not to carry so much weight. That could have changed in the last few weeks or so.

I read your comments as saying there's been a very recent update to the weight on business names and review keywords?
 

When I said NAP, I meant NAP consistency. People put too much stock in NAP consistency and citation clean-up services. Does it matter if your name, address, and phone number are wrong on a small citation platform that nobody uses or visits? No. Will Google penalize your rankings over it? No. People should focus on the main ones: GBP, Yelp, FB, Yahoo, Bing, and Apple Maps if eligible. I wasn't talking about keywords in the business name.

My comments weren't about a new update; it was more about how people keep believing old, outdated myths.
 

Cool, thank you.
Edit:
"GBP, Yelp, FB, Yahoo, Bing, and Apple Maps"

What on Yahoo should we consider?
 
"Reviews and NAP don’t hold the same weight as they used to. ... Do you accept my challenge?"

Coming in hot I see lol
 
So just to be clear, I never said there was a cap. All we concluded in the study I believe you're referring to (Does the Number of Google Reviews Impact Ranking? [Case Study] - Sterling Sky Inc) was that there seemed to be a bump at 10 and that the bump didn't happen again when they got to 30. I've heard people say there might be another bump at 100, so there could be other thresholds.

The keyword study we did was the best I could possibly think of. A business with no viable SEO during a season where we were the only ones leaving reviews. It is very hard to test this in isolation without tons of fake listings. However, if you're skeptical, I would just say do your own studies and publish them - there is a big lack of that in our industry. Lots of opinions with very few things published :) This is part of why I publish lots of things we discover.

We actually did a test before the Christmas tree farm one, on a fake listing I had set up, using a keyword that didn't actually exist. We had people leave reviews talking about a "foliette restaurant" when that's not actually a thing lol. It had no immediate impact either, but I never published it because the listing got suspended shortly after the test was done, so I didn't get to see how it looked months later. The issue I have with using brand-new keywords is the usefulness of it. If review text (not the review itself) didn't have enough of an impact to move the needle on something you already rank for, would it be worth much even if it did help for something new? Practically speaking, what business is going to actually benefit from that? I'm sure there are cases where businesses fail to ever put something on their website or anywhere else on the listing, but I can't say I run across that often.

Hey Joy, on the first one, that was mostly just responding to one of Keyser's points where he referenced another big name, Mike Blumenthal. But thanks for commenting on it as well.

I have to get us to a point where we're comfortable posting publicly, but I'm working on it and running tests and compiling case studies all the time! We've always kept our heads down, but I'll keep nudging us in that direction. I completely agree on your point about how hard it is to test single variables. I've been leaning more towards aggregate testing against a control group to try to weed out those x factors we can't control, and it seems to be working pretty well. Then again, a test like this stretched across a bunch of profiles would be time-intensive and risky. I'm sad that listing got suspended, but also a little reassured?

The practicality argument is valid, but what about thematically relevant keywords that aren't explicitly referenced in site content? If a client was unwilling to pay for content on X longtail keyword, I wonder if getting it referenced in reviews could help. They could coach their customers on what keyword to reference, and that new longtail keyword could be measured. Obviously, in a perfect world, they'd just pay us to write the content they need or roll up their sleeves and send it in, but that can be an uphill battle. Whereas sending them a review template with the right coaching in it could get the keyword to show up in reviews for free with minimal effort.

A relevant example here is tree varieties. If their site doesn't include any specific tree types, then reviews referencing the specific species of trees they sell could be an interesting test. Anywho, I'll likely remain a skeptical curmudgeon for now, but I definitely want to run some testing, or at least observational studies, on it at some point. I will most definitely test location keywords in this realm sometime.
Side note, how long were most of those reviews? I'm curious as to whether there could be a certain character count threshold for getting review text to carry meaningful weight.
 
They were a few sentences long. Nothing crazy (some examples below). I have another article coming out shortly addressing review length. I spent about 18 months doing various tests and analyzing stuff so there are about 5-6 more articles coming out on reviews in the coming weeks.

[Screenshots of example reviews: Screenshot_20221107-182444.jpg, Screenshot_20221107-182424.jpg]

Ooh! I'll keep an eye out for them!
 
"The survey is now out to all the participants."

How do you decide who the participants are?
 
