Posted by LightOfHeaven (Member, joined Sep 18, 2025)
We spent the last few months tracking local rankings across multiple industries: beauty, medical spas, senior care, and retail. I'm not talking about checking rank once a day; we measured at dozens of geographic points per keyword, across multiple distance rings from each business, almost every day and often multiple times per day. Our engagement signals came from both logged-in and logged-out users, plus a few unicorn accounts sprinkled here and there.

The response window is 24-48 hours.

First visible movement appears at T+1. Peak response at T+2. By T+3 the signal is either being incorporated into the rolling evaluation or discarded as noise. Engagements count only once per device or account.
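For anyone who wants to replicate the measurement setup, here's a rough sketch of how a multi-ring tracking grid could be generated. The function names, ring radii, and point counts are my own illustration, not the exact tool we used:

```python
import math

def ring_points(lat, lng, radius_miles, n_points=8):
    """Generate n_points evenly spaced on a circle of radius_miles around (lat, lng)."""
    points = []
    for i in range(n_points):
        bearing = 2 * math.pi * i / n_points
        # ~69 miles per degree of latitude; a degree of longitude shrinks by cos(lat)
        dlat = (radius_miles / 69.0) * math.cos(bearing)
        dlng = (radius_miles / (69.0 * math.cos(math.radians(lat)))) * math.sin(bearing)
        points.append((lat + dlat, lng + dlng))
    return points

def tracking_grid(lat, lng, rings=(1, 3, 5, 8), n_points=8):
    """Center point plus concentric distance rings = one rank-measurement grid."""
    grid = [(lat, lng)]
    for r in rings:
        grid.extend(ring_points(lat, lng, r, n_points))
    return grid

# 1 center + 4 rings x 8 points = 33 measurement points per keyword
grid = tracking_grid(40.7128, -74.0060)
```

Each point in the grid gets its own rank check per keyword, which is what lets you see core-zone vs. expansion-zone behavior separately.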

The algorithm smooths every 3-4 days. Here's why that matters.


We noticed a clear pattern: rankings would improve, hold for a couple of days, then partially drop, even when engagement stayed consistent. It happens like clockwork, every 3-4 days.

Our theory: Google knows that businesses get random engagement spikes. Someone runs a weekend sale, gets featured in a local group, or has a busy Saturday. The algorithm can't treat every spike as permanent; rankings would become far too volatile and out of line with their "quality results" standards. So every few days it re-evaluates the competitive landscape, smooths out the noise, and recalculates.

If your engagement survives the smoothing window, the gain sticks. If it was a one-time spike, it gets erased. This means consistency matters more than volume. Steady daily engagement that survives multiple smoothing cycles compounds over time; a blast of activity followed by silence gets wiped out every time. I'm not a fan of this, because most people just blindly click the first result, which Google obviously knows. Look at where Ads are positioned nowadays.
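To make the spike-vs-steady point concrete, here's a toy model of the behavior we observed. This is my assumption about the mechanism (a rolling mean over the smoothing window), not Google's actual code:

```python
def rolling_rank_signal(daily, window=4):
    """Toy model: the ranking signal is a rolling mean over the smoothing window.
    A one-day spike lifts the signal briefly, then gets averaged away;
    steady engagement holds its level through every cycle."""
    out = []
    for day in range(len(daily)):
        start = max(0, day - window + 1)
        chunk = daily[start:day + 1]
        out.append(sum(chunk) / len(chunk))
    return out

spike  = [10, 10, 100, 10, 10, 10, 10, 10]   # one-day blast of activity
steady = [25] * 8                            # consistent daily effort

print(rolling_rank_signal(spike))   # jumps to 40.0 on the spike day, back to 10.0 by day 7
print(rolling_rank_signal(steady))  # stays at 25.0 throughout
```

The spike listing briefly outscores the steady one, then falls all the way back once the smoothing window rolls past the spike, which matches the "improve, hold, partially drop" cycle described above.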

Not all engagement signals carry the same weight.
We tracked every type of interaction separately: business clicks, website visits (with and without meaningful dwell time), direction requests, phone calls, photo views, review scrolling, and shares.
The pattern was clear to me: high-intent signals move rankings faster. Someone requesting directions to your business or calling you from the listing is worth significantly more than someone scrolling through your photos. Website click-throughs are also strong.

The low-effort signals (photo views, review reading) still contribute but more as support. They make the overall engagement pattern look natural, and they add up in volume. You need both.

Website clicks are the most consistent signal across all distances. They perform roughly the same whether someone is searching from 1 mile away or 5 miles away. This makes website engagement the most reliable tool for building visibility across a wide area.

Direction requests and phone calls have the strongest ranking impact in the 3+ mile zone; their effectiveness drops significantly at mid-range and becomes marginal at extreme distance. I suspect Google expects direction requests to come from people who are reasonably far from the business: a direction request from 1-2 miles away doesn't carry the same signal weight as one from farther out.

Shares behave like direction requests and calls: they have a disproportionate impact on the expansion and reach zones (3+ miles). Shares from users both near and far from the business expand the ranking radius more effectively than any other signal type. But I consider that an aggressive approach that can be easily spotted by the algorithm.

We've also observed that certain high-intent signals become counterproductive at extreme distances. A spike in direction requests from 8 miles away for a neighborhood business can actually correlate with negative ranking movement in that zone (usually short-lived). The signal doesn't match the expected behavior pattern, so Google seems to discount it, and the listing settles right back to its normal rank.

The practical takeaway: the engagement mix should shift based on which zone you're trying to improve. Weak in the core zone? Focus on direction requests (walking/biking) and calls. Weak in expansion and reach? Focus on a modest number of shares, plus website clicks and driving directions. Always keep dwell time in mind. Running the same signal mix everywhere is suboptimal.
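One way to encode that zone-to-signal mapping is a simple lookup. The weights below are illustrative priorities reflecting my rough reading of the data, not measured coefficients:

```python
# Signal mix to prioritize based on which grid zone is underperforming.
# Weights are illustrative priorities, not measured constants.
SIGNAL_MIX = {
    "core (0-1 mi)":     {"direction_requests": 0.4, "calls": 0.4, "website_clicks": 0.2},
    "mid (1-3 mi)":      {"website_clicks": 0.5, "calls": 0.3, "profile_views": 0.2},
    "expansion (3+ mi)": {"website_clicks": 0.4, "shares": 0.3, "direction_requests": 0.3},
}

def recommended_mix(weak_zone):
    """Return the engagement mix to prioritize for an underperforming zone."""
    return SIGNAL_MIX[weak_zone]

print(recommended_mix("expansion (3+ mi)"))
```

The point isn't the exact numbers; it's that the mix for the core zone and the mix for the reach zone should not be the same dictionary.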

Reviews alone cannot hold position 1.

We tracked businesses with strong review profiles that were losing ranks, and businesses with fewer reviews that were climbing. The businesses holding top positions weren't just the ones with the most or best reviews; they were the ones consistently generating the engagement Google expects for that position. Google seems to have an implicit "engagement threshold" for each rank. If you're position 1 but your engagement drops below what a P1 business typically gets in your category, Google starts testing other businesses there.

Reviews set your floor: they determine how low you can fall. Clean engagement sets your ceiling: how high you can reach and hold.

For review velocity specifically, our data suggests a slow, steady pace significantly outperforms bursts. Think 1-2 reviews every 3-4 days rather than 10 in a week. Velocity matters, but so does looking natural, especially since the guidelines were recently updated again. Buying reviews is not something I'd recommend.

The climb from position 10 is real but has a pattern.


Consistent engagement from position 10 produces measurable movement to position 5-6 within two weeks. Going from 5 to 3 takes the same effort as 10 to 5. From 3 to 1 requires outpacing businesses that are generating real organic engagement at scale. This is where most campaigns stall: the engagement that moved you from 10 to 5 isn't sufficient to top whoever is at 2 long term, or to hold the gain once CTR boosts stop.

Demographic alignment is a signal most people don't even know exists.
Google has census data. It knows the demographic composition of every ZIP code your business serves. When the engagement profile of your listing doesn't match the expected demographic distribution (language, device characteristics, time-of-day patterns), that's an anomalous signal, and based on the Navboost leaks we know all about what they track from our devices.
Matching engagement demographics to the actual population of your service area isn't a nice-to-have. It's the difference between signal that survives re-evaluation and signal that gets filtered as inorganic ("badClicks" in the leaks). The businesses that sustain gains are the ones producing engagement that's indistinguishable from organic behavior at the pattern level: timing, frequency, demographic fingerprint, geographic spread. So if you use any CTR-boosting platforms, keep that in mind. We're talking about timezone, language, keyboard, user persona, installed apps, and much more; they need to be as unique as possible.
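A quick way to sanity-check demographic alignment before sending any traffic is to compare your traffic's distribution against the area's. This is a sketch; the census shares, traffic shares, and the 0.1 threshold below are all placeholders, not values from our data:

```python
def distribution_distance(observed, expected):
    """Total variation distance between two categorical distributions,
    e.g. language share of your engagement vs. language share of the ZIP code.
    0.0 = identical, 1.0 = completely disjoint."""
    keys = set(observed) | set(expected)
    return 0.5 * sum(abs(observed.get(k, 0) - expected.get(k, 0)) for k in keys)

# Hypothetical example: ZIP-level language mix vs. your traffic's language mix.
zip_census   = {"en": 0.70, "es": 0.25, "other": 0.05}
your_traffic = {"en": 0.95, "es": 0.03, "other": 0.02}

mismatch = distribution_distance(your_traffic, zip_census)
if mismatch > 0.1:  # placeholder threshold for "looks inorganic"
    print(f"Demographic mismatch {mismatch:.2f}: engagement likely to be filtered")
```

The same comparison works for device types or time-of-day buckets; the idea is simply to flag traffic whose fingerprint diverges from the service area before the algorithm does.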

Extra things we've seen in practice (5,000+ sessions):

What's interesting is that different types of engagement drove results in different industries:
  • Healthcare services provider (1,100+ sessions): Competitive keywords climbed 7-10 positions. Several went from invisible to top-3 on the grid. The biggest driver was a combination of profile engagement, website visits, and direction requests. Once the business started generating consistent intent signals (not just views, but actions that indicate someone is about to visit), the rankings moved dramatically. Shares also played a role in expanding visibility beyond the immediate area.

  • Medical aesthetics practice (800+ sessions): Primary local keyword moved from position 16 to position 6. Specialty terms climbed 5-10 positions. Here, high-intent signals like direction requests and website clicks had a disproportionate impact on ranking movement compared to overall volume. A smaller number of the right signals outperformed a larger number of low-intent ones.

  • Beauty services provider in a competitive metro (1,000+ sessions): "Near me" keywords improved 4-5 positions, with multiple terms entering the top 5. The standout finding here was that shares from users across a wide geographic area correlated directly with the listing appearing in results further from the business location. The ranking radius expanded as geographic engagement diversity increased.
  • Spa already in the top 3 (200+ sessions): Maintained position and improved from 3.6 to 2.0 average on its primary keyword. At this stage, minimal engagement was actually better than more. Light profile clicks and an occasional direction request held and improved the position. Over-engaging a dominant listing adds risk without adding benefit.

  • Local retail business (1,000+ sessions): Achieved position 1 with full grid visibility across branded and category terms. A balanced mix of engagement types built a dominant presence. Once established, activity was scaled back and the position held. The algorithm appears to build "trust" for listings that demonstrate consistency over extended periods.

  • Service business in a hyper-competitive urban market (1,000+ sessions): Mixed results. Some keywords climbed 5-7 positions, others stayed flat or dipped slightly. This was the most instructive case: in markets where top competitors have genuine organic foot traffic (GPS signals and customer activity that happen without any search being performed), engagement signals alone can't overcome that organic base. You need both the engagement foundation and real-world customer activity to break through.
Personally, I have not observed any direct manual penalties or filtering from engagement signals, even super-aggressive ones. Using 2 separate businesses, I sent over 500 calls and 300 website clicks (and just 1 direction request to each); it's been 3 months now and still nothing weird is going on with them. No filtering, no major drops in rank.

The algorithm appears to be self-correcting rather than punitive. It doesn't ban you (just think how easy it would be for someone to destroy a competitor overnight, or in a week). It simply stops counting signals that don't pass its pattern validation during the next smoothing cycle. Your rankings quietly slide back to where they were, and most people never understand why.

I've added some screen grabs below showing how rankings shift across days and hours for a single business.

Feel free to ask me anything! Are there any tests you’d like me to run?

In the meantime, here are a few things I’m currently working on:
  • AI answers for local businesses: what it takes to influence them. Across 186 keywords tested, 35 had AI Overviews that included local business results.
  • How many devices are needed over a 7-day span to influence the “Usually Busy” feature.
  • The level of engagement required to keep SAB (Service Area Business) listings in the top 10.



