The hidden math of one missing star: what a rating drop costs a multi-location practice
A 4.6 average that slides to 4.4 doesn't feel like much. The booking impact is bigger than the number suggests. Here's the math.

A 0.2 drop in a Google rating average doesn't feel dramatic. A practice that's been sitting at 4.6 slides to 4.4 over a quarter, and nobody on the team notices because the number is still “good.” The patients still seem happy. The volume is still flowing.
Three months later the new-patient booking number is meaningfully softer and nobody can quite explain why. We can.
How patients actually use ratings
When someone searches “dentist near me” on Google Maps, they see a ranked list. The top three results — Google's Local Pack (often called the 3-Pack) — capture the overwhelming majority of clicks. BrightLocal's 2024 Local Consumer Review Survey puts the share of consumers who only consider Local Pack results when choosing a local business at roughly 76%.
Patients don't reason in absolutes. They reason in relative position to the alternatives. A 4.4 isn't bad — it's 0.3 stars worse than the practice right next to it on the screen. That delta makes the click rate on the practice's listing drop by roughly 20–30% in competitive zip codes, per observational rank-tracking data from Local Falcon and Whitespark's Local SEO Industry Survey. New-patient booking flow follows.
A worked example: 3-location dental group
Consider a three-location dental group doing 800 completed visits per month combined. They were converting roughly 6% of visits into reviews — about 48 new reviews per month, all else being equal.
Over a quarter, two incidents end up as public one-star reviews: a billing dispute the practice couldn't resolve, and a long wait that left a vocal patient angry. Neither was caught early enough to route the feedback privately. Both posted. Combined with the slow drip of 48 mostly-five-star reviews, the rating average drifts from 4.6 down to 4.4 over 14 weeks.
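How far a couple of one-star posts move an average depends heavily on how large the existing review base is — a small base feels them acutely. A minimal sketch of that dynamic (the 80-review starting base is illustrative, not a figure from the example):

```python
def new_average(current_avg, current_count, new_ratings):
    """Rating average after appending new_ratings to an existing base."""
    total = current_avg * current_count + sum(new_ratings)
    return total / (current_count + len(new_ratings))

# Illustrative: 80 existing reviews at 4.6, then two 1-star posts land.
print(round(new_average(4.6, 80, [1, 1]), 2))  # 4.51 — displays as 4.5
```

With a few hundred existing reviews the same two posts barely move the needle, which is why the volume point later in this piece matters so much.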
The visible impact:
- The three locations' listings see roughly a 20% drop in listing-to-call conversion.
- If those locations were each generating 30 new patients per month from organic search, that's ~18 lost new patients per month total.
- Per the ADA Health Policy Institute patient-value benchmarks, average new-patient first-year value in general dentistry runs $500–$800 depending on payer mix. The practice is leaving $108,000–$172,800 per year on the floor.
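The bullet-point arithmetic above can be reproduced in a few lines. The patient-value range is the ADA HPI benchmark cited in the text; the other inputs are the worked example's own assumptions:

```python
def annual_revenue_at_risk(new_patients_per_month, booking_drop, value_low, value_high):
    """First-year revenue lost annually to a conversion drop on organic bookings."""
    lost_per_month = new_patients_per_month * booking_drop
    return lost_per_month * 12 * value_low, lost_per_month * 12 * value_high

# Three locations x 30 organic new patients/month, 20% listing-to-call drop,
# $500-$800 first-year value per new patient (ADA HPI benchmarks):
low, high = annual_revenue_at_risk(90, 0.20, 500, 800)
print(low, high)  # 108000.0 172800.0
```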
The rating average — the number nobody on the team panicked about — was the leading indicator. The lost bookings are the lagging indicator that shows up in next quarter's P&L.
A second example: dermatology practice
The same math runs differently in specialties with higher patient lifetime value. A single-location dermatology practice doing 600 visits per month with a 4.5 rating drifts to 4.3 over a quarter. The booking drop is similar — roughly 20% — but the patient value is higher (averaging $1,200+ per new patient over two years in derm, per AAD practice benchmarks and MGMA cost-of-care reports). Twelve fewer new patients per month at that LTV is $172,800+ in annual revenue at risk from a single 0.2-star drift.
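The dermatology figure follows the same shape. The text gives twelve lost patients at a 20% drop, which implies a baseline of roughly 60 organic new patients per month — that baseline is an inference from the stated numbers, not a figure quoted in the source benchmarks:

```python
# Single derm location: ~60 organic new patients/month implied by the text
# (12 lost at a 20% booking drop), $1,200+ two-year LTV per new patient.
lost_per_month = 60 * 0.20          # 12 fewer new patients each month
annual_at_risk = lost_per_month * 12 * 1200
print(annual_at_risk)  # 172800.0
```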
The six-month moving window
What makes rating drops particularly painful for healthcare is how slowly they recover. Google's local algorithm weights recent reviews heavily, but practices that were neglecting their review pipeline before the drop don't suddenly have a strong pipeline after it. The drift continues until something changes structurally.
The math of recovery: to lift a 4.4 average back to 4.6, a practice needs to post 30+ consecutive five-star reviews with minimal one- or two-star posts in the same window. At 5% conversion that takes the better part of a year. At 40% conversion it takes six weeks. (More on velocity vs total in why review velocity matters more than review count.)
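How many consecutive five-star reviews a recovery takes depends on the size of the existing base: solving (avg·n + 5k)/(n + k) = target for k gives k = n·(target − avg)/(5 − target). A sketch — note the text's 30-review figure corresponds to a base of roughly 60 existing reviews, an assumption not stated in the example:

```python
import math

def five_stars_needed(current_avg, current_count, target_avg):
    """Five-star reviews needed to lift current_avg to target_avg,
    assuming no new low-star reviews land in the same window."""
    k = current_count * (target_avg - current_avg) / (5 - target_avg)
    return math.ceil(k)

print(five_stars_needed(4.4, 60, 4.6))  # 30
```

Time-to-recover is then just that count divided by the monthly five-star volume, which is where the conversion-rate difference in the paragraph above does its work.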
What we'd protect first
If a multi-location practice asked us where to spend the first ninety days protecting bookings, we wouldn't start with paid search. We'd start with two things:
- Volume of recent five-star posts. The algorithm rewards freshness. A practice posting 15+ reviews per month per location is structurally hard to drift below its average.
- Private routing for anything below four stars. The two reviews that drove the drop in our example were both recoverable conversations. They became permanent public damage because nobody caught them in private first.
Those two changes don't require new technology in the practice. They require the review motion to leave the practice entirely and live somewhere built for it.
Want this kind of thinking applied to your practice?
Twenty minutes with us. We'll audit your current review velocity and tell you honestly whether applaud fits.

