How reviews on Google Maps work

When exploring new places, reviews on Google are a treasure trove of local knowledge that can point you to the places and businesses you’ll enjoy most — whether it’s a bakery with the best gluten-free cupcake or a nearby restaurant with live music.

With millions of reviews posted every day from people around the world, we have around-the-clock support to keep the information on Google relevant and accurate. Much of our work to prevent inappropriate content is done behind the scenes, so we wanted to shed some light on what happens after you hit “post” on a review.

An animated video about Google Maps reviews

How we create and enforce our policies

We’ve created strict content policies to make sure reviews are based on real-world experiences and to keep irrelevant and offensive comments off of Google Business Profiles.

As the world evolves, so do our policies and protections. This helps us guard places and businesses from violative and off-topic content when there’s potential for them to be targeted for abuse. For instance, when governments and businesses started requiring proof of COVID-19 vaccine before entering certain places, we put extra protections in place to remove Google reviews that criticize a business for its health and safety policies or for complying with a vaccine mandate.

Once a policy is written, it’s turned into training material — both for our operators and machine learning algorithms — to help our teams catch policy-violating content and ultimately keep Google reviews helpful and authentic.

Moderating reviews with the help of machine learning

As soon as someone posts a review, we send it to our moderation system to make sure the review doesn’t violate any of our policies. You can think of our moderation system as a security guard that stops unauthorized people from getting into a building — but instead, our team is stopping bad content from being posted on Google.
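
For readers who want to see the idea in code, here’s a minimal sketch of what such a pre-publication gate could look like. The Review fields, the policy checks and their placeholder logic are illustrative assumptions for the sketch, not our actual system.

```python
from dataclasses import dataclass


@dataclass
class Review:
    author_id: str
    place_id: str
    rating: int
    text: str


def passes_relevance_check(review: Review) -> bool:
    # Placeholder: reject empty text; a real check looks for off-topic content.
    return bool(review.text.strip())


def passes_civility_check(review: Review) -> bool:
    # Placeholder: a real check relies on trained models, not a static word list.
    blocked_terms = {"<offensive-term>"}
    return not any(term in review.text.lower() for term in blocked_terms)


POLICY_CHECKS = [passes_relevance_check, passes_civility_check]


def moderate(review: Review) -> str:
    """The 'security guard': only reviews that pass every policy check
    are let through to publication."""
    return "publish" if all(check(review) for check in POLICY_CHECKS) else "reject"


print(moderate(Review("user-1", "place-1", 5, "Best gluten-free cupcakes in town!")))  # publish
```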

Given the volume of reviews we regularly receive, we’ve found that we need both the nuanced understanding that humans offer and the scale that machines provide to help us moderate contributed content. They have different strengths, so we continue to invest heavily in both.

Machines are our first line of defense because they’re good at identifying patterns. These patterns often immediately help our machines determine if the content is legitimate, and the vast majority of fake and fraudulent content is removed before anyone actually sees it.

Our machines look at reviews from multiple angles (a simplified sketch follows this list), such as:

  • The content of the review: Does it contain offensive or off-topic content?
  • The account that left the review: Does the Google account have any history of suspicious behavior?
  • The place itself: Has there been uncharacteristic activity — such as an abundance of reviews over a short period of time? Has it recently gotten attention in the news or on social media that would motivate people to leave fraudulent reviews?
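
To make those angles a bit more concrete, here’s a highly simplified sketch of how signals about the content, the account and the place might be combined into a single risk score. The signal functions, weights and threshold are invented for the illustration and are not our real model.

```python
# Toy risk scoring across the three signal families above: the review's
# content, the reviewing account and the place itself. All names, weights
# and the threshold are made-up assumptions for this sketch.

def content_signal(text: str) -> float:
    """Is the text itself offensive or off-topic? (placeholder markers)"""
    off_topic_markers = ["buy followers", "click this link"]
    return 1.0 if any(m in text.lower() for m in off_topic_markers) else 0.0


def account_signal(account_history: dict) -> float:
    """Does the account have a history of suspicious behavior?"""
    return 1.0 if account_history.get("prior_violations", 0) > 0 else 0.0


def place_signal(place_stats: dict) -> float:
    """Has the place seen uncharacteristic activity, e.g. a review flood?"""
    return 1.0 if place_stats.get("reviews_last_hour", 0) > 50 else 0.0


WEIGHTS = {"content": 0.5, "account": 0.3, "place": 0.2}  # assumed weights
BLOCK_THRESHOLD = 0.5                                     # assumed cutoff


def risk_score(text: str, account_history: dict, place_stats: dict) -> float:
    return (WEIGHTS["content"] * content_signal(text)
            + WEIGHTS["account"] * account_signal(account_history)
            + WEIGHTS["place"] * place_signal(place_stats))


score = risk_score("Friendly staff, great live music!",
                   {"prior_violations": 0}, {"reviews_last_hour": 3})
print("block" if score >= BLOCK_THRESHOLD else "allow")  # allow
```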

Training a machine on the difference between acceptable and policy-violating content is a delicate balance. For example, sometimes the word “gay” is used as a derogatory term, and that’s not something we tolerate in Google reviews. But if we teach our machine learning models that it’s only used in hate speech, we might erroneously remove reviews that promote a gay business owner or an LGBTQ+ safe space. Our human operators regularly run quality tests and complete additional training to remove bias from the machine learning models. By thoroughly training our models on all the ways certain words or phrases are used, we improve our ability to catch policy-violating content and reduce the chance of inadvertently blocking legitimate reviews from going live.
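
As a toy illustration of that balance, the snippet below contrasts a bare keyword filter, which would wrongly flag benign reviews, with a small text classifier trained on labeled examples of both usages. The tiny hand-written dataset and the scikit-learn pipeline are assumptions for the sketch; they are not our training setup.

```python
# Toy contrast between a keyword filter and a context-aware classifier.
# The data below is invented for illustration; real training data is far
# larger and more carefully labeled.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

benign = [
    "proud to support this gay-owned bakery",
    "the owner is gay and the service was wonderful",
    "a welcoming LGBTQ+ safe space with friendly staff",
]
violating = [
    "<a derogatory sentence using the same word>",  # placeholder, not spelled out
    "<another derogatory example>",                 # placeholder
]


def keyword_filter(text: str) -> bool:
    """Naive approach: flag any review that mentions the word at all."""
    return "gay" in text.lower()


print([keyword_filter(t) for t in benign])  # [True, True, False]: two false positives

# Better: train on labeled examples of *both* usages so context matters.
texts = benign + violating
labels = [0] * len(benign) + [1] * len(violating)  # 0 = acceptable, 1 = violating
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)
# A model trained on diverse usages can keep benign mentions live while
# still catching hateful ones, which is the balance described above.
```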

If our systems detect no policy violations, the review can go live within a matter of seconds. But our job doesn’t stop once a review is published. Our systems continue to analyze the contributed content and watch for questionable patterns. These patterns can be anything from a group of people leaving reviews on the same cluster of Business Profiles to a business or place receiving an unusually high number of 1- or 5-star reviews over a short period of time.
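
Here’s a rough sketch of one such post-publication pattern check: flagging a place whose recent 1- and 5-star review volume is far above its usual rate. The window size, baseline and spike factor are invented values, not the thresholds we actually use.

```python
from collections import Counter


def unusual_rating_spike(recent_ratings: list[int],
                         baseline_reviews_per_hour: float,
                         window_hours: float = 1.0,
                         spike_factor: float = 10.0) -> bool:
    """Flag a place whose 1- and 5-star volume in the recent window far
    exceeds its historical baseline. All parameters are illustrative."""
    counts = Counter(recent_ratings)
    extreme = counts[1] + counts[5]          # only 1- and 5-star reviews
    expected = baseline_reviews_per_hour * window_hours
    return extreme > spike_factor * max(expected, 1.0)


# A place that normally gets ~2 reviews an hour suddenly receives 40
# five-star reviews within an hour: worth a closer look.
print(unusual_rating_spike([5] * 40, baseline_reviews_per_hour=2.0))  # True
```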

Keeping reviews authentic and reliable

Like any platform that welcomes contributions from users, we also have to stay vigilant in our efforts to prevent fraud and abuse from appearing on Maps. Part of that is making it easy for people using Google Maps to flag any policy-violating reviews. If you see a review that you believe violates our policies, we encourage you to report it to our team. Businesses can report reviews on their profiles, and consumers can report reviews they come across on Maps.

Phone featuring a selection of the reasons someone might report a Google review

Google Maps users and businesses can easily report reviews that they feel violate one of our policies.

Our team of human operators works around the clock to review flagged content. When we find reviews that violate our policies, we remove them from Google and, in some cases, suspend the user account or even pursue litigation.

In addition to reviewing flagged content, our team proactively works to identify potential abuse risks, which reduces the likelihood of successful abuse attacks. For instance, when there’s an upcoming event with a significant following — such as an election — we apply elevated protections to the places associated with the event and to other nearby businesses that people might look for on Maps. We continue to monitor these places and businesses until the risk of abuse has subsided to support our mission of only publishing authentic and reliable reviews. Our investment in analyzing and understanding how contributed content can be abused has been critical in keeping us one step ahead of bad actors.

With more than 1 billion people turning to Google Maps every month to navigate and explore, we want to make sure the information they see — especially reviews — is reliable for everyone. Our work is never done; we’re constantly improving our system and working hard to keep abuse, including fake reviews, off of the map.
