hil for SerpApi

Originally published at serpapi.com

Scrape Google Maps data and reviews using Python

Learn how to quickly and effortlessly scrape Google Maps places data and its reviews using the SerpApi Python library. Bonus: export the data to a CSV file.

If you need information from Google Maps, like place details or reviews, you are in the right place! Scraping this data is valuable whether you're looking to scout locations or gather insights on local businesses. Let's learn how to scrape Google Maps places data and its reviews using Python and simple API by SerpApi.

Google Maps scraper illustration

How do you scrape Google Maps data results?

There are at least three ways to do this:

- Using SerpApi (Recommended)

- Places API by Google

- Create your DIY Google Maps scraper solution

  1. Using SerpApi (Recommended)

SerpApi provides nicely structured JSON with all the relevant information we need from the local Maps results. It also offers complete data like reviews and photos. So you can save time and energy collecting data from Google Maps without building your own Google Maps scraper.

Link: Google Maps API Documentation.

This blog post covers precisely how to scrape Google Maps data using SerpApi.

Google Maps scraper response example.

  2. Using the Places API by Google

We can use the Places API by Google. But first, we must set up a Google Cloud project and complete the setup instructions before getting the API key. We can then use an HTTP request or the Python SDK to perform a search.

Con: The initial setup required to get an API key is complicated.

  3. Create your DIY Google Maps scraper solution

Google Maps loads its data dynamically, so to scrape it properly, we need a tool that can handle JavaScript-rendered websites. We can use Selenium, pyppeteer, or the playwright Python package to run a headless browser. After that, we can start parsing the relevant data from Google Maps.

Con: Building our web scraper is time-consuming, and we'll face many challenges, like getting blocked, setting up multiple proxies, and many more!

Scrape Google Maps Data and Reviews with Python Video Tutorial

If you prefer to watch a video tutorial, here is our YouTube video on quickly scraping Google Maps with a simple API.

Step-by-step scraping Google Maps data with Python

Without further ado, let's start and collect data from Google Maps.

Step 1: Tools we're going to use

We'll use the new official Python library by SerpApi: serpapi-python.

That's the only tool that we need!

Python library from SerpApi, the tool we need to scrape Google SERP

As a side note: You can use this library to scrape search results from other search engines, not just Google.

Usually, you'd write a DIY solution using something like BeautifulSoup, Selenium, Puppeteer, or Requests to scrape Google Maps. You can relax now, since we perform all these heavy tasks for you. So, you don't need to worry about the problems you might've encountered while implementing your own web scraping solution.

Step 2: Setup and preparation

  • Sign up for free at SerpApi. You can get 100 free searches per month.
  • Get your SerpApi API key from this page.
  • Create a new .env file and add an environment variable holding your API key: SERPAPI_KEY=$YOUR_SERPAPI_KEY_HERE
  • Install python-dotenv to read the .env file with pip install python-dotenv
  • Install SerpApi's Python library with pip install serpapi
  • Create a new main.py file for the main program.

Your folder structure will look like this:



|_ .env
|_ main.py




Step 3: Write the code for scraping basic Google Maps result

Let's say we want to find pizza places in New York. This API needs an ll parameter, which is the latitude and longitude of an area. So, I'm going to use a free online tool to find the ll for a certain place.

I'm using https://www.latlong.net/ for this. Just type a city name or area, and it will return the latitude and longitude. We'll combine these numbers, separated by a comma.

Example for New York:

lat: 40.712776

long: -74.005974

So the ll value will be @40.712776,-74.005974. You can also append a zoom level, such as 15.1z, which the code below uses.

Here is the complete Python code:



import os
import serpapi

from dotenv import load_dotenv
load_dotenv()
api_key = os.getenv('SERPAPI_KEY')

client = serpapi.Client(api_key=api_key)
results = client.search({
    'engine': 'google_maps',
    'q': 'pizza',
    'll': '@40.7455096,-74.0083012,15.1z',
    'type': 'search',
})

print(results)



Try to run this program with python main.py or python3 main.py from your terminal.

Feel free to change the value of the q parameter to any keyword you want to search for.

The local results are available at results['local_results'] or results['place_results'].
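As a quick sketch of working with that key, here's a tiny helper that pulls a couple of fields out of a response-shaped dict. The field names (title, rating) follow the API's JSON; the sample data below is made up:

```python
def summarize_local_results(results):
    """Return (title, rating) pairs from a Google Maps search response."""
    places = results.get('local_results', [])
    # rating can be absent for unrated places, so use .get().
    return [(p.get('title'), p.get('rating')) for p in places]

# A made-up, response-shaped sample:
sample = {'local_results': [
    {'title': "Joe's Pizza", 'rating': 4.5},
    {'title': 'Prince Street Pizza'},
]}
print(summarize_local_results(sample))
# → [("Joe's Pizza", 4.5), ('Prince Street Pizza', None)]
```

With a real response, you'd pass the `results` object from `client.search(...)` instead of the sample dict.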

Difference between place results and local results

Our Google Maps API supports two types of search. The default search type is search, which will return an array of results inside the local_results key:

https://serpapi.com/playground?engine=google_maps&q=Coffee&ll=%4040.7455096%2C-74.0083012%2C14z&hl=en&type=search

The other type is place. The type can be set manually to place, alongside the data parameter to provide details of a specific location or business. This type of search returns place_results.

https://serpapi.com/playground?engine=google_maps&q=Coffee&ll=%4040.7455096%2C-74.0083012%2C14z&hl=en&data=!4m5!3m4!1s0x89c259b7abdd4769%3A0xc385876db174521a!8m2!3d40.750231!4d-74.004019&type=place

If you perform a search with type=search for a very specific location, i.e., you provide the full address, Google will infer type=place and return place_results for that location:

https://serpapi.com/playground?engine=google_maps&q=525+W+26th+St%2C+New+York%2C+NY+10001&ll=%4040.7455096%2C-74.0083012%2C14z&hl=en&type=search

More broadly, local_results is a list returned when the search is general. place_results, on the other hand, contains the details of a specific place, returned when the query is very specific or when you use place_id or data with type=place to get results for a specific location.
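If your code needs to handle both cases, a small normalizing helper is handy. This is a sketch assuming only the two keys described above; extract_places is a hypothetical name, and the sample dicts are made up:

```python
def extract_places(results):
    """Normalize a Maps response into a list of place dicts,
    whether it contains place_results or local_results."""
    if 'place_results' in results:
        # type=place (explicit or inferred): a single place dict.
        return [results['place_results']]
    # type=search: a list under local_results (possibly absent).
    return results.get('local_results', [])

print(extract_places({'place_results': {'title': 'Blue Bottle Coffee'}}))
# → [{'title': 'Blue Bottle Coffee'}]
print(extract_places({'local_results': [{'title': 'A'}, {'title': 'B'}]}))
# → [{'title': 'A'}, {'title': 'B'}]
```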

Available data

There is a lot of data we can get from this API, for example:

- title

- GPS coordinates

- reviews summary

- average rating

- price

- type

- address

- operating hours

- phone

- website

- service option

- etc.

Screenshot of Google Maps places detail

You can assume that all the data you see when selecting a particular result on maps.google.com is available in our API response.

Paginate Google Maps results

Based on the Google Maps Results API documentation, we can get the second page, third page, and so on using the start parameter. By default, Google Maps returns 20 results per page, and the start parameter defaults to 0, so we don't have to provide it to get the first page of results.

Here is an example to get a second page:



results = client.search({
    'engine': 'google_maps',
    'q': 'pizza',
    'll': '@40.7455096,-74.0083012,15.1z',
    'type': 'search',
    'start': 20
})



For the third page:



results = client.search({
    'engine': 'google_maps',
    'q': 'pizza',
    'll': '@40.7455096,-74.0083012,15.1z',
    'type': 'search',
    'start': 40
})



So, you need to increase the start value by 20 for each page. Here is how you can do this programmatically (to scrape all the results):



client = serpapi.Client(api_key=api_key)

start = 0

while True:
    results = client.search({
        'engine': 'google_maps',
        'q': 'pizza',
        'll': '@40.7455096,-74.0083012,15.1z',
        'type': 'search',
        'start': start
    })

    # If the local_results key is missing or empty, stop paginating.
    if 'local_results' not in results or not results['local_results']:
        print('No more local results')
        break

    print(len(results['local_results']))  # Number of results on this page.
    start += 20  # Get the next page of results.



We recommend a maximum start value of 100 (page six), which matches the behavior of the Google Maps web app. Beyond that, results may be duplicated or irrelevant.
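The loop above can also be wrapped in a reusable function that collects every page and enforces that cap. This is a sketch: fetch_page is a hypothetical callable you supply (with the real client, a small lambda around client.search), and the fake pages below are made up for illustration:

```python
def collect_all_results(fetch_page, max_start=100):
    """Gather local_results across pages, stopping at an empty page
    or once start exceeds max_start (100, i.e. page six)."""
    places, start = [], 0
    while start <= max_start:
        page = fetch_page(start).get('local_results', [])
        if not page:
            break
        places.extend(page)
        start += 20  # Google Maps returns 20 results per page.
    return places

# With the real client, you would pass something like:
# collect_all_results(lambda s: client.search({'engine': 'google_maps',
#     'q': 'pizza', 'll': '@40.7455096,-74.0083012,15.1z',
#     'type': 'search', 'start': s}))

# Fake pages for illustration:
fake_pages = {0: [{'title': 'A'}], 20: [{'title': 'B'}]}
all_places = collect_all_results(lambda s: {'local_results': fake_pages.get(s, [])})
print([p['title'] for p in all_places])
# → ['A', 'B']
```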

Export Google Maps results to CSV

What if you need the data in CSV format? You can add the code below, which stores all the local_results in a CSV file. For this example, we're saving the title, address, phone, and website.



import csv

client = serpapi.Client(api_key=api_key)
results = client.search({
    'engine': 'google_maps',
    'q': 'pizza',
    'll': '@40.7455096,-74.0083012,15.1z',
    'type': 'search',
})

local_results = results['local_results']

with open('maps-results.csv', 'w', newline='') as csvfile:
    csv_writer = csv.writer(csvfile)

    # Write the headers
    csv_writer.writerow(["Title", "Address", "Phone Number", "Website"])

    # Write the data ("website" is not always present, so default to "")
    for result in local_results:
        csv_writer.writerow([result["title"], result["address"], result["phone"], result.get("website", "")])

print('Done writing to CSV file.')



Since the "website" key is not always available, we fall back to an empty string when it's missing.

Here are the results in a CSV file:

Google Maps results are exported to a CSV file

How to get Google Maps Reviews

SerpApi also provides a Google Maps Reviews API to get all the details of reviews from a specific place.

First, we need to get either a place_id or data_id, which is available for each item in our initial response.

For simplicity's sake, we'll be using place_id in this post. My teammate, Ryan, wrote another blog post on how to structure a data_id for Google Maps in this post: https://serpapi.com/blog/scraping-business-reviews-from-google-maps-with-serpapi/#getting-the-dataid-with-a-placeid-or-vice-versa.

Here is the code sample for how to scrape the Google Maps reviews data:



client = serpapi.Client(api_key=api_key)
results = client.search({
    'engine': 'google_maps_reviews',
    'type': 'search',
    'place_id': 'ChIJN1t_tDeuEmsRUsoyG83frY4',
})

print(results)




Remember to replace the place_id with the ID of the place you want to look up.

The response will include each review's link, rating, user details, snippet, and number of likes. You can sort reviews from highest to lowest rating (or vice versa) using the sort_by parameter.
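As a sketch of using sort_by, the helper below validates against the sort options documented for the Reviews API (qualityScore, newestFirst, ratingHigh, ratingLow); verify the exact values against the current documentation, and note that reviews_params is a hypothetical helper name:

```python
# sort_by values as documented for SerpApi's Google Maps Reviews API;
# treat this list as something to double-check against the docs.
SORT_OPTIONS = {'qualityScore', 'newestFirst', 'ratingHigh', 'ratingLow'}

def reviews_params(place_id, sort_by='ratingHigh'):
    """Build a reviews-search parameter dict with a validated sort order."""
    if sort_by not in SORT_OPTIONS:
        raise ValueError(f'unknown sort_by: {sort_by}')
    return {
        'engine': 'google_maps_reviews',
        'place_id': place_id,
        'sort_by': sort_by,  # ratingHigh = highest-rated reviews first
    }

print(reviews_params('ChIJN1t_tDeuEmsRUsoyG83frY4')['sort_by'])
# → ratingHigh
```

You would then pass the returned dict straight to client.search(...).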

How to paginate all the reviews from Google Maps?

We can get all the reviews data by paginating the search with the next_page_token value available in each Google Maps Reviews API response.



client = serpapi.Client(api_key=api_key)
results = client.search({
    'engine': 'google_maps_reviews',
    'type': 'search',
    'place_id': 'ChIJN1t_tDeuEmsRUsoyG83frY4',
    'next_page_token': "VALUE_FROM_PREVIOUS_RESPONSE"
})

print(results)




Make sure to replace the next_page_token value with the actual token from the previous response.
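Putting it together, the token-following loop can be sketched as a function. Here fetch is a hypothetical callable (with the real client, something like lambda p: client.search(p)), and the assumption that the token appears under serpapi_pagination follows the API's JSON layout; the fake pages are made up for illustration:

```python
def collect_all_reviews(fetch, place_id, max_pages=5):
    """Follow next_page_token until it runs out or max_pages is reached.
    fetch(params) must return the parsed response dict for those params."""
    reviews, token = [], None
    for _ in range(max_pages):
        params = {'engine': 'google_maps_reviews', 'place_id': place_id}
        if token:
            params['next_page_token'] = token
        response = fetch(params)
        reviews.extend(response.get('reviews', []))
        # The next token lives under serpapi_pagination in the response.
        token = response.get('serpapi_pagination', {}).get('next_page_token')
        if not token:
            break
    return reviews

# Fake two-page response sequence for illustration:
pages = iter([
    {'reviews': [{'snippet': 'Great!'}],
     'serpapi_pagination': {'next_page_token': 'TOKEN_1'}},
    {'reviews': [{'snippet': 'Good.'}]},
])
print(len(collect_all_reviews(lambda p: next(pages), 'PLACE_ID')))
# → 2
```

The max_pages guard keeps a malformed response from looping forever.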

That's how you can scrape Google Maps data and places reviews.

If you're interested in scraping Google search, feel free to read: How to scrape Google search results with Python.

FAQ

Is it legal to scrape Google Maps Data?

Scraping publicly accessible data is legal in the U.S., including scraping Google Maps.

"SerpApi, LLC promotes ethical scraping practices by enforcing compliance with the terms of service of search engines and websites. By handling scraping operations responsibly and abiding by the rules, SerpApi helps users avoid legal repercussions and fosters a sustainable web scraping ecosystem." - source: Safeguarding Web Scraping Activities with SerpApi.

How to scrape Google Maps without getting blocked?

  1. Use Proxies: Rotate multiple IP addresses to prevent your main IP from being blocked. This makes it harder for Google to pinpoint scraping activity from a single source.
  2. Set Delays: Don't send requests too rapidly. Wait a few seconds between requests to mimic human behavior and avoid triggering rate limits.
  3. Change User Agents: Rotate user agents for every request. This makes it seem like the requests come from different devices and browsers.
  4. Use CAPTCHA Solving Services: Sometimes, if Google detects unusual activity, it will prompt with a CAPTCHA. Services exist that can automatically solve these for you.

While these methods can help when scraping manually, you don't have to worry about rotating proxies, setting delays, changing user agents, or solving CAPTCHAs when using SerpApi. It makes getting search results easier and faster since we will solve all those problems for you.

What does Google Maps API cost?

Pricing for the official Google Maps API varies depending on your usage: https://mapsplatform.google.com/pricing/. At SerpApi, a monthly subscription lets you use all APIs, including Google Maps. You'll get 100 free search credits per month, and the developer plan starts at $50.

Why scrape Google Maps?

Scraping Google Maps data can be beneficial for several reasons:

  1. Competitor Analysis: Businesses use Google Maps scrapers to collect data on competitors' locations, customer reviews, and ratings, which helps in understanding the market and strategizing accordingly.
  2. Lead Generation: Google Maps is a valuable lead source, as local and online businesses list their data on the platform to attract customers. Marketers can use scraped data from Google Maps to boost sales by targeting potential customers effectively.
  3. Information Database: Google Maps is a central hub for information on various locations like restaurants, shops, service providers, and institutions. Scraping this information can help in building databases for different purposes.
  4. Market Research: The insights gained from Google Maps scraping can be used for market research, further understanding customer behavior and preferences, and developing targeted marketing strategies.

That's it!

I hope this blog post can help you to collect any place data from Google Maps. Thank you for reading!
