Extract URLs from Google’s Web SERPs

Every now and then you may want to extract a list of URLs from a Google web search for a particular search query.

The most common reason for this (in my experience at least) is to obtain a list of all URLs which Google has indexed for your particular domain. This can be useful for a number of reasons, from analysing the visible titles and meta descriptions to checking for the indexation of rogue or redundant URLs.

Power users and webmasters will know that it is difficult to get a definitive list of indexed URLs directly from Google. Google Search Console (previously Webmaster Tools) offers an ‘Index Status’ report which provides insight into the number of URLs indexed, historic trends and various filters. It also reports the proportion of URLs indexed from any submitted sitemaps. But neither feature actually provides a definitive list of the URLs which Google has indexed for your domain.

You can of course extract the data manually using the ‘site:’ search operator and copy/pasting the results. That’s fine if you’re dealing with a relatively small number of URLs, but trying to adopt this approach for any more than 10 results can become somewhat tiresome – even extracting 10 results manually can be a bore!

So what if I told you it’s possible to extract a list of URLs from SERPs with a few clicks? You wouldn’t believe me, right? Wrong! With this little bookmarklet, which I originally adapted for High Position from a similar tool by Liam Delahunty, you’ll be able to extract URL and anchor text information with minimal effort.

How to Extract Google’s Web Search URLs

I’ll start by saying there is nothing magic or malicious about this approach. We’ll be using a JavaScript bookmarklet to process the search results provided by Google, in combination with a nifty Chrome plugin to seamlessly scroll through multiple pages of search results.

Here’s how to do it.

  1. Use Chrome. If you’re not using Chrome you can download it here. If you’d prefer not to use Chrome, the bookmarklet itself will work with other browsers, but we’ll be using a plugin not available elsewhere (to my knowledge!).
  2. Install the gInfinity plugin for Chrome. This will un-restrict the number of search results per page by seamlessly appending the next page of search results to the current page – in essence creating infinite scrolling of SERPs.
  3. Go to Google and perform your search query. If you are extracting URLs for your domain use the Site search operator e.g. ‘site:chrisains.com’.
  4. If you’re working with a large website with hundreds of URLs you’ll probably benefit from increasing the number of search ‘Results Per Page’. By default Google is set to return 10 results, but you can override this to a maximum of 100. To do this:
    Google Search Setting
    This will limit the number of queries made against Google’s search results. Let’s say, for example, that your domain has 200 pages indexed. At 10 results a page you would need to query Google 20 times, but at 100 results a page you’ll only query Google twice. This reduces the chance of seeing warning messages about ‘unusual traffic from your computer network’, which you can receive if you persistently query Google.

    Google Unusual Traffic Captcha

  5. Next, go back to the Google search results page for your particular query. If your query returns more than 100 results you should see gInfinity append the next page of search results to the current page:

    gInfinity Example

    Keep scrolling until you have a single page containing all search results for your query.

  6. Drag and drop the following bookmarklet to your ‘Bookmarks’ Toolbar in Chrome (you can also do this with most modern browsers).
  7. Once you’ve placed the bookmarklet in your toolbar, and making sure you have your list of Google SERPs in front of you, click on the bookmarklet. The JavaScript-based bookmarklet will open a new window and display all URLs and anchor text in a list.
  8. Mission Complete!
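
Incidentally, the saving from raising ‘Results Per Page’ in step 4 is easy to quantify. Here’s a quick back-of-the-envelope sketch in plain JavaScript, using the 200-page example from above:

```javascript
// Number of SERP requests needed to cover a given number of
// indexed results at a given results-per-page setting.
function pagesNeeded(totalResults, resultsPerPage) {
  return Math.ceil(totalResults / resultsPerPage);
}

console.log(pagesNeeded(200, 10));  // 20 requests at Google's default of 10
console.log(pagesNeeded(200, 100)); // 2 requests at the maximum of 100
```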

That’s it. You should now be able to extract a list of all website URLs indexed within Google for your particular domain.

Please bear in mind that Google are continually modifying the way in which they display search results. This means that the coding used to create the bookmarklet may cease to function as and when Google release changes. When this occurs I’ll try my best to update the code but if anyone notices it not working please let me know!

The Extraction Bookmarklet Code

In case anyone wants to adapt the code, here is the JavaScript which I used to create the bookmarklet. Only two lines of the original listing survive below (the output header and the class check), so the rest of the loop is a reconstruction of the general approach rather than the exact original code – and bear in mind that the ‘_Rm bc’ class Google used to mark cite links will have changed over time.

	javascript:(function(){
	output='<html><head><title>SEO SERP Extraction Tool</title><style type=\'text/css\'>body,table{font-family:Tahoma,Verdana,Segoe,sans-serif;font-size:11px;color:#000}h1,h2,th{color:#405850}th{text-align:left}h2{font-size:11px;margin-bottom:3px}</style></head><body>';
	output+='<h1>SEO SERP Extraction Tool</h1><table><tr><th>URL</th><th>Anchor Text</th></tr>';
	pageAnchors=document.getElementsByTagName('a');
	for(i=0;i<pageAnchors.length;i++){
		if(pageAnchors[i].parentNode.getAttribute('class')!='_Rm bc'){
			output+='<tr><td>'+pageAnchors[i].href+'</td><td>'+pageAnchors[i].innerHTML+'</td></tr>';
		}
	}
	output+='</table></body></html>';
	win=window.open();
	win.document.write(output);
	})();
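
If you’d like to experiment with the extraction logic outside the browser, it boils down to a filter-and-map over the page’s anchors. Here’s a minimal sketch in plain JavaScript – note that the plain-object row shape is my own stand-in for the DOM nodes the bookmarklet actually reads:

```javascript
// Keep only genuine result links: skip any anchor whose parent element
// carries the class Google was using for cite/visible-URL links
// ('_Rm bc' at the time of writing – this changes whenever Google does).
function extractResults(anchors) {
  return anchors
    .filter(function (a) { return a.parentClass !== '_Rm bc'; })
    .map(function (a) { return { url: a.href, anchorText: a.text }; });
}

// Stand-in data mimicking two anchors from a SERP:
var sample = [
  { href: 'https://chrisains.com/', text: 'Chris Ainsworth', parentClass: '' },
  { href: 'https://chrisains.com/', text: 'chrisains.com', parentClass: '_Rm bc' }
];
console.log(extractResults(sample)); // keeps only the first entry
```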

Thanks for reading.

23 thoughts on “Extract URLs from Google’s Web SERPs”

  • Hey Chris

    This was great info.

    Could you share a bit more detail on how exactly to use the extractor bookmarklet code?

    I will wait for your kind help.


    • Hi Jitesh,

      If you’re using/adapting the code provided (as opposed to using the pre-made bookmarklet provided) you’ll simply need to create a bookmark in a browser of your choice and enter the source code as the trigger URL.

      If you need any more information please let me know.



  • Omg Chris, this works great, you have no idea how much work you saved me!

    There are so many shady companies trying to get you to install their plugins to do something so simple (while catching your email and hopefully your purse too) and then it’s that easy.

    Big thumbs up for giving this away for free, you’re my hero! 🙂

    • Hi Manuel,

      Can you provide any further information at all? I have just tried the bookmarklet via various search criteria on various Google locales (e.g. Google.ca, Google.com.au etc.) and it seems to work fine?

      Thanks in advance.


  • Hello Chris,

    I am trying to get all URLs from a Google search into a txt file or an array. My code does the search part, but I also need the URLs. Could you help me with this? Here is my code:

    Google Web Search API


    Thank you.

  • Wow this saved me a LOT of time!

    I’m a bit confused, however, about how Google is adding things up. When I search site:www.notmyrealsite.com it says “About 802 results”.

    I click through, copying each page of 10 results (I left results at 10 to be sure what was going on), and when I get to page 19 it says “Page 19 of 187 results”.

    I copy/pasted links from each page into excel and got 200 results.

    So what’s going on? Is that “About 802 results” an unreliable number? I was feeling good about 800 or so links to my site, only to find out it was 200… pfft.

    To try and validate the numbers above I went to Google Search Console > Search Traffic > Internal Links > Gives me 307 results.

    I’m migrating an old website to a new one and want to keep as many of my Google links working as possible. Any advice appreciated!

    • Hi Grant,

      Thanks for the comment. There are a couple of good questions there.

      Firstly, the estimated number of results (i.e. “About 802 Results”) is exactly that – an estimate. There have been numerous discussions surrounding this over the years from both industry experts and Google themselves. I would recommend watching this video from Google’s Matt Cutts in 2010 as it provides a bit more information – https://www.youtube.com/watch?v=2ix3mHeL7hg.

      Google may also opt not to show all indexed URLs, often because similar results are contained within the supplemental index (often masked by the “In order to show you the most relevant results, we have omitted some entries very similar to the XX already displayed. If you like, you can repeat the search with the omitted results included.” message) or because Google just don’t want to show all results!

      For an accurate indication of the number of pages indexed I would use Google Search Console; however, it looks like you’re viewing the wrong info here, Grant…

      The ‘Internal Links Report’ which you viewed (Google Search Console > Search Traffic > Internal Links) provides information on your internal linking structure – not the number of indexed URLs. Here’s Google’s description of the ‘Internal Links Report’:

      “The Internal Links page lists a sample of pages on your site that have incoming links from other internal pages.”

      You can read more here – https://support.google.com/webmasters/answer/138752?hl=en.

      What you need is the ‘Index Status Report’ (Google Search Console > Google Index > Index Status). This will provide a more accurate representation of the number of pages that Google has indexed for your domain (subject to the domain being verified).

      “Shows the total URLs available to appear in search results, along with other URLs Google might discover by other means.”

      Again, you can read more on this here – https://support.google.com/webmasters/answer/2642366?hl=en.

      However, whilst the ‘Index Status Report’ will provide an indication of the number of URLs indexed, Google will (rather stupidly) not provide a list of those URLs – which is why I built this tool to help 🙂

      Another way to get a list of URLs, especially if you’re migrating websites, is to use the data available via Google Analytics. Website migration is a complex process, so if you need any help give me a shout.

      Let me know if you have any other questions.



  • Great tool. Is it possible to add an export to a .csv file containing the title and URL, the same as shown on the extraction page? That would make it even easier to extract URLs.

    • Hi Rajesh,

      I can certainly look at creating a CSV download option in the future. For the time being though, you can just copy and paste to Excel 🙂


      • Thanks Chris. If you create that button you are going to save me lots and lots of time, as I search more than 500 keywords using your tool. It will save me at least 4 or 5 hours a day. Thanks. 🙂 🙂
