[seo] How to request Google to re-crawl my website?

There are two options. The first (and better) one is using the Fetch as Google option in Webmaster Tools that Mike Flynn mentioned in the comments. Here are detailed instructions:

  1. Go to: https://www.google.com/webmasters/tools/ and log in
  2. If you haven't already, add and verify the site with the "Add a Site" button
  3. Click on the site name for the one you want to manage
  4. Click Crawl -> Fetch as Google
  5. Optional: if you only want to fetch a specific page, type in its URL
  6. Click Fetch
  7. Click Submit to Index
  8. Select either "URL" or "URL and its direct links"
  9. Click OK and you're done.

With the option above, as long as every page can be reached from some link on the initial page or a page that it links to, Google should recrawl the whole thing. If you want to explicitly tell it a list of pages to crawl on the domain, you can follow the directions to submit a sitemap.
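
If you go the sitemap route, the file itself is just XML following the sitemaps.org protocol. Here is a minimal sketch in Python that writes one out; example.com and the page list are placeholders you would swap for your own URLs.

    # Minimal sitemap generator sketch. The domain and page list below are
    # placeholders; replace them with your own URLs before submitting.
    from datetime import date
    from xml.sax.saxutils import escape

    pages = [
        "https://example.com/",
        "https://example.com/about",
        "https://example.com/contact",
    ]

    entries = "\n".join(
        "  <url>\n"
        f"    <loc>{escape(url)}</loc>\n"
        f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
        "  </url>"
        for url in pages
    )

    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)

Upload the resulting sitemap.xml to your web root and submit its URL under Crawl -> Sitemaps (or Sitemaps in the newer Search Console).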

Your second (and generally slower) option is, as seanbreeden pointed out, submitting here: http://www.google.com/addurl/
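
If you would rather script a submission instead of using that form, Google has historically also accepted a simple HTTP GET "ping" with your sitemap URL. A minimal sketch follows; note that Google has since deprecated the ping endpoint, so treat this as legacy and verify it still works before relying on it. The sitemap URL is a placeholder.

    # Legacy sitemap ping sketch. The endpoint has been deprecated by Google
    # and may no longer be honoured; the sitemap URL is a placeholder.
    from urllib.parse import quote
    from urllib.request import urlopen

    sitemap_url = "https://example.com/sitemap.xml"
    ping = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

    with urlopen(ping) as resp:
        print(resp.status)  # 200 historically meant the ping was accepted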

Update 2019:

  1. Log in to Google Search Console.
  2. Add your site and verify it with one of the available methods.
  3. Once the site is verified, click URL Inspection in the console.
  4. In the search bar at the top, enter your website URL (or the specific page URL you want inspected) and press Enter.
  5. After the inspection finishes, you'll see a Request Indexing option.
  6. Click it and Googlebot will add your URL to a queue for crawling.
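
If you have many URLs to check, the Search Console API later gained a URL Inspection endpoint that can be scripted. It only reports index status; it cannot trigger Request Indexing, which still has to be done in the UI. A rough sketch, assuming a service account that has been added as a user on the verified property (the credentials file name, site URL and page URL are placeholders):

    # URL Inspection API sketch. service-account.json, the site URL and the
    # page URL are placeholders; this call reports index status only, it
    # does not request indexing.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)

    result = service.urlInspection().index().inspect(
        body={
            "inspectionUrl": "https://example.com/some-page",
            "siteUrl": "https://example.com/",
        }
    ).execute()

    print(result["inspectionResult"]["indexStatusResult"]["coverageState"])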