There are two options. The first (and better) one is the Fetch as Google feature in Webmaster Tools that Mike Flynn mentioned in the comments. Here are detailed instructions:
- Go to: https://www.google.com/webmasters/tools/ and log in
- If you haven't already, add and verify the site with the "Add a Site" button
- Click on the site name for the one you want to manage
- Click Crawl -> Fetch as Google
- Optional: if you want to do a specific page only, type in the URL
- Click Fetch
- Click Submit to Index
- Select either "URL" or "URL and its direct links"
- Click OK and you're done.
With the option above, as long as every page can be reached from some link on the initial page or a page that it links to, Google should recrawl the whole thing. If you want to explicitly tell it a list of pages to crawl on the domain, you can follow the directions to submit a sitemap.
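If you go the sitemap route, the file itself is simple XML. As a rough sketch (the domain and page paths are placeholders, not anything Google requires), a minimal sitemap can be generated with nothing but Python's standard library:

```python
# Minimal sitemap generator using only the standard library.
# The URLs below are placeholders -- substitute your own pages.
from xml.etree import ElementTree as ET


def build_sitemap(urls):
    """Return a sitemap.xml string listing the given absolute URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)


if __name__ == "__main__":
    pages = ["https://example.com/", "https://example.com/about"]
    print(build_sitemap(pages))
```

Upload the resulting file somewhere on your domain (commonly `/sitemap.xml`) and submit that URL under Crawl -> Sitemaps.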
Your second (and generally slower) option is, as seanbreeden pointed out, submitting here: http://www.google.com/addurl/
Update 2019:
- Log in to Google Search Console
- Add your site and verify it with one of the available methods.
- Once the site is verified, click URL Inspection in the console.
- In the search bar at the top, enter the URL you want inspected and press Enter.
- After the inspection finishes, you'll see a Request Indexing option.
- Click it and Googlebot will add your page to its crawl queue.
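Outside the console, Google also used to accept sitemap "pings" via a plain GET to www.google.com/ping (that endpoint has since been deprecated, so treat it as a legacy option and prefer Search Console today). As a small sketch, this helper just builds the properly URL-encoded ping address; example.com is a placeholder:

```python
# Build the (legacy) Google sitemap-ping URL for a given sitemap.
# example.com is a placeholder; Google deprecated this ping
# endpoint in 2023, so Search Console is the supported path now.
from urllib.parse import urlencode


def sitemap_ping_url(sitemap_url):
    """Return the GET URL that notified Google of a sitemap update."""
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})


if __name__ == "__main__":
    print(sitemap_ping_url("https://example.com/sitemap.xml"))
    # https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml
```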