I've launched a lot of new websites over the years and I've experimented with many different methods of getting them indexed. Here are my top four indexing tips...
1) Submit your website URL to Google
This is a bit of a no-brainer but it really has to be first on this list. Google can’t index your website if it doesn’t know about it, so submitting your homepage URL is definitely the best place to start. This may not be the quickest indexing method, but it’s a safe and sure one to start you off.
Of course, Google makes no guarantees that it will index every URL that is added, but don’t let this stop you. Just make sure you only add one URL per domain (normally your homepage), otherwise it could be considered spam. Google’s crawlers will find all the other pages in your website by following internal links.
- Here’s the link to submit your URL to Google:
www.google.com/addurl
- Yahoo also has a similar ‘Add URL’ function:
search.yahoo.com/info/submit.html
- And so does Microsoft Live:
search.live.com/docs/submit.aspx
2) Add an XML sitemap
It can take quite some time for Google’s crawlers to index all the pages in a new website just by following links. The larger the website, the more time it can take. Pages at a high click depth from your homepage can take a lot longer to get indexed because the crawlers don’t discover them until several rounds of crawling and link following have occurred. I find that adding an XML sitemap really solves this problem because it tells Google about all your pages ahead of time. If you have a large website with many high-click-depth pages, an XML sitemap will help indexing enormously.
An XML sitemap is basically a text file (saved with an .xml extension) that lists all the URLs in your website. The XML sitemap protocol is very simple, so a sitemap can easily be created by hand or automatically with an XML sitemap generator tool. The standard is supported by Google, Yahoo! and Microsoft, so the same sitemap can be used for all three search engines.
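To give you an idea of just how simple the protocol is, here’s a minimal sketch of a sitemap generator. The function name and the example.com URLs are my own illustrations, not part of any particular tool; the urlset namespace is the one from the sitemaps standard.

```python
# Minimal sketch of an XML sitemap generator. The example.com pages
# below are placeholders for your own URLs.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # escape() handles characters like & that aren't valid raw XML
        lines.append('  <url><loc>%s</loc></url>' % escape(url))
    lines.append('</urlset>')
    return '\n'.join(lines)

pages = ['http://www.example.com/',
         'http://www.example.com/about.html',
         'http://www.example.com/articles/deep-page.html']
print(build_sitemap(pages))
```

Save the output as sitemap.xml in your website’s root directory and it’s ready to submit. The full protocol also lets you add optional tags like lastmod and changefreq per URL, but a bare list of loc entries is perfectly valid.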
Once you have created your sitemap file, you have to submit it to each search engine. To add a sitemap to Google you must first register your website with Google Webmaster Tools. This is well worth the effort: it’s completely free, and it’s loaded with invaluable information about your website’s ranking and indexing in Google. You’ll also find many useful reports, including keyword rankings and health checks. I highly recommend it.
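The sitemaps protocol also defines a “ping” mechanism for telling a search engine your sitemap has changed, and Google documents an endpoint for it. Here’s a sketch of building that ping URL (example.com is a placeholder; the endpoint is the one from Google’s documentation at the time of writing):

```python
# Sketch: constructing Google's sitemap "ping" URL. Fetching the
# resulting URL (even just in a browser) asks Google to re-read
# the sitemap. example.com stands in for your own domain.
from urllib.parse import quote

sitemap_url = 'http://www.example.com/sitemap.xml'
# The sitemap URL must be percent-encoded when passed as a parameter
ping_url = ('http://www.google.com/webmasters/tools/ping?sitemap='
            + quote(sitemap_url, safe=''))
print(ping_url)
```

This doesn’t replace registering with Webmaster Tools (you won’t get any of the reports that way), but it’s a handy way to nudge the crawlers after you update your sitemap.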
Similarly, you can add an XML sitemap to Yahoo! through the Yahoo! Site Explorer feature. Like Google, you have to authorise your domain before you can add the sitemap file, but once you are registered you have access to a lot of useful information about your site.
Microsoft doesn’t have a dedicated way to submit a sitemap yet (they always seem to be so far behind the competition), so you have to use their regular ‘Submit my site’ form instead.
3) Submit an article to Digg
Another interesting thing I’ve noticed lately has to do with social news websites. If you submit an article to Digg, Reddit or one of the many other large social news websites, your URL tends to get picked up by Google very quickly. Typically a Digg article will appear in Google’s index after only a day or two. This is great news if you want new pages on your website to be indexed very quickly.
The downside to social news submission (if you can call it a downside) is that the URL only stays in Google’s index for a few days to a week before it drops out again. After this happens it seems to be crawled as normal, eventually appearing in the index for good after a more natural timeframe. The only exception to this rule is when an article becomes extremely popular and rises to the front page of the news website - these tend to stay in the index and not drop out at all.
Social news websites are a great way to get new websites indexed. If your site has lots of interesting content you could even find yourself on the front page so it’s well worth giving them a try.
4) Add a Google map
I discovered this little trick just the other day when I was helping my girlfriend build her big doodles website. Felicity’s always drawing cute little pictures; she scans them in at super-high resolution, cuts them up into tiles, and displays them on her website with the Google Maps API (it’s a great way to explore massive images on a small-bandwidth connection). To make the ‘doodle map’ work on her domain we first had to apply for a Google Maps API key. We did this, then played with a few test pages on the live domain - and to my surprise, after a couple of days her website was ranking on the first page of Google for "big doodles". I hadn’t even submitted the domain to Google yet!
Eventually I figured out what was happening. One of the Google Maps API conditions is that the maps you create must be public (i.e. not behind a login screen). As an extension of this, it seems that pages (or domains) that use the Google Maps API are crawled and made public. Very neat!
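If you’re curious about the tiling side of the trick, here’s a rough sketch of the arithmetic. It assumes the standard 256-pixel Google Maps tiles, with the full-resolution scan shown at the deepest zoom level and the image halved in each dimension at every level above it; the function name and the 4096-pixel example are my own illustrations.

```python
# Sketch: how many tiles a scanned image needs at each zoom level,
# assuming 256x256 tiles and a halving of the image per zoom level out.
import math

TILE_SIZE = 256  # standard Google Maps tile size in pixels

def tile_grid(width, height, zoom, max_zoom):
    """Return (columns, rows) of tiles covering the image at this zoom.

    The full-resolution image is shown at max_zoom; each level below
    it halves the image in both dimensions.
    """
    scale = 2 ** (max_zoom - zoom)
    cols = int(math.ceil(width / scale / TILE_SIZE))
    rows = int(math.ceil(height / scale / TILE_SIZE))
    return cols, rows

# A 4096x4096 scan needs a 16x16 grid of tiles at its native zoom...
print(tile_grid(4096, 4096, zoom=4, max_zoom=4))  # (16, 16)
# ...but shrinks to a single tile when fully zoomed out.
print(tile_grid(4096, 4096, zoom=0, max_zoom=4))  # (1, 1)
```

Cut the scan into tiles along that grid, serve them from a custom tile layer, and the browser only ever downloads the handful of tiles currently in view - which is why huge doodles work fine on a slow connection.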