Or: why the heck is Googlebot refusing to look at my site when everyone else browses it just fine?
First things first -- this is just something that happened to me... today. On a client's website. On the day it went live. And it... drove... me... MAD! You see, the website was working just fine. It had everything required, and was (or so I thought) as SEO-friendly as I could make it. But I never anticipated the ways human error can strike.
A bit of background first: the site had a couple of people responsible for writing content, and a senior editor for correcting and publishing it. The frontpage was a single node, which had all of its node content hidden via node.tpl.php (a really bad idea), and was populated with blocks to achieve the necessary layout.
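For the curious, the "bad idea" looks roughly like this -- a hypothetical Drupal 6-style node.tpl.php sketch (variable names per that template's conventions) where the `$content` print is simply left out, so whatever the node actually renders, including error messages, never reaches the visitor:

```php
<?php
// Hypothetical node.tpl.php sketch: $content is deliberately never printed,
// so the node's own output -- including Drupal's "Access denied" message --
// is invisible. Blocks assigned to the page's regions fill the screen instead.
?>
<div id="node-<?php print $node->nid; ?>" class="node">
  <?php /* print $content; -- intentionally omitted */ ?>
</div>
```

The page *looks* complete because the blocks render regardless, which is exactly what masked the problem below.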
The senior editor, not knowing what that page thing was (who could imagine that a page titled "Home" is your homepage?), unpublished it. From then on, everyone who visited the site actually got a 403 response -- but the "Access denied" message never appeared, since it was rendered inside the node content area, which was hidden. Googlebot (and most other spiders) will not index a page if the response status is anything but 200, no matter how much other content the page is stuffed with.
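This is the trap in miniature: a page can serve a perfectly normal-looking HTML body while the HTTP status code says "go away." Here is a small self-contained Python sketch (a toy local server, not the client's site) reproducing the symptom -- a browser-ish fetch gets a readable page, but the status a crawler checks first is 403:

```python
import http.server
import threading
import urllib.error
import urllib.request

class DeniedButPrettyHandler(http.server.BaseHTTPRequestHandler):
    """Mimics the broken frontpage: a full-looking HTML body, but a 403 status."""

    def do_GET(self):
        page = b"<html><body>Looks like a normal homepage</body></html>"
        self.send_response(403)  # access denied, like an unpublished node
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(page)))
        self.end_headers()
        self.wfile.write(page)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Spin up the toy server on a random free port.
server = http.server.HTTPServer(("127.0.0.1", 0), DeniedButPrettyHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

# A human sees the body either way; a crawler looks at the status first.
try:
    urllib.request.urlopen(url)
    status, body = 200, b""
except urllib.error.HTTPError as e:
    status, body = e.code, e.read()

print(status)   # 403 -- this is all Googlebot needs to skip the page
server.shutdown()
```

In practice, a quick `curl -sI http://example.com/ | head -n 1` (substitute your own URL) shows the status line without any of the body, which is the fastest way to catch this.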
So everyone who visited could "see" the frontpage, but the client could not add it to Webmaster Tools. After a couple of hours of digging through it... checking a single checkbox -- the node's Published box -- solved it. Just so you know, and don't repeat the same mistake :)
(Article photo by gui.tavares)