Long ago I heard a tale (perhaps from Google itself) about how bad, and how dangerous from the standpoint of being banned by a search engine, it is to use extremely low-contrast text: say, bright yellow text on a white background, or dark brown on black. The story went that you could be thrown out of the search engine's index on the spot for that kind of stunt.
The reason is clear enough: with this deceptive tactic you could hide irrelevant keywords from human visitors while showing them whatever content you choose. At the same time, those hidden keywords would presumably still be indexed by search engines, making your pages findable by a wider audience.
Now, how do search engine spiders tell whether the text on a page is poorly visible, or even completely invisible?
Before CSS this was easy to determine just by looking at the HTML tags surrounding the text. In the CSS age, a search engine bot would have to read the stylesheet, which is typically stored in a file separate from the page being analyzed.
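For what it's worth, once a bot has the foreground and background colors, judging visibility is simple arithmetic. Here is a minimal sketch of one plausible heuristic, using the WCAG relative-luminance and contrast-ratio formulas; this is only an illustration of the idea, not anything Google has documented about its crawler.

```python
# Sketch: flagging low-contrast text with the WCAG 2.x contrast formula.
# The color pairs below are hypothetical examples from the article's
# scenario, not data from any real crawler.

def relative_luminance(rgb):
    """Relative luminance of an sRGB color (0-255 per channel), per WCAG."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio, from 1:1 (identical) to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Bright yellow on white: ratio near 1, i.e. practically invisible.
print(round(contrast_ratio((255, 255, 0), (255, 255, 255)), 2))  # 1.07
# Black on white: maximum contrast.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))      # 21.0
```

A crawler applying something like this could flag any text whose ratio falls below some threshold (WCAG itself recommends at least 4.5:1 for normal body text). The hard part is not the math but getting the colors at all, which requires fetching and applying the CSS.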
However, a quick grep through the last two months' access logs for STEREO
grep css < access.log | grep -i googlebot
... revealed NO interest whatsoever from Googlebot in .css files!
So how does Google pull off the trick? Or is this a green light for the crooks once again?