This post originated from an RSS feed registered with Java Buzz
by dion.
Original Post: Google Web Accelerator: Gather Human Stats
Feed Title: techno.blog(Dion)
Feed URL: http://feeds.feedburner.com/dion
Feed Description: blogging about life the universe and everything tech
Google just announced the beta service: Web Accelerator. The magic behind this is that the Google Grid is now a proxy itself.
It is even a smart proxy that can send diffs rather than full pages, and can gzip the content.
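To get a feel for the gzip side of that, here is a toy sketch (nothing to do with Google's actual implementation) showing how much a repetitive HTML page shrinks under gzip:

```python
import gzip

# Hypothetical page: HTML is full of repeated tags, so it compresses well.
html = b"<html><body>" + b"<p>Hello, web!</p>" * 200 + b"</body></html>"

compressed = gzip.compress(html)

# The proxy would send the smaller gzipped body and let the
# browser (or accelerator client) decompress it on arrival.
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes")
```

On text-heavy pages like this, the compressed body is a small fraction of the original, which is where a lot of the perceived speedup would come from.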
The most interesting part of this whole thing is how Google can use the data. I am not talking about the privacy folks jumping up and down warning you not to use it; rather, it can be used for GOOD.
One problem with Google crawling around the web is that the GoogleBot is a bot. It isn't human.
Now, if enough people use this service, Google will be able to work out which links actually matter. If a popular site such as Slashdot put hidden links on a page, Google wouldn't be fooled, as NO humans would be following those links.
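The idea above boils down to something simple: cross-reference the links a crawler finds with the clicks real humans make through the proxy, and distrust links nobody ever follows. A toy sketch, with made-up link names and click counts:

```python
# Hypothetical data: links the crawler found on a page, and per-link
# click counts observed from real human traffic through the proxy.
crawled_links = ["/story/123", "/story/456", "/hidden-spam-link"]
human_clicks = {"/story/123": 5000, "/story/456": 1200}  # made-up numbers

# Keep only links that at least one human has actually followed;
# the hidden link, which no human ever clicks, gets filtered out.
trusted = [link for link in crawled_links if human_clicks.get(link, 0) > 0]
print(trusted)
```

A real system would obviously need thresholds and noise handling rather than a simple "more than zero clicks" cutoff, but the signal is the same: hidden links leave no human footprint.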
This could be huge, and could feed into Trust Rank by grokking the human element. Interesting stuff. I am a little sceptical about how much the tool will actually speed things up, but I will give it a shot :)