Google algorithms, security by obscurity?
It is an oft-cited security maxim that security by obscurity doesn’t work. That is, if a system is secure only because its details are kept secret, it isn’t really secure. People are very good at finding out secrets, and the obscurity only prevents others from finding and pointing out weaknesses.
I just read this article about Jimmy Wales (at Wikia) developing a new search engine to compete with Google. He brings up a super interesting point: Google’s search is, basically, a black box – you don’t know what they are doing. This is security (of a sort) by obscurity – and people are routinely gaming that system. Black hat SEOers have a very good idea of how the system works and use it to their advantage.
Can an open algorithm do better? Can it, through the eyes of the public, produce a ranking system that is genuinely fair and secure from gaming? Search ranking seems like a genuinely difficult problem that few people are capable of really digging into, and a significant portion of those few, I suspect, are employed by Google and Yahoo. One big advantage Wikia has is that they’re just starting. I suspect that Google is saddled with legacy architecture and that, even if they wanted to, they couldn’t drastically change the way they index and rank pages. Take a look at how long it took Yahoo to upgrade their systems to Panama. Wikia (and any other new search engine) can stand on the shoulders of Google and start fresh with all the learning and experience that Google has already gone through.
I’m not sure how well this will turn out, but I’m quite interested to watch and see what happens.
Interestingly enough, I also stumbled upon this article where Schneier puts the hurt down on Eric Schmidt.