Sunday, September 23, 2007

The human-computer algorithm

The New York Times has published this ode to the algorithm. They write:
Algorithms, as closely guarded as state secrets, buy and sell stocks and mortgage-backed securities, sometimes with a dispassionate zeal that crashes markets. Algorithms promise to find the news that fits you, and even your perfect mate. You can’t visit without being confronted with a list of books and other products that the Great Algoritmi recommends.

The article is in a section titled "Artificial Intelligence - Computers and the Internet", but its real focus is the trend of algorithmically harnessing human intelligence -- the goal of Luis von Ahn. They mention his Google Image Labeler, which is an explicit algorithm with human subroutines. But perhaps more interestingly, they describe Wikipedia as a human algorithm --

A constantly buzzing mechanism with replaceable human parts. Submit an article or change one and a swarm of warm- and sometimes hot-blooded proofreading routines go to work making corrections and corrections to the corrections.

Generally Wikipedia is thought of as an entirely human project, but why not see it as a giant distributed fault-tolerant algorithm? I think it is right to regard algorithms as ideas rather than as software code. Algorithms can exist independently of computer hardware, as entities (worthy of study) entirely unto themselves. After all, we don't think of algebraic fields as objects whose existence is inherently tied to paper or chalkboards. Why should algorithms exist only in silicon?
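The idea of "an algorithm with human subroutines" can be made concrete in code. The sketch below is purely illustrative, not von Ahn's actual design: it treats the human as a callable oracle, and accepts a label for an image only when independent rounds agree, loosely in the spirit of the ESP Game / Google Image Labeler. The function and file names are hypothetical.

```python
# Illustrative sketch only: an algorithm whose inner step is a human.
# The human is modeled as a callable oracle passed in as a parameter.
from collections import Counter
from typing import Callable, Optional

def label_image(image: str,
                ask_human: Callable[[str], str],
                rounds: int = 4,
                agreement: int = 2) -> Optional[str]:
    """Query the human oracle `rounds` times; return the first label
    proposed in at least `agreement` independent rounds, else None."""
    votes = Counter()
    for _ in range(rounds):
        votes[ask_human(image)] += 1
    for label, count in votes.most_common():
        if count >= agreement:
            return label
    return None

# In real use ask_human would block on a person; here a scripted
# stand-in demonstrates the control flow.
answers = iter(["cat", "kitten", "cat", "animal"])
result = label_image("photo_0042.jpg", lambda img: next(answers))
print(result)  # "cat" wins with two agreeing rounds
```

The point is that nothing in the control flow cares whether `ask_human` is silicon or a person; the algorithm is the same idea either way, which is exactly the sense in which Wikipedia's edit-and-proofread swarm can be read as an algorithm with replaceable human parts.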

For other takes on this article, see here.
