Strong Theory of Artificial Stupidity
I was elated, sitting in class on the first day of my first college course on artificial intelligence (AI). I couldn't believe my luck: I could get college credit for studying something that seemed like sci-fi. I sat there with my mind wide open.
I learned about logic, Alan Turing, and Searle's Chinese Room. As a physics student, this was just an elective, but I ate it all up. I secretly hoped that the Strong Theory of AI was true -- that we could someday build a sentient artificial intelligence like our own, only better.
In my reading I came across a magical idea to which I am still drawn. It is an idea to which many subscribe, and which few try to explain -- the emergence of strong AI. The idea is this: if you have a sufficient number of computational units, each unintelligent on its own, and you connect them in the correct manner, strong AI will emerge. Now that I also study complex systems, I find this thought even more compelling.
Take any sufficiently large set of stupid components and combine them => intelligence emerges. Wow! I hate to say it, but it sounds like a god of the gaps argument. Even so, it's an attractive idea.
Fast-forward a few years. I was working at a company with thousands of employees and offices all over the world. I was struck on a daily basis by the extreme contrast between big-company culture and small-company culture. I had worked in several startups. This was my first large organization.
In startups people do what they gotta do -- to water the seeds of the company's future. Everybody helps out with almost everything. You overhear the phrase "many hats" way too much. It's a lot of fun. It's exciting. You never really know what will happen next. Will we get bought by Google? Will we go out of business next week? Will we entirely change our technological focus? People in small companies are forced to stay mentally nimble -- to be ready for anything. You have to keep your wits about you.
In large companies people do what they gotta do -- period. The cliché is that the larger a company becomes, the more its employees walk around saying, "That's not my job!" In my opinion, this cliché is true. As George Lucas once said, "Don't avoid the clichés -- they are clichés because they work!"
What causes this? What causes otherwise capable, multi-talented people to eschew all duties not matching their precise job description? Is this the same process that led to specialization in early civilization? Is each employee's sense of ownership diluted with each new hire? Perhaps each person knows implicitly that their impact on the company's destiny is smaller, which undermines a key workplace motivator (see this talk by Dan Ariely). In my own experience, I have found that large companies full of smart people are able to make profoundly stupid decisions. I would love to hear some of your stories in the comments section!
One might call this the Strong Theory of Artificial Stupidity:
Take any sufficiently large set of intelligent people and combine them => stupidity emerges.
Do you have a story of the stupidity of groups? Please share below!