Moore’s Law And The 30-Year Rule

Editor’s note: Paul Johnson is a co-founder of Uncommon Union.

Citations of Moore’s Law are growing exponentially. In fact, the number of articles with some mention of the law, which has popularly come to mean that computing capacity doubles roughly every 18 months, is accelerating. TechCrunch alone returns 220 pages of results. If you consider the comments, trackbacks, and social mentions, it is only a matter of time before the Internet is just one large recitation.
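For perspective, here is a minimal back-of-the-envelope sketch of what that popular version compounds to over the thirty-year span this piece cares about. The 18-month doubling period is the common reading of the law, not Moore's original formulation, and the baseline capacity of 1 is an arbitrary placeholder.

```python
# Back-of-the-envelope: what an 18-month doubling cadence compounds to
# over a 30-year span. The 18-month figure is the popular reading of
# Moore's Law; the starting capacity of 1 is an arbitrary baseline.

DOUBLING_PERIOD_YEARS = 1.5   # the popular "18 months" version
SPAN_YEARS = 30               # the horizon of the thirty-year rule

doublings = SPAN_YEARS / DOUBLING_PERIOD_YEARS   # 20 doublings
growth_factor = 2 ** doublings                   # 2^20 = 1,048,576

print(f"{doublings:.0f} doublings -> roughly {growth_factor:,.0f}x capacity")
# 20 doublings -> roughly 1,048,576x capacity
```

Twenty doublings is a millionfold increase. The point of what follows is that culture needs roughly that same thirty years to figure out what to do with it.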

I tease. I don’t begrudge the technosphere its fantasy of Law Giver. Readers of Plato still entertain the Philosopher King, and I have heard that artists and other creative types regularly project visions of grand influence. The truth of Moore’s Law, which is hotly contested, is not what matters here. Rather, what attracts my skepticism is how the law has been extrapolated, on the strength of its own success, into so many other areas. The temptation of forecasting rapid growth is enormous, the proliferation of technology is profound, and anticipating great things is well-intentioned.

There is an interesting counterpoint to hold in one’s mind when participating in an industry of grown-up cyberpunks jacking into the infinite. It is known in the humanities, in media studies, and among cultural theorists as the “thirty-year rule.” It by no means invalidates Moore’s Law, but it does reframe it.

The basic idea of the rule is that it takes time for expertise to build, knowledge to grow, and a “golden age” that typifies a form to emerge. Many examples are commonly cited, from cinema to video games, and each follows a path from early experimentation to a first mature form. This, of course, is not a hard science; starting dates and criteria will differ.

Thirty-year types are not merely floating in Moore’s wake as a sort of clean-up crew, slowly aestheticizing the flotsam. They are the landmarks against which one navigates; they right the ship. This is a profoundly interdependent relationship. It’s about collaboration. Where would the tech world be without culture, without Star Trek or Star Wars, the irrational fantasies that inspire exploration? Where would the culture be without tools?

There are many kinds of collaboration. Business collaboration is, of course, very different from scientific collaboration. One might even consider social shopping a collaborative experience. The Internet has always been thought of as collaborative, a reputation that sometimes veers into the utopian and other times into the suspicious. Although the legacy is impressive, we also see the liabilities: security, privacy, economic. There’s more to collaboration than throwing data into a big bucket. Maybe this is obvious, but that’s my point.

Before Ebola dominated the headlines, you might remember Obama’s remarks at the U.S.-Africa Leaders Summit. The rapid expansion of mobile technologies throughout Africa has fueled speculation about the “Leapfrog Effect,” a sort of reworking of Moore’s Law. Although the idea of Africa skipping several stages of development is hopeful, it risks reducing economic policy to a grand bargain among telecoms. It smacks of a land grab.

Collaboration, unfortunately, is not a technological inevitability. It may require a 30-year perspective rather than an annual horizon. Moore’s Law doesn’t take into account the evolution of the applications surrounding a technical advance; it’s not meant to.

In any event, the 30-year rule is groping for something even more elusive than filled-out capacity. It is not just about loading YouTube’s servers with content; it is about maturity, expression, mastery and agency. We need to shoot for nothing less than a collaborative golden age, rather than an 18-month flip.