Once upon a time, long ago, there were computers, and they were very large. Warehouse large. Multi-story large. Tractor-trailer large. So large that in those mythical days of yore, now lost in the mists of time, men were real men(1), women were real women(2), and bugs were actual physical bugs(3), to paraphrase Douglas Adams.
Then a terrible but brilliant man named William Shockley went and invented the transistor(4) and recruited America’s best and brightest(5) to come with him to the faraway West, into a valley south of San Francisco which at the time was a rural location with no long-distance phone service, best known for being especially fertile agricultural land.
There they did his evil bidding(6), viz., use this newfangled transistor malarkey, rather than the traditional vacuum tubes that God clearly intended for us to use(7), to shrink computers down until they were tiny. (Also: faster, more powerful, more efficient, etc.) But eight of America’s best and brightest soon noticed that Shockley was evil(8) and fled from his domain to found their own. These rebels have since been known as the Traitorous Eight(9).
The Traitorous Eight built companies like Intel, where, eventually, was birthed the first microprocessor. Their progeny have been building ever smaller and more powerful microprocessors since. On the foundations laid by Shockley and the Traitorous Eight were built Silicon Valley, and the global computer industry, and, well, basically, the entire world as we know it. (Care to guess how many transistors are within a city block of you right now? No, guess again. No, larger. No, no, no. Much larger.)
All of which is wonderful, of course. But one side effect is that, to most people, microprocessors have essentially become magic. Which is fair enough, because in many ways they are magical — but it makes me uneasy. Perhaps because I have an electrical engineering degree, I have this quaint, old-fashioned notion that people should understand that technology is not magic, it’s just clever engineering full of compromises and trade-offs, with real quirks and restrictions.
(For one thing, it would make people less susceptible to accepting the unsubstantiated promises of “big data” and “machine learning” and the like. Which in many contexts are very real and extremely useful things, of course—but there seems to me to be a worrying trend to treat them as magical secret sauce that can be sprinkled on anything to make its algorithms smarter and more insightful.)
And so I give you, with only a little further ado, a venture after mine own engineering heart, one whose clones should be in every university in the world. Needless to say it is instead an obscure passion project in the midst of entirely unfunded development by a single British boffin, in the exceedingly magnificent tradition of such.
To quote from the project’s home page:
What? — The Mega-processor is a micro-processor built large. Very large.
How? — Like all modern processors the Mega-processor is built from transistors. It’s just that instead of using teeny-weeny ones integrated on a silicon chip it uses discrete individual ones like those below. Thousands of them. And loads of LEDs.
Why? — short answer: Because I want to.
Why? — long answer: Computers are quite opaque, looking at them it’s impossible to see how they work. What I would like to do is get inside and see what’s going on. Trouble is we can’t shrink down small enough to walk inside a silicon chip. But we can go the other way; we can build the thing big enough that we can walk inside it. Not only that we can also put LEDs on everything so we can actually SEE the data moving and the logic happening. It’s going to be great.
Someone please get this man a big budget and a slew of awards, stat.
(1) Not really.
(2) Not really.
(3) Yes, really. Well, OK, not exactly.
(4) An oversimplification. It was collaborative. But Shockley played a key role, and shared a Nobel Prize for it.
(5) Not all of them.
(6) Not really. Though he was a terrible man.
(7) Not really.
(8) Or at least a complete jerk.
(9) Yes, really.