Happy anniversary to me: I’ve now been writing this here weekly column for exactly three years. Over the last year I have opined, prescribed, and predicted many things. And now, like last year, and the year before, as part of my one-man crusade for greater opinion-journalism accountability, I’m going to take a moment to go back and look at what I got right… and where I went horribly, hilariously wrong.
OK, let’s start with my primary theme this year: technology and jobs. I actually asked the key question in a post two years ago, entitled “What If Technology Is Destroying Jobs Faster Than It’s Creating Them?”
This year, though, I expanded on that at considerable length with “America Has Hit ‘Peak Jobs’,” “Get Ready To Lose Your Job,” “After Your Job Is Gone,” (sense a theme here?) “The Future Of Work,” “Jobs, Robots, Capitalism, Inequality, And You,” and “Meet The New Serfs, Same As The Old Serfs.”
Whew. Tired of it yet? I can’t blame you. But I keep hammering at it because I believe this is one of the most important issues of our time. The notion that technology destroys more jobs than it creates has slowly become mainstream over the last few years: witness this recent piece in The Economist. My take on it, however, is different: I think all these job losses are good, in the long run, because we are (hopefully) at the very edge of a slow, decades-long trend towards zero jobs, i.e. a post-scarcity society. The trouble is, our current economic structure is built on those building blocks called “jobs” — and as their number slowly withers away, the necessary transition to a new system will be extremely painful and wrenching for a very large number of people worldwide.
OK. Back to the recap. Way, way back, to my very first TechCrunch post, which began: “Oh, Research In Motion. You never miss an opportunity to miss an opportunity.” Prophet score: A++. And while that may seem like shooting fish in a barrel now, back then, believe it or not, it was fairly controversial.
I also wrote a couple of pieces about Bitcoin back in 2011, saying: “Does Bitcoin have a long-term future? I strongly doubt it…but I expect that something like Bitcoin eventually will.” Which, I note, is a whole lot like what Naval Ravikant wrote this July: “It’s better to think about Bitcoin the protocol as Bitcoin 1.0, destined to evolve.” I followed that up this year with the suggestion that a Bitcoin-like currency would eventually go mainstream in the developing world, not the rich world. Here in the medium term, of course, Bitcoin is bubbling along nicely…
I wrote about the NSA and the incipient panopticon back in 2011, too, saying: “surely there are better ways to catch these morons than building a vastly expensive and dehumanizing panopticon surveillance state.” Quite happy to stand by that one, too. I’ve been writing a lot about surveillance ever since, though I’ve actually ramped down since it became well-trodden media ground thanks to Edward Snowden.
I complained about 3D printers twice. Well, I wrote that they’ll become amazing world-changing technology, at the enterprise level — but I wrote those in articles entitled “There Is No Reason For Any Individual To Have A 3D Printer In Their Home,” and “3D Printers Are Not Like 2D Printers: A Rant.” This remains a contrarian view, but I’m happy to stand by it.
I also claimed “All Journalism Is Tech Journalism Now,” which was admittedly a little aggressive, but I still think we’re heading that way. Same for “The Technical Interview Is Dead,” and “Prepare To Pay For Your Privacy.” (See also.)
Bored with my self-congratulation? Let’s move on to where I messed up. Six months ago I proudly proclaimed “The Time Has Come For Chrome In The Home,” a celebration of Google’s ChromeOS. It’s early days yet, but I think I got that one wrong. I’ve hardly touched my own Chromebook since I wrote that post. Meanwhile, ChromeOS isn’t actually that much more capable than a tablet — and tablet prices are dropping faster.
Last year I wrote “I Believe In Google Plus,” but this year my view evolved into “Google Plus Is Like Frankenstein’s Monster.” Which is actually mostly a compliment, as those of you who have read the original novel know; the monster was brilliant, urbane, civilized…but spurned by the world. Alas, that seems true of G+ too.
And then there’s Foursquare. I complained about them in my second ever TC post and in “Check In, Flame Out: How To Save Foursquare,” where I described it as a “long-term loser.” On reflection, that was probably too harsh; I think they’ll probably keep eking out an existence on the perpetual edge of mainstream relevance.
Last year, I wrote “Whither, Hollywood, Whither?“, in which I argued that “it seems to me that the predatory price-gouging Internet is more dangerous to movies than television,” and this year I followed it up with “When Will Doom Come To Hollywood?” where I, er, revised my opinion somewhat.
The prediction I’m personally most interested in, though, is one I made last year: “In Five Years, Most Africans Will Have Smartphones,” which I followed up this year with “The Second Billion Smartphone Users.” We won’t know until 2017 whether I’m right on that one — but there are claims that smartphone penetration has already risen to 21 percent in the Middle East and Africa, up from 1.3 percent in 2009. If that’s true, then 50 percent by 2017 looks downright conservative.
So: while I wouldn’t bet all your bitcoins on every word I type, that’s a decent performance nonetheless, if I do say so myself. And I hereby resolve to be a little bolder over the next 12 months with my predictions — because if nothing else, it ought to make next year’s iteration of this post awfully entertaining.