The Huffington Post just published my op-ed on Facebook. Give it a read and if you agree, please consider sharing it with all your “friends.”
The larger concern underlying this piece, an obsession of mine for decades, is a familiar asymmetry: between the seemingly limitless human capacity to create and innovate, and our less impressive track record at managing or controlling our creations. The classic example is the atom bomb: a triumph of will, and an existential threat.
If only everyone would read Richard Rhodes’ sublime history, The Making of the Atomic Bomb. Perhaps the best technology book ever written. In its sequel, Dark Sun: The Making of the Hydrogen Bomb, Rhodes concludes that what can be built will be built. So our challenge, always, is what to do once the cat is out of the bag. Significantly, the makers of the atomic bomb generally regretted having helped create it. Similarly, in the Facebook piece mentioned above, three contributors to the creation of Facebook now express second thoughts: Sean Parker, Roger McNamee, and Chamath Palihapitiya. To date, the only Silicon Valley people I’m aware of who’ve read Rhodes’ atom-bomb book are Marco Zappacosta, Sam Altman, and Ben Rosen.
During my adulthood, digital technologies — from microprocessors and cell-phone networks to big data, cloud computing, AI, and yes, social networks — have combined to create challenges as threatening, in my view, as atomic weaponry.
What could go wrong? For starters, how about AI developed by the knuckleheads who invented social networks? Or read this Wall Street Journal piece about the leading edge of state surveillance.
The Facebook piece describes one particular issue that’s pertinent right now: social-media business models that threaten civil society. They won’t kill us, but the situation is bad. Read the article, look ahead five or ten years, then reflect: Do the creators of our future know what the heck they’re doing? Decide for yourself.
It’s hard for us, collectively, to think well about phenomena that develop slowly and then suddenly explode into wildfire. Commercial internet technology is one example: It’s been around for 20 years. Everyone knows about it. We imagine our thinking is up to date, yet we haven’t caught up to the implications. In truth, we’ve hardly begun to respond.
Concerning the cognitive aspect of the topic, you might want to read Kathryn Schulz’s lively Being Wrong: Adventures in the Margin of Error. It starts with the observation that human beings are literally incapable, in the present moment, of recognizing that they are wrong. We’re hard-wired that way. We might see that we were wrong a minute ago, but right now, all of us always believe we’re right. We should be asking ourselves — seriously — whether anyone is right, about anything, right now.
When long-cycle threats develop at continuously accelerating rates, our species’ peerless adaptive skills can fail us. The atom bomb was a long time coming. It arrived suddenly, with a couple of very public bangs that provoked worldwide reaction, yet we’re still scared today about Kim Jong-un and Donald Trump blowing us up. By contrast, the internet kind of crept up on us. Instead of reacting, we just got used to it. And with some threats, for instance climate change, we may go for denial rather than action.
Given that the fruits of human inventiveness increasingly present long-cycle threats developing at continuously accelerating rates, I wonder whether we can keep pace. If not, homo sapiens may pass its sell-by date.
Hoping I’m wrong. Happy holidays.
Facebook Can’t Be Fixed, by John Battelle
How to Fix Facebook — Before It Fixes Us, by Roger McNamee
Zuckerberg’s 2018 personal challenge (January 3, 2018)
Photo credit: Bloomberg via Getty Images