The computing of distrust

A look at what lies ahead in the disenchanted age of postmodern computing.

By Mike Loukides
January 6, 2015
Ominous II. (source: James Loesch on Flickr)

Sometime last summer, I ran into the phrase “postmodern computing.” I don’t remember where, but it struck me as a powerful way to understand an important shift in the industry. What is different in the industry? How are 2014 and 2015 different from 2004 and 2005?

If we’re going to understand what “postmodern computing” means, we first have to understand “modern” computing. And to do that, we also have to understand modernism and postmodernism. After all, “modern” and “postmodern” only have meaning relative to each other; they’re both about a particular historical arc, not a single moment in time.

Some years back, I was given a history of St. Barbara’s Greek Orthodox Church in New Haven, carefully annotated wherever a member of my family had played a part. One thing that stood out from early in the 20th century was AHEPA: the American Hellenic Educational Progressive Association. The mere existence of that organization in the 1920s says more about modernism than any number of literary analyses. In AHEPA, and in many similar societies spanning many churches and many ethnic groups, people were betting on the future. The future is going to be better than the present. We were poor dirt farmers in the Old Country; now we’re here, and we’re going to build a better future for ourselves and our children.

The period that we call “modernism,” for lack of a better word, is characterized by precisely that faith: we were building a better world, through cooperation, through self-help, and above all, through science. The future was going to be nothing like the present: we were going to have flying cars, jet packs, and robotic servants (like the Jetsons). There were corporate slogans like “better living through chemistry” (DuPont, 1935) and “making tomorrow come just a little bit sooner” (can’t find the source, but I swear I heard it on TV as a kid). Nuclear energy was going to be safe and clean, and would deliver infinite power to all of us. Wonder Bread built strong bodies 12 ways. We were going to live in a world that wasn’t constrained by limited resources. Bigger, better, faster: we might be shopkeepers or farmers, but our children certainly wouldn’t be.

By the late 60s and 70s, “modernism” was crumbling big time. We were in the process of shaking off a meaningless war; we were realizing that our notions of “progress” only extended to a small minority (not women, certainly not blacks or Hispanics, and not even poor whites). We also shook off the delusion of clean power from nuclear energy: while it’s certainly possible to build a safe nuclear plant, it’s almost impossible to imagine a corporation operating the plant safely. We no longer trusted government, we no longer trusted engineering, and we were losing our faith in science.

Of course, there was no single moment when we suddenly switched off modernity and became postmodern. Bits of modernist optimism survived well into the 90s, and arguably still exist. If you were around for the beginnings of the web, you probably remember any number of articles about how the web was going to bring about global peace and understanding. About how we were building a new future for ourselves, in which there would be no discrimination, no hatred, no misunderstanding. Nobody would know (or care) that you’re a dog.

Modernist computing is the remnant (or possibly rebirth) of that early-20th-century optimism, reappearing with the Internet and the web. In truth, what happened was profoundly democratizing: computers quickly went from machines of which only five were needed worldwide (IBM in the 50s), to something that nobody would ever need at home (DEC in the mid 70s), to necessities. We may be in the final years of the telephone; I wouldn’t be surprised if Skype, Hangouts, and other technologies replace it entirely in the next 10 years.

But if modernism lasted longer in computing than it did in other areas, it’s coming to an end. It may be true that Twitter enabled the Arab Spring, but it’s certainly true that Twitter (and many other online forums) enabled #gamergate and the victimization of women online. It’s true that marketplaces like Etsy give small local businesses international reach, but poorly managed corporate security does the same for identity thieves, and has the potential to set off very non-virtual wars. We certainly like sharing our photos on Flickr and Facebook, but our connectedness has enabled a surveillance state much more thorough than anything the KGB or Stasi could have imagined during the Cold War. We’re through with the naïveté of the modern, and not a moment too soon.

If we’re indeed entering the disenchanted age of postmodern computing, what does that future look like? I don’t really know, but I have a couple of thoughts.

We’re looking at the return of peer-to-peer computing, but with a new twist: this time, P2P isn’t about trust; it’s about distrust. Bitcoin has had its time in the spotlight, both in its rise and its fall; what’s important about Bitcoin, though, isn’t its ability to replace other currencies, but the blockchain. The blockchain is a technology for verifying transactions in the absence of trust. I’d even go further: it’s a verification technology that works even if everyone participating is someone you distrust. Why should I trust a Bitcoin miner, whose primary incentive is to get rich quick? No reason at all; the genius of the blockchain is that you don’t have to. Like Fred Wilson, I expect to see many new applications based on the blockchain’s ability to verify data in the absence of trust. We already have Ethereum, Eris Industries, and Keybase.io, to name a few.
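The core idea is simple enough to sketch. Here’s a toy hash chain in Python; it illustrates the principle, not Bitcoin’s actual data structures (no proof of work, no consensus protocol). Each block commits to the hash of its predecessor, so anyone can detect tampering without trusting whoever handed them the chain:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def verify_chain(chain):
    # Every block must commit to its predecessor's hash; changing any
    # block changes its hash and breaks every link after it.
    return all(
        block["prev"] == block_hash(prev)
        for prev, block in zip(chain, chain[1:])
    )

# A toy chain: each block records some data plus its predecessor's hash.
genesis = {"prev": None, "data": "genesis"}
b1 = {"prev": block_hash(genesis), "data": "alice pays bob 5"}
b2 = {"prev": block_hash(b1), "data": "bob pays carol 2"}

print(verify_chain([genesis, b1, b2]))  # True
b1["data"] = "alice pays mallory 500"   # rewrite history...
print(verify_chain([genesis, b1, b2]))  # False: the tampering is self-evident
```

The verifier doesn’t care who built the chain or who stored it; the structure itself is the evidence.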

It’s hard to think about the past decade’s news without thinking of Edward Snowden and the extent to which governments (and not just the US government) have built mass surveillance systems. But what’s remarkable is that tools like Tor provide a way for citizens to take their privacy into their own hands. Historically, Tor was funded by DARPA as a tool for protecting US intelligence communications. But it wasn’t just another cryptographic system: the point was to mix intelligence traffic indiscriminately with other traffic, so that an observer could never tell what was what. In a point-to-point system, cryptography might prevent you from deciphering what the participants are saying, but you can see the endpoints and you can make some guesses. When everyone’s traffic is mixed together, with many concentric layers of cryptography, you can’t tell whether the traffic is from a good spy or a bad spy or a dog. While lawmakers and bureaucrats might think otherwise, preventing regular citizens, or even criminals, from taking advantage of Tor would destroy its value for the intelligence community: it depends precisely on mixing many streams of unrelated traffic. If a system is only used by spooks, you know that anyone entering or exiting it is a spook. Like the blockchain, Tor is a system designed to work in the absence of trust. At this point, Tor is difficult to use, and very easy to misconfigure, but I expect the user interface problems to be solved, if not by Tor, then by some successor.
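Those “concentric layers” are what gives onion routing its name, and the mechanics are easy to sketch. Here’s a minimal Python illustration using the third-party cryptography package; the shared symmetric Fernet keys stand in for the ephemeral per-hop keys that Tor actually negotiates with public-key cryptography:

```python
from cryptography.fernet import Fernet

# Three relays, each holding its own key. (A sketch only: real Tor
# negotiates per-hop keys rather than sharing them up front.)
relay_keys = [Fernet.generate_key() for _ in range(3)]

def wrap(message: bytes, keys) -> bytes:
    # Encrypt for the exit relay first, then wrap outward, so the
    # entry relay's layer ends up outermost.
    for key in reversed(keys):
        message = Fernet(key).encrypt(message)
    return message

def peel(message: bytes, key: bytes) -> bytes:
    # Each relay removes exactly one layer. It never sees the plaintext
    # or the full path, only the next hop's still-encrypted blob.
    return Fernet(key).decrypt(message)

onion = wrap(b"meet at midnight", relay_keys)
for key in relay_keys:   # entry -> middle -> exit
    onion = peel(onion, key)
print(onion)             # b'meet at midnight'
```

No single relay can connect the sender to the message, which is exactly the property that makes mixing everyone’s traffic together valuable.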

It’s hard to think about postmodern computing without also thinking about the increasingly complex and non-deterministic systems that we’re building. We used to say that a computer only did what you told it to do, and exactly what you told it to do. While that’s still true, to an extent, we’re now building systems that are massively distributed, that run on hardware we don’t control and, in many cases, can’t even locate. Our older model of computing (you tell the computer what to do, and if there’s a bug, it’s your fault) now strikes us as naïve, and possibly the last gasp of futuristic optimism. In the future, we will be increasingly reliant on systems that we can’t necessarily trust to do our bidding, and that fail in non-deterministic ways.
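That distrust shows up concretely in how we write code against remote systems: we assume failure rather than correctness. Here’s a minimal sketch of that defensive posture; remote_call is a hypothetical stand-in for any call to a machine you don’t control:

```python
import random
import time

def call_with_retries(remote_call, attempts=3, base_delay=0.5):
    # Assume the remote end can fail arbitrarily: timeouts, partial
    # outages, transient errors. Retry with exponential backoff.
    for attempt in range(attempts):
        try:
            return remote_call()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of patience; surface the failure
            # Backoff with jitter: we can't predict when (or whether)
            # the remote end will recover, so we guess and spread out.
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))
```

Boilerplate like this (and its grown-up siblings: circuit breakers, idempotent retries, quorum reads) is what programming without trust looks like in practice.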

Whatever the future holds, it will be impossible to hold back. And it won’t be the modernist fantasy of the Jetsons. It might be unfortunate that the dream of progress turned out to be a fantasy, but we’re better off building for a harsh reality than pretending that our fantasies will save us. Postmodern computing is about creating the tools for that disenchanted reality.

Post topics: Emerging Tech
Post tags: Commentary