Heady stuff.

Last week we talked about how some researchers and scientists on the cutting edge are devising guidelines to try to ensure that potentially transformative technologies (like AI) remain safe and beneficial rather than becoming a threat to humanity. Then there are industries (like nanotech) that have already blown past any attempt at meaningful review and now exist in thousands of consumer products; nobody knows whether they’re safe, and the companies that produce them don’t even have to tell us they’re part of the composition.

This week I’m going to talk about why I look askance at transformative technologies. Maybe it is because I am a writer at heart. Fiction, specifically science fiction, has captured my attention since childhood. It is my genre of choice. Now that nearly all of the science-based science fiction is no longer fiction, our tendency is to think that the only thing left to do is react or adapt. I can understand this: you can’t isolate a single technology as a thing, identify precisely where it started, or trace how it morphed into what it is. Technologies converge, they become systems, and systems are dauntingly complex. As humans, we create things that become systems. Even in non-digital times, the railroad ushered in a system so vastly complex that we had to invent other things just to deal with it, like standardized time. What good was a train if it wasn’t on time? And what good was your time if it wasn’t the same as my time?

Fast forward. Does the clock have any behavioral effect on your life?

My oft-quoted scholars at ASU, Allenby and Sarewitz, see things like trains as level one technologies. They spawn systems in the level two realm that are often far more intricate than figuring out how to get this train contraption to run on rails across the United States.

So the nature of convergence and the resulting complexity of systems is one reason for my wariness of transformative tech. Especially now that we are building things whose workings we don’t understand. We are inventing things that don’t need us to teach them, which means we can’t be sure what they are learning or how. If we can barely understand the complexity of the system that has grown up around the airline industry (which we at one time inherently grasped), how are we going to understand the systems that spring up around inventions where, at the core, we know what they do but not how they do it?

The second reason is human nature. Your basic web dictionary defines the sociology of human nature as: “[…]the character of human conduct, generally regarded as produced by living in primary groups.” Appreciating things like love and compassion, music and art, consciousness, thought, language, and memory is characteristic of human nature. So are evil and vice, violence and hatred, greed and the quest for power. The latter have a tendency to undermine our inventions for good. Sometimes they are our downfall.

With history as our teacher, going blindly forward while paying little attention to reason one (the complexity of systems), reason two (the potential for bad actors), or both does not bode well.

I’ve been rambling a bit, so I have to wrap this up. I’ve taken the long way around to say that if you are among those who look at all this tech and the unimaginable scope of the systems we have created and conclude that the only thing left to do is react or adapt, that is not the case.

I may see the dark cloud behind every silver lining, but that enables me to bring an umbrella on occasion.

Paying attention to the seemingly benign and insisting on a meaningful review of that which we don’t fully understand is the first step. It may seem as though it will be easier to adapt, but I don’t think so.

I guess that’s the reason behind this blog, behind my graphic novel, and behind my ongoing research and activism through design fiction. If you’re not paying attention, I’ll remind you.
