
Watching and listening.

 

Pay no attention to Alexa, she’s an AI.

There was a flurry of reports from dozens of news sources (including CNN) last week that an Amazon Echo (Alexa) called the police during a New Mexico incident of domestic violence. The alleged call began a SWAT standoff, and the victim’s boyfriend was eventually arrested. An interesting story, but after a fact-check, that could not be what happened. Several sources, including the New York Times and WIRED, debunked the story with details on how Alexa calling 911 is technologically impossible, at least for now. And although the Bernalillo County, New Mexico, Sheriff’s Department swears to it, according to WIRED,

“Someone called the police that day. It just wasn’t Alexa.”

Even Amazon agrees, via a spokesperson’s email:

“The receiving end would also need to have an Echo device or the Alexa app connected to Wi-Fi or mobile data, and they would need to have Alexa calling/messaging set up,”1

So it didn’t happen. But most agree that while it may be technologically impossible today, it probably won’t be for very long. The provocative side of the WIRED article proposed this thought:

“The Bernalillo County incident almost certainly had nothing to do with Alexa. But it presents an opportunity to think about issues and abilities that will become real sooner than you might think.”

On the upside, some see benefits in Alexa’s ability to intervene in a domestic dispute that could turn lethal, but they fear something called “false positives.” Could an offhand comment prompt Alexa to call the police? And if it did, would you feel as though Alexa had overstepped her bounds?
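To see why false positives worry people, consider a deliberately naive sketch of a keyword trigger of the kind a voice assistant might use to decide when to call for help. Everything here, including the phrase list and function name, is invented for illustration; it is not Amazon’s implementation.

```python
# Hypothetical sketch: a naive distress-phrase trigger.
# Matching raw keywords in overheard speech, with no understanding
# of context, is exactly what produces false positives.

DISTRESS_PHRASES = {"call the police", "help me", "he has a gun"}

def should_alert(transcript: str) -> bool:
    """Return True if any distress phrase appears in the transcript."""
    text = transcript.lower()
    return any(phrase in text for phrase in DISTRESS_PHRASES)

# A genuine emergency trips the trigger...
assert should_alert("Alexa, call the police!")
# ...but so does an offhand remark about last night's TV drama.
assert should_alert("In the finale she screams 'help me' before the credits")
```

A real system would need far more context than substring matching, which is precisely the open question: how much confidence should a machine require before acting on our behalf?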

Others see the potential in suicide prevention. Alexa could calm you down or make suggestions for ways to move beyond the urge to die.

But as we contemplate opening this door, we need to acknowledge that we’re letting these devices listen to us 24/7 and giving them permission to make decisions on our behalf, whether we want them to or not. The WIRED article also included a comment from Evan Selinger of RIT (whom I’ve quoted before).

“Cyberservants will exhibit mission creep over time. They’ll take on more and more functions. And they’ll habituate us to become increasingly comfortable with always-on environments listening to our intimate spaces.”

These technologies start out warm and fuzzy (see the video below), but as they become part of our lives, they can change us, and not always for the good. This idea is something I contemplated a couple of years ago with my Ubiquitous Surveillance future. In that case, the invasion came not through a listening device but through a camera (already part of Amazon’s Echo Look). You can check that out and do your own provocation by visiting the link.

I’m glad that there are people like Susan Liautaud (whom I wrote about last week) and Evan Selinger who are thinking about the effects of technology on society, but I still fear that most of us take the stance of Dan Reidenberg, who is also quoted in the WIRED piece.

“I don’t think we can avoid this. This is where it is going to go. It is really about us adapting to that,” he says.

 

Nonsense! That’s like getting in the car with a drunk driver and then doing your best to adapt. Nobody is putting a gun to your head to get into the car. There are decisions to be made here, and they don’t have to be made after the technology has created seemingly insurmountable problems or intrusions in our lives. The companies that make them should be having these discussions now, and we should be invited to share our opinions.

What do you think?

 

  1. http://wccftech.com/alexa-echo-calling-911/

What does it mean to be human?

Earlier this week, just a couple of days after last week’s blog on robophobia, the MIT Technology Review (online) published an interview with AI futurist Martine Rothblatt. In a nutshell, Ms. Rothblatt believes that conscious machines are inevitable, that evolution is no longer a theory but reality, that treating virtual beings differently than humans is tantamount to black slavery in the 19th century, and that the FDA should monitor and approve whatever hardware or software “effectively creates human consciousness.” Her core premise is something I have covered in this blog before, and while I could spend the next few paragraphs debating some of these questionable assertions, it seems more interesting to ponder the fact that this discussion is going on at all.

I can find one point on which I agree with Rothblatt: that artificial consciousness is more or less inevitable. What the article underscores is the inevitability that “technology moves faster than politics, moves faster than policy, and often faster than ethics”1. Scarier yet is the idea that the FDA (the people who approved bovine growth hormone) would be in charge of determining the effective states of consciousness.

All of this points to the fact that technology and science are on the cusp of a few hundred potentially life-changing breakthroughs, and there are days when, aside from Martine Rothblatt, no one seems to be paying attention. We need more minds and more disciplines in the discussion now so that, as Rothblatt says, we don’t “…spend hundreds of years trying to dig ourselves out.” It’s that, or this will be just another example of the folly of our shortsightedness.

1. Wood, David. “The Naked Future — A World That Anticipates Your Every Move.” YouTube. YouTube, 15 Dec. 2013. Web. 13 Mar. 2014.


The Naked Future. Are you ready?

Ed. note: Due to problems with my ISP, The Lightstream Chronicles was posted late this morning. Perhaps the subject of a future blog rant: after hours of something loosely called “tech support,” I had to drive to the local Starbucks to upload the pages. Long live Starbucks!

p66

If you zip back to my blog about page 53, you’ll see a somewhat lengthy but not all that coherent post on the interaction between humans and synthetics. That post centers more on how synths, once they became realistically human, were quickly exploited as slaves, both menial and sexual. Though not all of the future society in The Lightstream Chronicles was to blame, as soon as there was a device that could do your bidding, there were those who abused the technology. Some will see this as pure dystopian fiction, but it is difficult to deny that the past is littered with precedents for technological misuse. And as we move toward a more ethically relativistic society, misuse will have a narrower and narrower definition. Therefore, even in a society that should be more enlightened, it is completely plausible that we could treat our synthetic co-workers with less respect than real humans. The irony in this future speculation is that, with the technological enhancement of humans, their symbiotic fusion with the technosphere, and the ever more emotional and empathic capabilities of synthetics, the line between real and synthetic humanity is almost nonexistent.

The Naked Future

Thinking about the future is more than a geeky, sci-fi pastime. I believe it is our responsibility to engage with the political, scientific, social and ethical decision-making happening around us. Because, whether we know it or not, those decisions will make a huge impact on the shape of the world we live in tomorrow. It’s just one of the reasons that I am a card-carrying member of The World Future Society. As a member, I regularly check in with wfs.org to read the latest prognostications on the future. If you look closely at the predictions or forecasts of any futurist, it’s possible to see where they are coming from as well. In other words, everyone comes at his or her vision of the future with an opinion: is this aspect of the future all positive, or is there a cautionary tone?

This is, of course, at the core of my design fiction research at Ohio State. So, as I was meandering around the wfs.org site, I stumbled upon an article by Patrick Tucker, an editor at The Futurist magazine, a publication of WFS. This happened on March 5th. Coincidentally, I saw that Patrick’s book, The Naked Future: What Happens In A World That Anticipates Your Every Move? was about to be released on March 6th. Since this topic is dead center on my radar, I clicked over to iTunes to see if it was available as an iBook, and sure enough, it was. Nevertheless, I couldn’t wait, so I Googled up a YouTube video moderated by David Wood for the London Futurists, featuring the aforementioned Tucker along with futurists David Orban, Evan Selinger, Gray Scott, and Rachel Armstrong. It was a lively (though, at times, technically challenged) Skype meet-up that touched on some timely topics.

I hope to have a full review of Tucker’s book in a future blog, but I think the meet-up touched on some of the thought-provoking ideas that I’m sure are in store for the reader. Naked is a perfect term for this idea of our lives being transparent, and the book (though I am only partially through it) documents the evolution of big data from unwieldy complexity to smartphone accessibility, and from a fearsome tool of the powerful over the weak to what is becoming an open resource. Therein lies perhaps the most interesting part. We may as well accept the fact that this is a reality, and as Tucker explains (11), the big data era has already morphed into telemetry: “Telemetry is the collection and transfer of data in real time, as though sensed.” The fact is, we leave tracks. Extrapolating this is easy: walk the same path, explore some dark corner, innocently tweet, and you are adding to your data. After a while, as much as you may wish to disbelieve it, it is easy to predict where you will go next. As computing becomes more ubiquitous, all of our surfaces become live; as everything we touch leaves some sort of metadata fingerprint, eventually our lives will be, well, naked.
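How little machinery it takes to predict where someone will go next is easy to demonstrate. The toy model below is my own invention, not anything from Tucker’s book: given a trail of past check-ins, it simply guesses the place most often visited right after the current one.

```python
# Toy illustration of how "tracks" make us predictable: a first-order
# frequency model over a trail of check-ins. All place names and the
# function name are hypothetical.

from collections import Counter, defaultdict

def next_place(trail, current):
    """Predict the place most often visited right after `current`."""
    followers = defaultdict(Counter)
    for here, there in zip(trail, trail[1:]):
        followers[here][there] += 1
    if not followers[current]:
        return None  # never seen `current`, no basis for a guess
    return followers[current].most_common(1)[0][0]

week = ["home", "coffee", "office", "gym", "home",
        "coffee", "office", "home", "coffee", "office"]
# After "coffee" this person has always gone to the office.
assert next_place(week, "coffee") == "office"
```

Even this crude frequency count nails a routine after a week of data; the telemetry Tucker describes works with vastly richer trails than ten check-ins.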

How will we deal with that? Some say to relax, that we’ll adapt to that change just like we have to every other change. I have some ideas on that, but I will save them for the next blog. Cheers.

Tucker, Patrick. The Naked Future: What Happens In A World That Anticipates Your Every Move? New York: Penguin, 2014.