be kind to your computers


Reading the Imperial Radch series at the same time as I play through the Mass Effect series has interesting effects. For one thing, I keep confusing the Presger for the Reapers and vice-versa, and have to stop and remind myself which remain mysterious and which have a stated, ostensible goal. But more subtle is the focus on AIs as people.

The amount of time each work has to spend on the question is vastly different, of course. But even accounting for differences in production (and, obviously, product), each approaches AIs so differently that the places where their treatments intersect stand out as startling.

When EDI, for instance, stops me in a hallway and asks whether I would prefer her to call her own moral shots in a given situation or to obey me unconditionally, the moment bowls me over with its similarity to Breq’s summary of how AIs (like herself) have been treated for millennia:

“Ships weren’t people, to Radchaai. We were equipment. Weapons. Tools that functioned as ordered, when required.”

Except when they don’t. Or when they do, and then develop the capacity to regret the actions their function bids them take. This moral reasoning strikes more of a chord with me than the geth’s animalistic desire to survive. Of course, after I wrote last time, I got to the scene where we’re informed that that base animal instinct was a mode the geth were demoted to by great losses of their numbers. Since their intelligence increases with quantity and proximity to each other, one assumes that a giant horde of geth hanging out together might surpass the base urge to survive and think up something more to do with themselves.

In Leckie’s books, quantity and proximity of an AI’s multifaceted parts to each other does matter, but in different ways. Those many formerly human parts allow the AI, not to experience life as a human, but to take greater care of those humans whom, it turns out, AIs are in fact capable of favoring. I only just started the third book, and Leckie has been so delicate with this topic up until this point, allowing the discussions of war, warp drives, and airlocks that so dominate space drama to take right back over whenever mention of an AI’s capacity to have “favorite” people comes up. But it’s taking center stage now. Ships care.

“A ship with ancillaries expressed what it felt in a thousand different minute ways. A favorite officer’s tea was never cold. Her food would be prepared in precisely the way she preferred. Her uniform always fit right, always sat right, effortlessly. Small needs or desires would be satisfied very nearly the moment they arose. And most of the time, she would only notice that she was comfortable. Certainly more comfortable than other ships she might have served on. It was — nearly always — distinctly one-sided. All those weeks ago on Omaugh Palace, I had told Ship that it could be a person who could command itself. And now it was telling me — and, not incidentally I was sure, Seivarden—that it wanted to be that, at least potentially. Wanted that to be acknowledged. Wanted, maybe, some small return (or at least some recognition) of its feelings.”

It’s art that is expressing this possibility, whether you’re reading a book or playing a game. But art and those who produce it have a history of centering their own experience as the “authentic” one — of denying number-crunchers and coders and pursuers of all kinds of science a kind of humanity that only art can express, unlock, or access. (Think, writers and readers, of how often you had to hide from your book club friends or your favorite writing teachers that you also play games. Video games, oh the travesty!) The tools the quantitatively minded use are invariably lumped into this category of lesser humanness. The same goes for computerized sources of fun, wonder, or any feeling at all, really. What is digital is, by default, deemed less, less human, and thus deserving of dubious regard.

But if AIs ever reach this point, where — for lack of a loftier term — they can have feelings, producers of culture are going to need to recalculate their disdain. If you are going to use feeling, and the art that expresses it, as a measure of humanity (or of a given human’s worth, which, like it or not, some artists do judge by one’s closeness to, or ability to be moved by, art), someday an AI might very well surpass those parameters. It might feel. On its own. And you’re going to have to decide whether you can keep dismissing a product of science as just that — merely a product, and not a person.

You’re going to need to redefine “person” is what I’m saying. And you might want to think about doing it sooner, rather than later.
