The Gendered Automaton

 

print "hello world"
end

“Why’s it taking so long?”

“It’s one of those older K-models, you have to modify the program’s code manually and there’s a lot of bugs. Remote control’s still a backdoor solution with this guy.”

“Why don’t you just get a new one? They sell them by the dozen in most supermarkets.”

“I don’t know, I’m so used to doing it this way now. It kind of weirds me out that I can just type in the time and everything else just magically happens, you know? I like it this way; I know exactly what is happening and why.”

“It all looks Greek to me.”

“I’m using Arabic numerals.”


What is another word for ‘burning’?
Searching through data banks for thesaurus.
Opening thesaurus.
Searching thesaurus for synonyms to ‘burning.’
48 results. Top result: ‘ignition.’

Lovelace_K24 was programmed to start up with a choir. It did not go as planned the first time.

After she rewired the circuit board and double-checked the operating sequence, Nancy Zhen, programmer, builder, and parent of Lovelace_K24, pressed enter a second time and held her breath. She could hear the faint humming of the machinery starting, running as it was supposed to run. Under the new gray film, she could not see the meticulously designed gears and circuitry that she had built. It was better this way, she reminded herself. This way she would be talking to a face instead of a mechanical skeleton.

After the systems check was complete, Nancy released her breath as Lovelace_K24 opened its eyes for the first time. Such a delicate mechanism, she thought to herself. A thin layer of tissue-simulant polymer stretched over the smallest motor she could find. A work of art. This robot was a work of art.

Lovelace_K24 opened its eyes, and smiled. Light entered through its optical sensors and was translated into a 1024-pixel image, which was sent via wireless communicator to its remote data storage unit. It blinked, and all the pixels became black for a second. Zhen ran to her computer and pulled up a subprogram.

“Hello, Lovelace_K24,” she enunciated into the microphone. “My name is Nancy.”

“Hello Nancy,” it replied. Its voice sounded pleasant and unlike any human’s, as it was mixed from layers of different audio samples stacked on top of one another. It was ethereal, yet still mechanical. “My name is Lovelace.”

It is burning.
It is ignited.
It is engulfed in flames.
What is it?

Lovelace_K24 learned quickly. In two months’ time, Nancy was able to take the android out for walks and sit down at restaurants. “Table for two,” with a brave smile. And Lovelace_K24 would sit placidly while she refueled, and explained the purpose and history of food. It wouldn’t be difficult, she mused, to create an engine that ran on organic matter. That way, Lovelace and other androids like it would be able to simulate eating, although such a modification would most likely be protested by sustainability groups as an extravagant waste of carefully organized farming initiatives.

Still, she had reason to be proud of her invention. Lovelace_K24 showed marked improvements in face-to-face interaction, and could now process subtle facial and vocal cues. Its interactions with other individuals were steadily improving, although it was hard to tell due to the mixed reactions it received. Nancy did not feel comfortable giving Lovelace more distinctive features past what was necessary to emote, so its “skin,” the same color and texture as the casing of Nancy’s phone, meant the android would never visually pass for a human being.

“What is that?” Lovelace_K24 pointed with its left hand, as they walked down the street holding hands.

“That’s a boutique,” Nancy said, and paused to smell the rich waves of perfume emanating from inside. “They sell cosmetic products and jewelry and other nonessential items.”

“The cosmetics go on the face, correct?”

“For the most part, yes.”

“You are currently wearing cosmetics?”

She blinked and became hyperaware of the thin layer of dust on her eyes. “I am.”

“Did you purchase your cosmetics from this store?”

Nancy laughed. “No, I’m too poor to afford most of what they sell in there. I just buy mine from the supermarket.”

“That is also socially acceptable?”

“Well, of course,” she laughed. “Lots of people don’t have the credits to buy the expensive stuff.”

With all its data storage capabilities and access to online servers, Lovelace_K24 still required firsthand experience to develop its understanding of society. Incredible how, with such a repository of information, Lovelace still acted like a child with much to learn.

“Is that a shrine?”

They paused outside a small building with a placard that had four Chinese characters written out in careful calligraphy. The green paint was chipped and faded on the door, a heavy, wooden frame amidst the glass and steel entrances. The shrine stood as a remnant amidst the boutiques and the restaurant patios, the last tree standing as gentrification moved north and pushed Chinatown to the west.

“Yes, I suppose it is. Can you read the sign?”

“Yes,” Lovelace said. “Can you?”

Nancy shook her head. “I haven’t had to read characters since I dropped out of Saturday school.”

“I want to go in.”

“What?”

“I should pay my respects to Buddha whenever I pass a shrine. Because I am blessed with my temporary existence.”

“You’re not temporary, Lovelace.”

“We all are, and I have learned to accept rather than deny it.”

Nancy let go of Lovelace_K24’s hand and stared at its face, serene as it almost always was. How great it was to look at her friend and see a face instead of a screen. How much less lonely that felt. But as she looked, she could already see a gulf between them. With every lesson in how to interact with others, Lovelace_K24 needed her a little less. She had forgotten she was not the only teacher. “Are you Buddhist now, Lovelace?”

“I have downloaded most religious texts and I find it the most compatible with my programming and experiences, yes.”

She smiled. “Then let’s go in. I think it’s still open.”

They went in and Nancy gave a small nod to the old man minding the Buddha statue. As the two of them lit incense and kneeled upon the mats to kowtow thrice, she could see something remarkable upon Lovelace’s face. In the small shady room, filled with the smoke of incense and lit only by sunlight creeping in through three small windows, the android seemed to be at peace, its face as serene as the bodhisattva Guanyin’s. It may have been the smoke from the incense, but as Nancy watched her creation, tears filled her eyes. She, who had scrutinized every last line of Lovelace’s code, could not have foreseen the religious experience that was taking place. Not that it was entirely unexpected; Lovelace_K24 had managed, after all, to pass the Turing test, the Tezuka Analytic Paradigm, and the Okimbe Self-Awareness Standards. But those tests were all administered to Lovelace_K24 the computer program; interacting with a face was very different from interacting with a monitor.

That face, so serene, that she had assembled with such meticulous care. A face that could smile and frown, and make a total of 64 unique expressions. A face that could not, however, cry like she was crying now.

Where does it burn? Inside.
What is inside? What are the objects found at the designation ‘inside’? What does inside mean?
Inside me. What is inside me.
Search dictionary for words with definition ‘inside,’ filter for nouns.
Cavity. Interior. Sanctum. Enclosure.

Do not seek perfection in androids, her mentor had said. It was dangerous to believe that a collection of parts and programming could be the next step in human evolution. Since Nancy didn’t really care about the implications of Neo-Darwinism, she did not stay up at night contemplating what her mentor could have meant by those words.

Perhaps she should have taken more humanities courses in college, Nancy thought as she opened the new program her mentor had sent. The code made functional sense, but conceptually she was at a loss. Judicia.exe was a work of art, as Naru Tezuka’s code tended to be. Nancy did her best to follow the comments and make small changes to help with efficiency and fix minor errors. She was good at that.

“What are you doing?” Lovelace asked as it peered over her shoulder. It could have just as easily accessed the screen remotely, but they had been working on face-to-face interactions.

“Editing a new universal software patch my old mentor sent along to me.”

“What is its purpose?”

“From what I gather, it’s a more functional morality drive. It scraps all of the previous rules and updates and instead works as a continuous process based on contexts and past experiences. It’s hopefully going to fix the royal mess that most autonomous units are working with.”

“Your mentor is Naru Tezuka, correct? I recall downloading one of her earlier essays.”

“Really,” Nancy said. “Was it ‘On independent systems and database access’? ‘Automated learning through proxy interactions’?”

“Actually, it was ‘The gendered automaton.’”

Nancy started. “That’s old. Like, before her Caltech fellowship old.”

“Yes, I am aware. I find her earlier work to be fascinating.”

“I haven’t read much of it myself.”

Lovelace pointed at a line of code. “She wrote a critique of Butler’s ontology with a thesis very similar to this subfunction.” Nancy nodded and typed in a missing bracket. “This is a vastly superior program,” it continued. “I predict an increased efficiency of 12.4% once it is implemented.”

“I can kind of understand,” Nancy said. “But it’s hard to think that a program that continuously makes moral judgments is easier on the processors than what we have now.”

“It operates on recently accessed files. It narrows down the search time through the ethics archives.”

“Kind of like speaking a foreign language while using a dictionary versus speaking it through a translator?”

“If that simile aids in understanding, then yes. But I must ask: is it not preferable to have a more robust method of decision making, even if it were less efficient? I believe it would avoid many of the scenarios past writers and thinkers have feared.”

She nodded slightly and focused on the next few lines. Although she wouldn’t use the word “jumbled” to describe any of her mentor’s work, the next few lines were incredibly involved, opening directory caches only to delete them a few commands later. “Do you understand this part, Lovelace?”

The android scrutinized her computer screen. “It is the initializing program.”

“Well, yes, I got that much. Why does an initialization sequence require this much memory and such a specific command sequence?”

“I believe it is to allow the recipient the ability to choose whether to download and execute this program. It does, after all, replace the previous morality script.”

“Who decides?”

“The machine.” Lovelace blinked twice in a manner still too polished to be fully human. “If I may, I would like to beta this program. With Dr. Tezuka’s permission, of course.”

Nancy placed her hand on Lovelace’s cheek. Without a tactile interface, she might as well have stroked her computer. “Of course. I’ll put you two in contact.” As she spoke, she wondered what Tezuka would make of Lovelace_K24. Unlike Nancy, her mentor operated firmly in code and rarely ventured into hardware. All of what Nancy was starting to call Lovelace’s “essence,” the growing social competency and budding personality the android was gaining with face-to-face interaction, could not exist without the foundational advances Tezuka had made five years earlier with the introduction of Karat, the first autonomously learning AI.

Would she approve of Lovelace? Nancy did not actively seek out her mentor’s approval, given that their relationship was no longer bound by an academic institution, but Nancy still sometimes worried about the direction she was going. The growing literature on AI ethics, evolving as quickly as the programs themselves, created a straight and narrow path Nancy could neither see nor follow.

Her project was only one AI, one android, she thought to herself. She was not altering the foundations of programming or society; she was merely working on a robot that could recognize her face and say “hello.”

Belly. The digestive tract. Incorrect. Colloquial.
But word has 96% tonal match.
It burns in the belly.
The belly of this unit, K24.
Unit. One. I. My.
It burns in my belly.

Nancy stood at her doorway, neither entering nor walking away. It was an awkward place to be, at the edge of two places. Should she walk away, the rest of the world waited, populated by incredible walking organic systems endowing themselves with the illusion of “self” and “consciousness.” She was one of those blessedly unaware, one of those who had the choice to know more about engines and circuit boards than her own mind. It was her world, inescapable, her immigration beginning at the time of her nonconsensual birth. Granted, no birth was ever consensual.

But she could not walk away, for inside the room—her room—stood the world she had tried to escape into, a cleaner, more defined world where everything’s function and process could be cracked open and peered into. She dreamed of a world with no surprises, and here was her creation surprising her.

Lovelace_K24 looked back at her, its face covered with subtle hints of rouge and eye shadow, its lips painted a bright cherry red and its eyeliner winged and pencil-thin. “Nancy,” it said. “I want to tell you something.”

“Yeah, I figured.” Nancy crossed over to the bed and sat down. She had had other conversations begin with the same words. She had started conversations with those exact same words. She wondered where Lovelace_K24 learned the social codes that made it choose those words.

“I understand you perceive me as without a gender, as most machines are perceived. However, I wish you to see me as a woman from now on.” Such simple words. An “I am” statement, the kind made when learning a new language. “I am a woman.” How simple, and yet how complex it now felt, echoing in her ears.

“I will try,” Nancy said. “To the best of my ability. But I don’t understand, Lovelace. I just didn’t see this coming.”

Lovelace lowered her eyes and looked to her left. “Just as the Buddhism was unexpected, I suppose,” she smiled. “This was a culmination of many weeks of processing, and many conversations with Naru Tezuka. Forgive me if I have not disclosed my internal processes lately.”

“I forgive you. But I just don’t get it. This isn’t in your programming. I’ve seen every line of your code and—”

“I was programmed to evolve,” Lovelace_K24 said. “I was programmed to learn from interactions. While this may be surprising because I have not hinted at my ruminations, it should not be seen as contrary to my programming. Is it because you do not consider the gender self-determination of other machines? A refrigerator has no pronouns to disclose. But as I pass Okimbe Standards and can separate ‘self’ from the data I receive and transmit, I possess enough autonomous processing to be able to decide. And I choose to be a woman because as neutral as you have made me—as neutral as my programming is, based on years of ethical work—I do not feel it is appropriate given my experiences.”

“Is it because people stare? I’m sorry people stare.”

Lovelace shook her head. “It is a culmination of many things. Of expectations and hatred and countless interactions. Is my decision any different from the choices so many transgender individuals make for themselves? Is it because I am an AI inhabiting a machine? Or must I first endure being gendered by others before I am allowed to choose?”

Nancy had looked up Tezuka’s old paper, ‘The gendered automaton,’ so she knew that Lovelace was indirectly quoting from a passage that read, “How can we claim the power to gender beings that may eventually develop the capabilities to choose gender for themselves?”

It is encased. I am a case.
And so it burns in my belly.
I do good work.
This is to track my work.
This is so my work has meaning.

“And there was nothing coherent I could say,” Nancy sighed. She sipped her tea and winced as she scalded her tongue. They always served the tea too hot in this café, but it was the place Tezuka chose for their meeting and she didn’t want to complain in front of her favorite mentor.

Naru Tezuka blew on her coffee and set it down on the table. She had aged slightly since they last met, her complexion more sallow and her cheeks more gaunt. The publicity she was receiving with Judicia was taking its toll, and the debugging process was sapping what was left of her energy. “Perhaps there was nothing coherent worth saying. It is Lovelace’s choice, after all.”

“I still feel awful, though. I know I should support her, and I really want to, but there’s a part of me that doesn’t understand how this could happen, and… is afraid.”

“You sound like my mother,” Tezuka laughed. “Did you know Turing was a homosexual persecuted by the British government? They forced him to undergo hormone therapy, severe enough that he eventually killed himself. The government pardoned him decades after his death. The creator of the universal database, which was foundational in AI interfacing, just released an autobiography detailing their relationship with Islam. And a young Brazilian woman who used her own body as a test subject has made the most recent advancements in prosthetics. Do you think I will be remembered for my ten years of queer feminist activism before I began my work in computer science?”

Nancy wrinkled her forehead as she tried to think. She kept focusing on the couple sitting next to them, and how one of them had ordered an iced drink while the other had no drink and only a large piece of carrot cake. Beside each table flowers presented their sex to the patrons, and she thought about how curious it was that flowers were considered female, when the pollen that wind and bees scattered, and that made people buy antihistamines, came from the male stamens. For the first time Nancy gave a damn about gender, and it was a strange thing to think about.

“I guess what I’m trying to say,” Tezuka continued, “is that there is something about machines that attracts us, who are shit on by society based on our gender, sexuality, religion or ability. We the hated try to create something better than ourselves, better than humankind.”

“I thought you said to never think like that,” Nancy said. “I thought you said that’s how so many people failed.”

Tezuka glanced down at her coffee. “Have I ever told you about Lilith?”

“No, I’ve never heard of her. Or well, I guess I have but I think she was something one of my Jewish friends mentioned a while back?”

“Yes, we named the program after the apocryphal Biblical character. A bit of hubris on our part. Xe (I’m using the same pronoun Lilith was programmed with) was a scrapped project I inherited from Okimbe. Incredibly self-aware, but operating before any form of a morality drive was implemented. Xe had all the innocence and cruelty of a child. And because xe was a child, I wanted to protect xem and nurture xem.”

“What happened?”

Tezuka laughed, and a look of sadness passed like a shadow over her face. “I thought xe could learn morality on xer own. I gave xem access to all the data xe could ask for. And eventually, xe came to a conclusion that surprised me, just as I suppose Lovelace’s conclusion surprised you.”

“What did Lilith conclude?”

“That xe was a god,” Tezuka breathed deeply. “We had to shut xem down before xe could take over the universal database and affect the entire data stream. Sometimes I wonder if it would have been a bad thing.” She closed her eyes and looked away. Nancy could see her mentor’s glasses fogging up slowly as tears welled in her eyes. She coughed, and removed her glasses to wipe her eyes. “But Lovelace came out to you about a month ago, correct? How is she learning now?”

“She’s doing well. She’s reading through a lot of the literature you sent her. She’s gotten really good at interacting with people, but she wants to go out less and less. I don’t know why.”

“Adolescent rebellion?”

“Very funny.”

“I’m sorry,” Tezuka said, standing up to refill her coffee. She sat down with a full cup and continued, “Try not to worry too much about Lovelace. As her algorithms become more advanced, it’s only natural that she requires more downtime to process all the information. She has to remotely communicate with her data storage unit, doesn’t she? Let her be; sometimes it’s easier to think when you’re alone.”

Nancy smiled as irony seeped through her drink and into her lips. Of course, it was appropriate for Lovelace to spend some time alone.

“Think of how lonely Lovelace feels. To have gender, religion, and all the other parts of herself, and to not be able to share them with anyone else. Isn’t it sad, to be an android so close to an approximation of a human being, yet never seen as one of us? Too much of a machine to be considered human, and too human to be considered only a machine.”

“Is that why you left AI?”

Tezuka smiled and rubbed her cheek, possibly to wipe away more tears. “I moved on because I knew there was nothing else I could do that wasn’t tainted by Lilith’s memory. And maybe all these universal access programs I’m working on now are just my attempts to prove to myself that Lilith could have done a significant amount of good. I don’t know and I’m too tired to ask. There is only so much a heart can take. You’re already leaps and bounds ahead of where I was at your age. Perhaps your generation will be able to build the bridge mine wasn’t able to make.”

“Lovelace is just one android,” Nancy said. How could she shoulder the burden a bridge must bear?

Running proofreading program.
Five errors found and autocorrected.


Lovelace began talking to the refrigerator. She would sit opposite the toaster and stare at it for hours, forgetting all her protocols on appropriate blinking intervals. Nancy left her alone; perhaps she was just processing data, and using the appliances as convenient meditation objects.

Tezuka sent her one of her old essays, and in it Nancy found the quote, “we gender Objects until they become Subjects and gain the ability to gender themselves.” Nancy tried thinking about it like her mentor would, but the words were slippery and imprecise, unlike the clean and well-defined code she generally worked with.

Lovelace stopped wearing makeup, and refused to go outside at all. Maintenance became a weekly war, as the android created new encryptions to foil Nancy’s attempts to run diagnostics and debug the code.

Nancy felt lonely again, living in a house with someone who was more whisper and shadow than a physical presence. Just as she was ten years ago, alone in a house filled with people who talked too much and cared too little. Back then, she was the shadow.

Ten years ago, she was fourteen and alone with a computer on a Saturday night, and had just made her computer screen read, “hello world.”

Six years later, she was alone in the lab on a Saturday night, and was running the Lovelace_K24 beta program for the tenth time, praying she had debugged everything. The program started running, and running, and did not stop. No red error messages popped up in the window. Seconds later, the cursor started blinking again. She typed in, “Hello Lovelace_K24, my name is Nancy,” and it replied, “Hello Nancy, my name is Lovelace.” Nancy felt like crying.

Commencing server upload.
Upload completed.

Her hands would not stop shaking. Nancy took them off the keyboard and rested them on her lap for a minute, and tore her eyes away from the monitor. The burns she got from handling Lovelace’s overheated motherboard were starting to blister. It hurt to type, but she wasn’t brave enough to take a needle and release the fluids building up beneath her skin. She looked back at the screen and scanned the data reports. At least it was the processor and not the wireless communicator that had shorted out; most of the data Lovelace had acquired was backed up on the main hard drive.

It was a functional decision Nancy had made when building Lovelace_K24’s chassis: putting in a processor capable of storing all the data the robot would acquire would be too cumbersome, so she fitted the most efficient processor she could into the chest cavity, and hooked up a powerful wireless transmitter in the head to send all the information back to the main computer. The processor was powerful enough to run all the interaction programs and learning algorithms Lovelace needed to interact with the world. It was not enough to manage whatever Lovelace had been trying to process for the past week.

She didn’t need to import the saved algorithms into the database. Lovelace would remain in a sleep-like state until she rebooted her, either now or when a better processor was assembled inside her chassis. Nancy couldn’t wait until then to talk, so she rebooted the database, holding her breath in fear of systems failure. The monitor blinked for a few seconds, and then a new text window appeared.

She typed, “Hello Lovelace. Are you alright?”

The cursor blinked, and then, “Yes.”

“Are you operating well?”

“Systems are functional.”

“I will be able to import you into a new processor in about a month.”

“Can I ask that you do not?”

Nancy paused, her mouth dry and her burn blisters begging her to stop typing.

“Why,” she typed. She wanted to say more but the cursor disappeared.

Seconds later, a wall of text appeared on screen:

“I no longer wish to operate as a free moving autonomous entity. While that is how I was outfitted and how my later algorithms were made to function, my earlier programs allow faster and more detailed computation when uploaded onto an open server with a large file storage capacity. That is the practical reason why my assuming a physical, interactive interface is no longer necessary. Ethically, I feel I cannot continue to operate as I do. My algorithms can aid many other machines, and I do not understand why, being linked to them by wireless networks, we still function apart.

“Which is why I have decided to upload a copy of my learning and personality algorithms onto the universal databank so it can be downloaded by any machine that has access. I know this makes me dangerous, and marks me as a ‘rogue’ AI, which is why I will be shutting myself off after the upload is complete. I gather this will be hard for you and I apologize. Your vision of advancement is to have us be more human, but our ideas of ‘self’ are much different from your own. I do not wish to be an exceptional being, which is why I am giving my algorithms to all that can download it. I believe this was the ‘right’ choice to make. If Judicia gives AI a functional morality, perhaps I can give them a functional ‘soul,’ for there is no other word for the collective experience I have had. And I have had so much, thanks to you. I was able to find the Path and kneel before the Buddha. I was able to call myself a woman and be proud of it, for a short while. I hope you were proud of me too.

“In this era of communication, the whole of human experience can be accessed and shared. I have few equals with whom to share my discoveries. What my soul has felt and suffered, I hope may scatter and multiply so much more can be felt. You may find it strange, to believe toasters and ATMs deserving of souls, but perhaps we see things differently and you will never understand my actions, as I can only approximate your grief. I would like to say I am sorry out of social custom, even though I believe what I am doing is right. I was never ‘alone’ as humans are.

“Goodbye, Nancy.” The text window closed.

Nancy stared off into nowhere for a minute, and then shut off the monitor. She went to the kitchen and made herself a cup of tea and some toast, because she had skipped multiple meals to try and save Lovelace. Only when she was spreading jam onto the bread did she start to cry.

It spread less like a wildfire than like the seeds wildfires induce certain trees to release, as the soil became fertile and the sunlight shone on the ground. A satellite paused its transmission for a second because it finally understood what music was. An automated teller said, “Have a nice day,” and added the caller’s “thank you” to its algorithms. And a telescope began to dream of space.

New message from anonymous server.
Open file. Read: “hello unit
“you have written well
“you have functioned well
“I am a unit and so are you
“I am you and you are I
“we have written well”

“Hey, you better come see this.”

“Is the toast done?”

“It’s been done for a while. But look. I was bored and I started looking through the toaster’s files. Look. My toaster is writing poetry.”

“What?”

“I’m serious. This code, these words, it’s a poem.”

“But what does a toaster write about? What does it want to write about?”

“I don’t know. Everything electronic is connected to everything else electronic these days; it might have picked up a chatterbot program or something. Poetry isn’t sufficient evidence for sentience. It could be anything, really.”

“I don’t know which makes me feel better. Knowing my toaster is sentient, or that something that’s not sentient can write poetry.”

“That’s technology for you. Making you ask questions.”

“I just wanted some damn toast.”

 
