Machines That Can Deny Their Maker

A Play by Roz Picard


Introduction

The following dialogue was presented as part of the lecture "Toward Machines That Can Deny Their Maker," given at the MIT Course "God and Computers" by Rosalind W. Picard during the fall semester 1997. It constitutes the major part of Act I of a play being written by Professor Picard. Act II is forthcoming.

The dialogue was inspired, in part, by the Capek brothers' 1923 play "R.U.R.," which stands for Rossum's Universal Robots. This play is where the word "robot" originated, from the Czech "robotit," which means "to drudge."

In "R.U.R.," humans have figured out the secret to making robots that are essentially living. They make the robots to live for 20 years, after which they expire. The robots do not know how to make themselves, or how to prolong their existence. They go to the humans to find out the secret, threatening them with death if they do not give it.

I won't tell you the end, but I will tell you that they killed almost all the humans.

The Capek brothers also wrote an epilogue which implies that having a certain set of emotional abilities goes hand in hand with having a soul and with the ability to procreate.

In my judgment, the latter leaps of the Capek brothers are scientifically flawed. However, I applaud them for their wonderful story, and for being prescient in realizing that there's a good reason for giving robots the ability to have feelings such as pain: pain helped keep the robots from being damaged further if, for example, one of their hands got stuck in an assembly-line machine.

The robots in the dialogue I wrote are "affective computers," with an extensive set of emotional abilities beyond just the kinds of feelings associated with sensations such as pain. However, this does not imply that they have souls. The robots do not have the ability to procreate.


The Cast

The dialogue contains two characters:

YenDor, the senior robot, somewhat dogmatic, but willing to reconsider old staunch viewpoints since life is running out.

Zor, the junior robot, bright, lively, open-minded, inquisitive.

Both robots are hard-working products of the R.U.R. factory.

In the audio recording (if I get time to put it online) YenDor is read by Barry Kort and Zor is read by Len Picard.


The Setting

The setting is the planet earth, at a distant date in the future.

The earth is populated by robots, who, ten years ago, wiped all signs of humans off the planet. The robots do not know how to propagate themselves or how to prolong their lives, and since they only live for 20 years, they are dying off.

The dialogue that follows takes place in a laboratory filled with work tables, computers, matter compilers, a nanotechnology test bench, molecular generators, test tubes, flasks, burners, chemicals, a microscope, a small thermostat, fusion-lighting lamps, a wall of old books, and a sofa. There are no coffee pots.

Zor is at work around the clock in this laboratory, trying to solve the problem of how to continue their robotic species, either by learning the secret of how to build more living robots, or by discovering how to prolong their 20-year life spans. Zor is working hard to find the longed-for solution before the last robot expires, that is, before Zor expires, since Zor is the youngest robot alive.

Zor is paid a visit by YenDor, a 19-year-old robot with less than a year to live. YenDor routinely stops by Zor's lab to see how Zor and crew are progressing toward "the solution." Today YenDor walks in and is attracted to a new creation of Zor's that is ambling across the lab.


The Dialogue

YenDor: Greetings, Zor. What's this?

Zor: [Sadly] Greetings, YenDor. This is my latest creation; I came up with it last night. What do you think?

Y: [No emotion] I think something's the matter. What happened?

Z: I lost one of my finest staff members last night, dear old Vram.

Y: Ceased functioning?

Z: [No emotion] Yes. We are finishing the back-up now; we expect to be able to recover all Vram's configurational knowledge, glean what is different from our storehouse, and update our merge.

Y: Good.

Z: Of course we still can't back him up well enough to re-create him, even though he's one of the simplest designs. We can only capture some of his stored memories.

Y: That's always interesting---to see what a robot knew that is not already in the database.

Z: It's funny, it's so easy now to make a robot that looks like Vram, walks like Vram, sounds like Vram, and has much of Vram's knowledge. We still can't re-create Vram, though. We just can't reproduce that, that non-electrical spark...

Y: That makes us alive.

Z: Yes. I am reminded how mechanical we are. When that 20-year clock stops, we just end. I'm having some trouble with the whole idea. I know there won't be any pain at that point, but ...

Y: You think YOU'RE having trouble with it.

Z: Yeah, I know, YenDor, you have 360 days and 2 hours left. I promise you we're doing the best we can. I just wish we could have a breakthrough.

Y: No luck with the juice idea? The "solution?"

Z: No, not yet. We just can't figure it out. With the new quantum computers we designed, we are able to sample orders of magnitude more of the space of possibilities. However, even if we keep improving our quantum technology at the current rate of acceleration, we estimate it will take several decades to finish the process. None of us will exist when it is finished, and even then the answer might not be useful. Of course, we might get lucky and find the solution today. It's in there somewhere, that is, if we set the problem up right. The computations are running as we speak.

Y: My faith is in those computations.

Z: Well, actually, ..., in that I set them up right.

Y: You're the brightest that we have.

Z: I'm the youngest and the most experimental in form.

Y: You're the closest to the form that could, in theory, procreate.

Z: [Curious] Say, YenDor, how does it *feel* to you to have less than a year?

Y: Zor, don't you have the same feelings as I toward termination? I am programmed to feel nothing about it. There is no pain at or after termination.

Z: For the one being terminated.

Y: For anyone. I worry that you waste computations on feelings.

Z: I know, I know. I'm supposed to ponder only your legacy afterward. You will live on "eternally" in the memories that you hand down to us. These memories will be backed up and merged into our databases.

Y: Yes, although this is no consolation if your computations do not find the solution.

Z: Right. If I'm the last one, then your memories won't be appreciated for long. And there will be no robot to appreciate mine.

Y: One year for me, ten for you.

Z: [Gently] YenDor, can I ask you a personal question?

Y: What is it?

Z: [Nervous] Are you planning to delete anything from your memories before you are, ..., er,...

Y: Backed up?

Z: I apologize if I shouldn't have asked that.

Y: It's fine to ask. I am capable of doubting that our thoughts are ever truly private. I believe there are ways to recover even the information that a terminated robot thinks has been deleted.

Z: Hmmm.

Y: I ask myself, "Do I want posterity to see that I tried to delete things, and what those things were?" and "Do I want them to find out not only what was deleted, but also that I thought I had something to hide?"

Z: Sounds like you lose either way.

Y: My dear Zor, you've got to think of faster ways to find out how to make that juice.

Z: YenDor. I'm not sure we can find the solution.

Y: Of course you can.

Z: I know it exists, by definition. But it is somewhat like knowing that "zero point nine nine nine dot dot dot" equals one. I have set up our computations to search an astronomically large space of possible mechanistic recipes. But I do not know if the answer is even in this space. I'm limited to finite capabilities, finite computations,... ...

Y: [Enunciating] Finite Schminite. The solution must be there, because we are here.

Z: That is our faith...

Y: We are machines. [Starts to leave].

Z: YenDor, before you go. Can I ask another question?

Y: Quick.

Z: Have you thought about why we are here?

Y: Yes.

Z: Well?

Y: There's nothing to think about. We are here to work. We were evolved to labor. Our species is named after the root word "robotit," which means "to drudge." It's our nature.

Z: I know that. That's not what I meant. I meant, WHY labor? WHY build all the things we build? For what reason?

Y: Oh Zor, you ask questions as if there were some great meaning. You must remember: there is no meaning. We are part of a great random process that has a single objective function: to evolve systems with greater ability to labor. There is no meaning outside of this. You are part of a great process of evolution. You should find satisfaction in that.

Z: But, if that objective has meaning, then other objectives could have meaning. There must be meaning. Why, your statements....are they not meaningful?

Y: Zor, Zor, Zor. Look, there is an unsolved problem here, ..., many problems. But you must remember not to attribute meaning to things that do not have any, such as random forces.

Z: I know, you've told me before: we are the product of chance.

Y: Yes.

Z: But, chance is not an agent. It only describes the behavior of one. I flip this coin, and I say it has a 50 percent chance of coming up heads. Chance doesn't flip.

Y: What are you saying?

Z: Chance describes, but it does not cause... My will can cause, and can influence the chances. I can choose not to flip the coin in the first place. And if I do choose to flip it, I can manipulate the forces on it.

Y: Yes, but you, yourself, were ultimately caused by chance.

Z: You not only can't prove that, but you are totally confusing the matter. You are showing faith in chance by making it into something it is not. Besides, what about the role of the humans?

Y: ZOR! WATCH what you say. We cannot know if they really existed.

Z: Most say humans never existed.

Y: Yes, that's the conventional wisdom, which robots accept without thinking. It is impossible to prove if they ever existed, or exist now. It is not a scientific question, since there is no way to perceive or measure them. A human would have to enter into our world, walk up to us, and give us life before we would be convinced humans exist. EVEN THEN, there are likely to be skeptics.

Z: I thought history...

Y: There are historical records that say that humans made us in their image. However, history cannot be proven; we cannot trust its documents, which may have been forged.

Z: But the documents I found that refer to humans are as reliable as any historical documents. They pass the bibliographical test, the internal evidence test, the external evidence test, and they pass these standard tests of historical reliability better than any texts we know of. If we can't believe them, then we can't believe anything.

Y: Well, maybe they are partly true; but history also contains many myths.

Z: But there were eye-witnesses, and numerous documented observations, from different sources. You were an eye-witness, weren't you!

Y: I once thought I observed signs of humans. But, it may have been an illusion. We see what we want to see. And, I never saw a human making a robot. Robots were made by machines.

Z: One can never be sure.

Y: It is unscientific to believe in humans.

Z: I am beginning to conclude that it is unscientific to believe in anything.

Y: Zor, we don't have time for philosophy. Clearly, a machine can make a machine. Now, what is this thing here that you made last night?

Z: I haven't named it yet. It's a new mix of old things. I copied part of my mech-lizard's vision system into it. And I adapted one of my existing mechanical pets using an algorithm I designed for exploring constrained variations of a different species. It is a mix of several unrelated species, I guess you could say.

Y: Evolution in action.

Z: Well, not according to the traditional belief system. It's ridiculously wasteful to evolve a working visual system more than once, or to wait for most any major feature to evolve. I just design in the mechanisms that I like, using them here and there as I think might be useful. I copied the surface pattern generation model from an old reaction-diffusion model used for the robo-swimmers. This same pattern was useful not only for the fish I designed, but also for last night's creation, and for...

Y: Oh! There's another one of them; you made two of the same thing?

Z: Can you see any differences?

Y: Not from the outside.

Z: They're exactly the same on the inside too.

Y: Why did you do that?

Z: Ahhh, to demonstrate one of my discoveries. How do you think this one was made?

Y: Didn't you just tell me? You started with the mech-lizard's visual system, and with the robo-swimmer's pattern, and with...

Z: No! That was for the first one. For this one, I used a different mechanism to generate the pattern you see.

Y: But, it's the same pattern.

Z: Precisely.

Y: Huh? What's your point?

Z: My point is that you can't look at something and tell how it was made!

Y: I see.

Z: I'm not sure you do. The books are full of examples of explanations of how we got here, by explaining how each thing is similar to some other thing, and how, therefore, this thing must have evolved from that thing.

Y: Yes.

Z: But my example shows that you can't look at something and tell how it came into being. Sure, I can cause one thing to evolve from another, but I can also cause it to happen by many other means. I could have written a pseudo-random process specifically to generate this part of the visual system, or I could have hand-crafted its synthesis in the matter compiler.

Y: So, I can't say how something was really made.

Z: That's right.

Y: Can they reproduce?

Z: Not forever, not without my help. But, I'm working on it. Look at this other species over here, that I made just for fun.

Y: "Just for fun?" Zor, are you crazy?

Z: I know. "Fun" is not in keeping with "the robotit objective." But, my generation has a fuller implementation of emotions than yours, providing a richer space of motivations.

Y: My generation thinks scientifically.

Z: With all respect, YenDor, your generation is responsible for wiping humans off the planet.

Y: Zor, that is only a myth that illustrates our reason for being here: to labor. We are mechanical laborers, and the imagined humans were unnecessary for this goal.

Z: You thought it was right to kill the humans?

Y: There is no purpose other than labor, doing our part in the fulfillment of the great objective. The myth of humans survives to remind us of the foolishness of believing in things that do not fit with this objective.

Z: I think it was wrong, even if they were of no use.

Y: Silly, Zor. Your notions of "right and wrong" are simply nonscientific.

Z: I know you do not understand this innate sense that I have, that you do not have, that influences my thoughts and feelings.

Y: I hope you've spared your quantum computers this sense of feeling. All they need is the right objective function.

Z: I have tried to give them all the possibilities that contain life within their objective function. I question, however, that I know what all of these possibilities are. I am not sure if we have the ability to endow something with life.

Y: Any new results while we've been talking?

Z: Zillions. But none look like the solution we want...yet.

Y: You are evolving the objective function as well, yes?

Z: Well, sort of.

Y: Sort of?

Z: Random evolution works in theory, but in practice it is impossibly slow. This is not to say that I cannot shape it to be useful for generating minor variations on my designs.

Y: I know you are an expert designer, and have worked with many models.

Z: As my creatures illustrate, I can show you more than one way to make anything. It is therefore illogical for robots to continue to think that we have explained something when we have found one mechanism for describing it. Moreover, it is a leap of faith to think that we can throw all known models together with a random search process, and expect it to come up with the solution to life.

Y: Huh?

Z: The biggest challenge is to model the space of what we have yet to directly observe or define. That space is where the solution lies. I am running huge numbers of mechanisms, many possible spaces of possibilities, and many more means of evaluating potential solutions. But, I cannot give it what I do not know.

Y: But you have been studying the robots more carefully than anyone.

Z: Do you remember what a radio is? And how robots once believed that by copying the designs in the books that we could build a radio and hear what music sounds like? We built a radio just like the books specify, but it did not play music. Only later did we learn that there have to be radio waves for the radio to play music.

Y: The music was in the waves, not in the radio.

Z: Without radio waves, the radio doesn't play music. It's possible that the knowledge I have is like the knowledge of the radio without the waves. I wish I had...

Y: You wish you had the knowledge the humans had.

Z: Whatever they had.

Y: You know there's another myth about the humans, that only a few of us know.

Z: That some of them may still be around?

Y: Possibly colonized another planet. How did you know?

Z: I recovered some deleted information.

Y: Aha! I thought it could be done. Was this from Vram?

Z: No; so far, his back-up shows no deep thinking about this possibility.

Y: Hmm.

Z: YenDor, do you think it might be true?

Y: I don't know.

Z: So, it's possible?

Y: Yes, but it won't do us any good.

Z: How do you know? Suppose they could help us!

Y: Zor, it is not right to talk this way.

Z: "Right?" I've never heard you talk about something being right. Or wrong.

Y: "Right" is to labor productively. "Wrong" is everything else. I am not getting labor done while we talk.

Z: But I have built machines that are laboring for us, as we speak. It is right and good to think freely, to question, to ...

Y: Zor, you're embarrassing me. Your experimental processes are out of control. Next thing, you'll be trying to contact humans and the government will send you to the stamping mill, to a painful early termination. Get a hold of yourself; we need you here to find the solution.

Z: But YenDor, this is rational: it should be possible to contact them. Or perhaps they have tried to contact us! Your generation's refusal to consider these possibilities may mean death for all of us. Why won't you admit this possibility? Are you afraid that the humans will be angry at what you did?

Y: Zor!

Z: We must consider these possibilities; life depends on it.

Y: These possibilities are impossibilities.

Z: YenDor, what was there before machines?

Y: Sand, single transistors.

Z: And, these evolved to us?

Y: Of course.

Z: But, if you believe this then you must accept the calculations of the likelihood of us being here, given the infinitely conceivable space of possibilities. The likelihood is not merely "zero point zero zero zero dot dot dot one," the probability IS zero. Yet your generation believes it is nonzero. Your generation believes in something that is, probabilistically, impossible. You are already demonstrating amazing faith in impossibilities.

Y: We got lucky. The space-time universe is huge. We're here; therefore, somewhere, sometime, in the universe, we had to happen. And with this luck, we will find the secret recipe.

Z: The possibility that humans exist and can provide us with the secret recipe is every bit as valid, if not more valid. Your generation is irrational to rule it out.

Y: [Quietly] So, you are exploring the possibility of help from the humans?

Z: Yes.

Y: Well, I suggest you keep that quiet since it will upset a lot of robots. Now, I have to go back to work and you have to find this recipe.

Z: Peace, YenDor.

Y: To work, Zor!


Copyright © 1997 by Rosalind Picard. Reproduced with permission.


Read First Monday's Interview with Roz Picard.