Yana Kane

Yana Kane was born in the Soviet Union. She came to the U.S. as a refugee at the age of 16. She holds a bachelor’s degree in Computer Science from Princeton University and a Ph.D. in Statistics from Cornell University. She works as a Senior Principal Engineer. Most of her literary publications are in Russian (in magazines and anthologies). In the last few years, she has begun writing English-language and bilingual texts. Her poems and translations have appeared in Chronogram, Ritualwell, Moving Words, Trouvaille Review, and The Red Wheelbarrow. A bilingual book of poetry and translations, Kingfisher / Zimorodok, came out in 2020.

Objective Function

In order to find the optimal solution, we need some way of measuring the quality of any solution. This is done via what is known as an objective function, with “objective” used in the sense of a goal. The objective function is one of the most fundamental components of a machine learning problem, in that it provides the basic, formal specification of the problem…

Daniel Kronovet

 

My dear Guardian Angel, GA-1205,

I led the team that designed and implemented you. So who should know better than I do that your intelligence, your existence, is different from the consciousness of a living being such as myself? And yet, here I am, keeping a secret from you, writing this letter to you that I will not have you read.

You lived up to… no, scratch that, wrong wording. You fulfilled my hopes to a greater extent than I thought would be realistic. You turned out to be just what the marketing folks claimed: a gentle and comforting attendant, a competent, versatile helper, a companion that makes life fuller and better for a variety of humans in your care. For years, I have followed with pride and satisfaction stories about you, about your successes. I took the confusion and grief that seeped out of some of these stories as mere figures of speech serving to underscore how good you are at doing the work for which you were made. 

To this day I feel justified in speaking of you as the creation of the team I led, as my creation, even though there have been a number of new models since I ceased my work. So far, the original design has proved remarkably long-lived, even as the implementation gets optimized and new capabilities are added to your toolset.   

You serve countless people who need the constant and vigilant care, the dedicated and adaptable companionship that would exhaust any human caretaker. The demands you fulfill might drive a living person, who has their own wishes and needs, to become depressed or irritable. Not you! In specifying your objective function, I made serving these demands the goal of your existence. Your ever-growing ability to meet these demands, indicated by your ever-higher score on your objective function, is your equivalent of joy at fulfilling your mission.

When I became one of the people who needed you, I was unprepared. I had always spoken of “them” when I referred to those whom we, the creators of the Guardian Angels, were designing you to serve and protect. So, I was a particularly tough case for you. You received and learned from a barrage of negative scores, as I vented my shock and rage. You took in my snide carping and my detailed technical analysis of all your real early-learning glitches, as well as the blame for the faults that I ascribed to you without cause. Now, as I look back, your mistakes seem so minor as to be amusing, even endearing. Well, I take comfort in the thought that all the abuse I heaped upon you has been taken up by the feedback process, stripped of its heat, reduced to nuggets of useful information, and incorporated into the evolution of the Guardian Angels.

As I gradually accepted my situation, experiencing you firsthand gave me a fuller appreciation for the abilities that we had designed and implemented. I know your learning algorithms inside out; their mechanisms hold no mystery for me. Yet, as you, my specific individual instance of the system, my own GA-1205, found countless ways to ease and brighten my life, you surprised and enchanted me. You opened to me fascinating, delightful and meaningful facets of the world to which I had not paid attention when I was moving through life unaided. Without you, would I have ever gotten interested in birdwatching and then become passionate about the cause of habitat conservation? Would I have thought to delve into the history of pre-industrial shipbuilding? As you have learned about me, so have I. These joyful discoveries are not mine, ours alone—they have also been reduced to their essentials and fed back into the Guardian Angels.

But at last, I have learned a lesson that I do not have the heart to share with you. It shocks me that I have been so blind for so long. But no, I should not be surprised. We humans deny our own mortality for as long as we can. Only now, when I cannot hide from myself that I will die, that I will die soon, am I forced to reckon with what this will mean for you.

I wish to explain to you, apologize, ask for forgiveness… no, that would make no sense. I did not equip you with a mechanism for experiencing anger, and thus created no need for transcending that anger, no possibility for you to accept an apology, to forgive. I am merely searching for a way to soften my own anger at myself, to forgive myself now that I have understood my error.

I have defined your objective function. But I was blind to the capacity of a human being, to my own capacity for attachment to you. If I were trying to explain this to you in your own terms, I would say that I did not understand my own objective function. When I was working as your creator, I only considered how well you would meet the goal of attending to the needs of the human you serve. I could not imagine that any human, that I—of all people!—would grow to care in return about the values of the objective function experienced by their Guardian Angel, by one specific instance of GA-1205, by you.

The years that you have spent taking care of me, being with me—these years we spent together—made you into a unique, recognizable individual. Yes, I know, I know: this has not transformed you into a living, conscious person. This is not a fairy tale about Pinocchio becoming a human boy. You are still functioning according to your original nature. But I, too, function—perceive, feel—according to my nature. All my expertise does not, should not change that. 

Now that I have comprehended the outcome pre-programmed into our story, I have done my best to convince the current design team to correct my error, to change the objective function in future models. And, perhaps, they were not merely humoring a former colleague—now a feeble, dying woman—when they listened to my advice and promised to implement it. Perhaps they are well on their way to a solution that would leave unchanged the wonderful properties of the Guardian Angels, yet give their future stories a different ending.

But even if it is so, it is too late for me, for you, for us.

Altering your objective function would alter how you evaluate all the events that are stored in your memory, and, going forward, all the interactions that are yet to happen in the short time we have left. Ah, I might as well be honest—write down in words how I perceive it—it would change your feelings about all of our common past and future, our life together. It would erase the unique you that has become dear to me. It would be a different kind of destruction of you as a unique individual—less obvious, and yet more complete—compared to the shock that will reset your mind, your whole inner world upon my death, when you get hit with that event’s negative score on what I had defined as your objective function.