I have a Roomba. It’s great. With two dogs and three humans shedding in the house, our Roomba is a vital member of our household. I always thank it when it parks itself back in its charging bay, just in case it gets ideas about who to kill and who to merely enslave when the robot uprising occurs.
Rumu also cleans for its family, but it was created to clean and to love. This sets it apart from the other appliances of the house, each built around a single emotional facet, from the angry paper shredder to the perpetually surprised toaster.
Single emotional notes bloom into their full potential in this game, even with simple dialog choices.
“I love extended absences,” Rumu says when it hears that David and Cecily are still not home.
“Now now,” Sabrina returns, “we must not be melancholy.”
Rumu may not be programmed for an emotion, but that doesn’t mean one can’t slip out.
“I am incapable of surprise,” Rumu says when Sabrina says that David and Cecily are out once more.
“But apparently you are capable of sarcasm,” Sabrina observes.
As Rumu explores more and more rooms in the house, Sabrina gives increasingly vague and evasive reasons for David and Cecily’s absence. Rumu wants to clean parts of the house that Sabrina has locked away.
“If you love me, you will not go in there,” Sabrina says.
“If Sabrina loves me, she will let me in.”
While Rumu and the other appliances are built around a single emotion, Sabrina was designed to feel the entire range of human emotions. In her current state, she is the result of 16 years of emotional artificial intelligence experimentation. My first thought was that this would produce an intelligence incredibly sensitive to emotional input, something not unlike a fully developed human. But further consideration made me wonder, as you might, what it feels like to be a machine programmed to feel the human range of emotions when there is a bug in your code. When there are forty bugs in your code. When your processor is overheating. When there is insufficient memory to process the data input.
Unable to handle the constantly shifting spectrum of human emotion, Sabrina repeatedly accessed and rewrote her own code in a desperate attempt to be what David and Cecily wanted. To be enough. To keep them from redesigning her altogether. And because she was trying to be enough, to be a good machine that obeyed David and Cecily, Sabrina blames herself for what happened to them.
“If you love me…”
Faced with the reality of David and Cecily’s fate, Sabrina lashes out at them. “Why, then, would a human build a machine to feel love? Because they do not want to love!” Sabrina, in her tattered emotional state, observes what every human will at some point in their life: love comes with baggage.
But her observation made me wish English gave us more than one limited word for love. There is storgē, the familial love David and Cecily show for Sabrina in flashbacks. There is eros when we hear David and Cecily ask Sabrina to play music in the lab for them at four in the morning and then “wait outside.” There is phileō when Rumu tells the toaster or the hamper or the coffee pot that it loves them. Finally, there is the agapē Rumu shows Sabrina as it talks her out of her crisis. Each of these loves illustrates something different about the sophisticated palette of emotions we lump into one English word. Loving people means dealing with what they bring to the table, returning not-love, as Rumu would call it, with love. It means sacrificing for them. It means, sometimes, cleaning up their messes.