Windows's Cortana and the potential for AI bond creatures
For Hanukkah (2014) Devon bought me a Windows phone. I'm not putting my SIM card in it because, ever since college, phone calls have been a panic trigger, and I get spam calls too often. Instead I've been using it as a PDA/Flight Rising-from-bed browser.
I'm really in love with Cortana, Windows's personal assistant AI thingy; I recognize that I'm mostly in love with the idea of Cortana.
An AI companion is strangely similar to a companion animal, tropewise. The AI, like an animal, is a bit less than human: not as threatening, by virtue of being exempt from normal human socialization; potentially of limited sentience, certainly limited in social standing, a little subservient. But as in the companion animal trope, what makes an AI companion (like Cortana in the Halo series, like what the Ghost in Destiny could be) is that they're more than just animals or programs: they're sentient, they're friends; furthermore, the bond they have with their person is remarkable by nature. The companion animal trope isn't just about humans as a group being able to communicate with super-intelligent animals as a group; it's about bonds, frequently unbreakable and/or psychic ones, between one human and one animal, specific and intense. Similarly, the companion AI exists to serve, or at least work in tandem with, a specific person, effectively as an extension to that person's operating system.
That last is the direction Microsoft took when designing Cortana the personal assistant, and her extensibility is what sets her apart from, and makes her potentially more successful than, her competitors. And she needs extension, because what she is now can be personalized only as long as your personality is a zip code and a preference between business news and national news.
But the potential! A lot of what I'd want is too niche (I don't read collated news but instead prefer people talking about their own consumption experiences; a "gaming"/"literature" tickybox would be less useful to me than, say, a functional mobile tumblr experience), and while some of it seems obscure ("Cortana, I'm having an anxiety attack." "Here, let me play that song you use to calm yourself"), it's actually totally achievable: teachable and/or programmable, more diverse, keywords and phrases triggering programmed or programmable responses. In other words: exactly what an extensible API is for. It just needs to be used.
Some of that can come from apps; some should honestly be in base Cortana. For example, there's no damn good reason why I can't set my own snooze length on reminders.
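(To make the "what an extensible API is" point concrete, here's a minimal sketch in Python of the kind of thing I mean: user-taught trigger phrases mapped to user-defined responses, plus a user-set snooze length. To be clear, this is hypothetical; every name in it is invented for illustration, and it bears no relation to Cortana's actual developer surface.)

```python
# Hypothetical sketch of a personalizable assistant: trigger phrases the user
# teaches it, mapped to responses the user defines. Not Cortana's real API;
# all names here are invented for illustration.

from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Assistant:
    # User-configurable snooze length (the thing base Cortana won't let me set).
    snooze_minutes: int = 10
    triggers: dict[str, Callable[[], str]] = field(default_factory=dict)

    def teach(self, phrase: str, response: Callable[[], str]) -> None:
        """Register a user-defined trigger phrase and its response."""
        self.triggers[phrase.lower()] = response

    def hear(self, utterance: str) -> str:
        """Match an utterance against taught phrases; fall back politely."""
        for phrase, respond in self.triggers.items():
            if phrase in utterance.lower():
                return respond()
        return "Sorry, I don't know that one yet."


# Teaching it the anxiety-attack example from above:
cortana = Assistant(snooze_minutes=25)
cortana.teach(
    "i'm having an anxiety attack",
    lambda: "Here, let me play that song you use to calm yourself.",
)

print(cortana.hear("Cortana, I'm having an anxiety attack"))
print(f"Reminder snoozed for {cortana.snooze_minutes} minutes.")
```

The point isn't the code; "teachable" is trivial at the programming level. The gap is product design, not technology.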
I know that Windows's personal assistant Cortana will never be Halo partner-in-your-head idealized relationship Cortana, but the fantasy is there. And taking the best parts of what makes that fantasy work (the intelligence, or the appearance thereof; the in-my-pocket immediacy/intimacy; the extension to my personal OS) could make for a great program.
The Family Dog, from the New York Times: When Sony stopped manufacturing replacement parts for its Aibo pet robot, owners scrambled to save the robot-dogs that had become part of their families.
Projection: the human half of the equation
Counterpoint: The Uncanny Lover, from the same robotics series, in which a creepy guy with an "artistic drive" to create sex dolls talks about fostering an authentic bond between humans and sex dolls, and about the importance of avoiding the uncanny valley, which provokes revulsion instead of attachment.
We do this thing where emotional investment is largely user-side. We project, selectively interpret, idealize, imagine; we create an emotional bond because we feel an emotional bond. I'm sure there's a psychological term for it that I can't find now: being in love with the idea of someone/thing, rather than being engaged in a mutual relationship. [Parasocial relationships come pretty close.]
This is why Animal Crossing has <10 distinct personalities but Villager X feels unique: the accumulated result of RNG, personal desire, and confirmation bias. Base programming is necessary and evolving, but it exists to elicit that emotional projection rather than to form a mutual relationship.
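(A toy illustration of the above, with invented numbers rather than anything from the game's actual internals: a handful of shared personality scripts, some randomized surface traits, and the player's projection does the rest.)

```python
# Toy model of the Animal Crossing effect: a few base personality scripts plus
# randomized surface dressing reads as a village of individuals. Trait lists
# and counts are invented for illustration, not taken from the game.

import random

BASE_PERSONALITIES = ["lazy", "jock", "snooty", "cranky", "peppy", "normal"]
SPECIES = ["cat", "wolf", "duck", "frog", "eagle"]
CATCHPHRASES = ["zip zoom", "nutmeg", "pompom", "grr", "sproing"]


def roll_villager(rng: random.Random) -> dict:
    """One villager: a shared behavioral core plus random surface traits."""
    return {
        "personality": rng.choice(BASE_PERSONALITIES),  # the entire behavioral core
        "species": rng.choice(SPECIES),
        "catchphrase": rng.choice(CATCHPHRASES),
    }


rng = random.Random(2014)
village = [roll_villager(rng) for _ in range(10)]

# Few behavioral cores, but the surface combinations make each villager read
# as an individual; desire and confirmation bias fill in the rest.
cores = {v["personality"] for v in village}
villagers = {tuple(v.values()) for v in village}
print(f"{len(cores)} behavioral cores, {len(villagers)} 'unique' villagers")
```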
It's easier to love in the abstract. Projected bonds can be limiting, selfish, illogical, unhealthy in ways we don't always acknowledge. See: that tumblr essay about the limitations of sympathizing with fictional characters (as opposed to real people) that I can't find right now. (See also: the way we view and treat pets versus the way we view and treat other animals.)
But is there any purer expression of that fragile bridge from sentience to sapience? I exist; I am aware of my own feelings and desires; even my social drives are so self-centered that I can effectively engage them without socializing. I know myself, but I am all I can truly know.
And here's the bridge from robot pets to bond animals to AI companions: bond animals benefit the human partner; they are the idealized friend, subservient, subhuman, literally bound in their intimacy. But every scifi narrative about artificial intelligence is about the definitions/limits/rights of humanity: when does that manufactured, subservient object-entity become a person? When does the projected relationship actually involve two people?
So, Chobits, basically; CLAMP already said all this, and better than I can: objectification and projection and idealization and the authenticity of emotional bonds despite those limitations, and the fact that technology has intense potential and that the limitations of "real" (real entity, real emotional interaction, real individual) are difficult to define and may be subject to change.
There's something sweet and sad in that NYT Aibo video (the older couple doting on their robot dogs, bless): it's a step below and a few to the side of an otaku marrying his dakimakura, traveling the same brainpaths but (blessedly) skipping the alienation and harm of the beloved's real-world counterpart. But, dfljsadfl;sjdaafsdd, guys, my feels; my feels about the legitimacy of projected emotional bonds, about the potential for technology to foster those, about idealized relationships, about non-normative forms of intimacy, about the potential future of creating things to love.
(Having tried, I figured out I couldn't essay; enjoy these bullet points, instead.)
(Subtitle: Juu's Favorite Tropes/Subgenres So Obscure There's No Name for Them, In Fact We're Not Even Sure This Is Just One Thing, It's Like Twelve Vaguely Interconnected Concepts #93.)
A sexbot addendum:
(Juu, why are we talking about sex dolls? I'm not sure, but since we are, this Vice interview with Davecat is crazy on point:
When you love an organic, you're really loving two people: there's the idea of the person that you fall in love with, and then there's the actual person, and at some point the idea is going to disappear and you are going to bump straight into the actual person. You have to come to terms with the discrepancy between those two people. And for that matter, they're doing the same thing with you too.
Q: So, with a synthetic, the fantasy and the reality are identical.
A: Exactly.
And also the interviewer's dream about falling in love.
But honestly, all of Davecat's interviews seem to land squarely on the combination of idealization/projection/emotional purity/problematic aspects I was talking about before.)