Leslie Bow clicks through movie and television clips on her computer: scenes from recent sci-fi thrillers, including Cloud Atlas, Ex Machina, Battlestar Galactica and the hit British miniseries Humans. Each features a synthetic human — benevolent or sinister, depending on the plotline — designed to look like a young, attractive Asian woman. Then Bow, a professor of English and Asian American studies, opens a video.
Filmed at a Toronto tech expo, it shows a sweet-faced robot with distinctly Asian features named Aiko. As a group of rapt pre-teen boys circles the eerily lifelike female replica, dressed in a dark wig and prim pink blouse, they can’t resist pinching and poking her arms and face. Each jab prompts a plaintive reply from Aiko, triggered by the cutting-edge artificial-intelligence technology beneath her silicone skin: “Stop it.” “Ouch.” “Please do not touch my head.” “It hurts.”
Welcome to the “uncanny valley.” That’s what tech theorists call the point at which we human beings — normally charmed by creatures who look a bit like us (think of all those adorable anthropomorphized animals we love to watch conversing in cartoons) — begin to experience mixed, uneasy or antagonistic feelings toward robots and other human knockoffs that have gotten too realistic for our comfort.
This ambiguous terrain fascinates Bow, who explores how attitudes toward Asians are expressed in popular culture and how we humans relate, for better or worse, to nonhuman beings. Literature and semiotics scholars have long tracked how people from Japan, China and other Asian nations (typically ones at the forefront of technological innovation) are depicted in literature, film and other media. “Techno-orientalism” is their term for the tendency to use stereotyped Asian characters to reflect fears and fantasies about a future ruled by enhanced beings and fiendishly smart machines. Dr. Fu Manchu, the evil mastermind scientist featured in 20th-century movies and TV shows, is a vintage example of techno-orientalist caricature. But surveying the current media landscape, and noticing “the sheer ubiquity” of artificial intelligence technology being “embodied by young, nubile Asian female facsimiles,” convinced Bow that this particular techno-orientalist trend deserves special scrutiny.
For centuries, Asian women have been associated with service and what Bow calls “affective labor”: caring for other people’s emotion-based needs. According to Bow, Asian-featured fembots are just the newest twist in the longstanding cultural “fetishization and overt sexualization of Asian women.”
Imagine a society in which infinitely replicable robots (the kind we see in sci-fi dramas now) roll off assembly lines in actual brick-and-mortar factories, created to assist us. If those robots are designed to look identical to (or almost like, in that disturbing, uncanny way) real-world Asian women, Bow says it’s reasonable to expect that the ready-to-serve-you stereotype, well past due for retirement, will only be amplified. Research shows that our views of others’ skills and capacities are shaped by the roles we see them (literally) playing in the world. “We determine relations of power and define what frame we’re going to put around people according to how we see them operating and being treated by others,” Bow explains, and according to the stories we tell to describe what we’ve observed.
Although she says she’s seen several movies and books that cleverly dissect the paradox of being both attracted to, and terrified of, artificial women (she’s a fan of Marjorie Liu’s Monstress graphic novel series, about a nonhuman matriarchal Asian society fighting against its own subjugation), Bow says that often artificial Asian women are presented in popular media — with little or no nuance — as “both a toy and a servant,” a thing to be commanded and played with, like a doll. In the movie Ex Machina, for example, an Asian fembot named Kyoko has been programmed to serve dinner and dance seductively, but is pointedly denied the power of speech. The problem with such flat, simplistic depictions is that they gradually shrink our frames of reference about real people with thoughts and feelings who behave in multi-dimensional ways.
But as we stand on the brink of having AI robots much more present in our lives (at least according to all those bullish tech prognosticators), Bow thinks it’s also helpful to think about possible ramifications of cranking out machine assistants resembling any specific type of living, breathing human being — Asian, black, white; tall, short; male or female — and how we might relate to these new, super-smart machines. What would happen if, say, black male-replica androids were purpose-built and assigned to perform only rugged outdoor labor — an unsettling scenario similar to one the show Humans provocatively proposed? Could greater exposure to such historically painful, racially charged imagery increase discrimination against black men? And if signal pieces of so-called “disembodied” AI tech already on the market — such as Apple’s iPhone and Amazon’s Alexa — work perfectly well resembling no living person, “Why do we have to imagine AI as embodied at all?” Bow wonders aloud.
Bow says these are the kinds of complex questions she hopes her students — whether they’re aiming to work as tech engineers, Hollywood screenwriters or advertising execs charged with casting commercials — will ask themselves for years to come. But at a time when many Americans are reckoning with issues of racism and sexual assault and harassment in our society, classroom discussions about such things are taking on a new sense of urgency. Lately, when Bow has shown the video of Aiko being prodded by the boys at the tech expo, students in some lectures have cried out, while others have sat in stunned silence. But pondering the disquieting power dynamics at play in the uncanny valley can also make it easier to see how the same dynamics unfold between real people.
“It is actually a great time to engage in thinking about xenophobia, racialized violence, the public and private abuse of women,” Bow says. “If students don’t want to talk directly about violation, fiction and fantasy offer a platform for raising those issues at a step removed.”