Was scheduled to run through October 25, 2020
“I need you to help me plagiarize myself,” pleads a human character to a machine in Lawrence Lek’s 2019 film, “AIDOL.” That line deftly encapsulates the concepts explored in “Uncanny Valley: Being Human in the Age of AI.” “Uncanny valley” is a term of art, coined by the Japanese robotics professor Masahiro Mori, for the deep dip in a graphed curve that plots human emotional response, or likability, against the human-likeness of humanoid objects. The more closely an artificial figure resembles a human without quite getting it right, the more it provokes “uncanny,” strangely familiar feelings of eeriness and revulsion, and the lower its likability falls; hence the “valley” in the graph.
This concept is contextualized much more broadly in this exhibition. Rather than investigating human facsimiles, this ambitious, sprawling show fleshes out how the tools and concepts of data, artificial intelligence, machine learning, creative authorship and open-source participation are both broadening and narrowing contemporary creative practice. Can genuine creativity emanate from something a machine has generated?
“AIDOL,” Lek’s first feature film, is screened in a small pitch-dark viewing room where you watch the movie sunk into blobby, pink-cushioned seats. A stunning visual potpourri, it feels as if Lek took “American Idol,” the Super Bowl halftime show and William Gibson’s “Neuromancer,” threw in a bunch of top iTunes playlists, and buzzed it all up in a blender. Its plot centers on the pursuit of fame, immortality, and the holy grail of the best popular song in the universe, generated by AI, all underscored by a tongue-in-cheek yet thought-provoking screenplay. The lines are often brilliant: “That song ending sounds just like everyone else’s — it’s perfect, this will go viral.”
Trevor Paglen unearths the darker side of data with “They Took the Faces from the Accused and the Dead…(SD18),” an overwhelming 50-foot façade of over three thousand mugshot photographs used by the government, without the subjects’ consent, to seed facial-recognition software. It shines a harsh spotlight on recent controversies involving algorithmic bias, and on how blind reliance on flawed facial-recognition datasets can lead to such injustices as racial profiling and mistaken identity.
Conjuring a striking manifestation of a work whose raw material is numbers and algorithms clearly presents challenges. A common strategy is formal abstraction, as in Agnieszka Kurant’s machine-fabricated sculptures, Zairja Collective’s gorgeous data visualizations, or Ian Cheng’s metaphorical extractions in “BOB (Bag of Beliefs),” where viewers are enticed to make virtual “belief” offerings that shape the behavior of an animated serpent. Other approaches, like Lynn Hershman Leeson’s, simply embrace the data form. Her “Shadow Stalker,” which riffs on Phil Balagtas’ “Identity Probe,” taps into personal data gleaned from your email address to extrapolate your online portrait. After you enter your address on an iPad, a body-shaped silhouette of text evolves before you, the data scraped from the web dancing and flowing. Then you realize everyone else in the gallery is also viewing your birthday, your sister’s name, that time you worked at Macy’s in college and your childhood address. It’s horrifying.
Simon Denny’s work rightfully obsesses over an ill-conceived but hilarious 2016 Amazon patent for a machine that looks like a giant, vertical birdcage on wheels. It is a cage, but for humans; it was designed to protect workers tooling around Amazon’s vast warehouses from being accidentally injured by roaming, box-transporting robots. After all, human flesh and blood needs protection from the sharp edges of machines gone wild. In the “Document Relief Series,” Denny crafts stunning miniatures of the cage device carved out of stacks of fused patent pages. Nearby, in “Amazon worker cage patent drawing as virtual King Island Brown Thornbill cage (US 9,280,157 B2: ‘System and method for transporting personnel within an active workspace,’ 2016),” we use our phones or an AR device to discover a virtual King Island Brown Thornbill, an Australian bird facing imminent extinction, trapped in a giant cage, flitting and chirping. It’s the quintessential canary in a coal mine.
Though wildly diverse in approach, these works collectively sound an apt metaphor: the Frankenstein phenomenon of becoming trapped, and potentially destroyed, by the very force that once afforded you power. In the end, it seems, we will all need to be protected from ourselves.