Would you recognize your face if you saw it through someone else’s eyes? How about if you saw your face through the eyes of a machine?
We trust our reflections, that what we see in the mirror is what others see when they look at us. Our grooming, makeup, and dressing all depend on this. We know that our image is not exactly true, and that what we see in the mirror is the reverse of what everyone else sees when they look at us. But we also see ourselves through the feedback of others – their words, their actions toward us, as well as their character – the people who choose to be with us reflect who we are as well.
Brain research describes mirror neurons that enable us to reflect body language, facial expressions, and emotions – in short, the brain basis of child development, learning, and relationships – our functioning as social humans. So we have neurological wiring that helps us to learn by imitation, and to connect emotionally with others. This is the root of empathy, our ability to feel along with others the emotions they are experiencing. It’s also what allows us to experience the emotion of art, whether we are feeling the passion in a centuries-old Rembrandt portrait, or singing along with a pop song that captures our romantic frustrations.
But what if the art that reflects us back to ourselves is created through alien eyes, through artificial intelligence? Can mirror neurons be programmed? What will they see? What will they show us about ourselves?
The “MirNs” exhibit at the New Media Gallery tests out these ideas. As with the gallery’s best exhibits, it is highly participatory, inviting you to engage with
- a mirror that won’t look you in the face,
- a camera that records you through time and translates a sliver of movement into images,
- dancing line figures that respond to and reflect your movements, but not quite,
- a wall of mirrors that shift and move as you do, constantly changing your reflection,
- a mirror made of dark and light pompoms that creates a weirdly pixelated reflection,
- a camera that feeds your image to an AI that tries to redraw it based on the images it has already seen.
One of the most intriguing was the “Uncanny Mirror”, which “creates shifting, real-time AI portraits of each viewer”. We looked into it again and again, fascinated to see how the machine mind composited parts of all the faces it had seen to create images of our faces, recreating them second by second in a live feed on the screen. Some of the images looked more or less like us, but other versions looked more as if we had been imagined by the painter Francis Bacon. My son challenged the AI to construct his face with strips of very “non-human” green tape obscuring part of his features.
The other great part of the exhibit is the interaction with the artists. We shared photos we took of “Uncanny Mirror” with the curators, who in turn shared them with the artist, who responded (from Germany) within the hour. My son was thrilled to hear the artist’s appreciation of his creative engagement with the work. Even through electronic messaging, our mirror neurons activate to find connections with fellow humans and their explorations of the human mind.
The “MirNs” exhibit is at the New Media Gallery in New Westminster from March 13 – May 30, 2021. You need to book your visit. The curators, Sarah and Gordon, offer a wide range of engagement programs for both schools and the community.