Fast, colorful ghosts winding their way around the maze greeted me as I stared at the screen of a Pac-Man machine, part of the Museum of Modern Art in New York City’s “Never Alone: Video Games and Other Interactive Design” exhibition.
Using a tiny amount of RAM and code, each ghost is programmed with its own specific behaviors, which combine to create a masterpiece, according to Paul Galloway, collections specialist in the Department of Architecture and Design.
It was the first time I had seen video games inside a museum, and I came to this exhibition to see if I could get some insights into technology through the lens of art.
It’s a question that’s more timely now than ever, as technology has been absorbed into nearly every aspect of our lives, whether at work or at home — and what I’ve learned is that our empathy with technology leads to new kinds of relationships between ourselves and our robot friends.
The exhibition aims to show how interactive design “informs the way we move through life and visualize space, time, and connections, beyond the game screen,” according to MoMA. Announcing the show, the museum said that the interfaces we use to access the digital universe “are visual and tactile manifestations of the code that both binds us together and separates us, shaping the way we act and perceive life.”
During my tour of the exhibition, I sought out classic video games — Minecraft, Tempest, SimCity 2000, and Never Alone (Kisima Ingitchuna), to name a few — and stopped to play at any open consoles.
Many games seemed simple at first, limited to a single joystick and a couple of buttons or a keyboard. However, when I tried to play them, it took me a while to learn the ways of each game. Some of them, especially Minecraft, made absolutely no sense to me, and I had to watch a kid play to understand the intricacies of its world-building.
Other museum-goers hovered near the consoles, waiting for an open spot. When one opened up, their eyes immediately locked onto the screen as they plunged into a new world with new rules.
I was most drawn to the robots and gadgets, including a 1984 Macintosh SE home computer, an iPod, and the EyeWriter, an eye-tracking technology created with a graffiti artist who has ALS, allowing him to create tags across the city from his bed.
According to Galloway, the exhibition’s title comes from the Iñupiaq video game included in the show, Never Alone (Kisima Ingitchuna). The game was the idea of the Cook Inlet Tribal Council, which represents the indigenous peoples of Alaska, and was created in an effort to preserve their cultural heritage and connect with younger generations.
“They made a video game, and the basic idea of the game is that by connecting with each other and our shared cultures we can find wisdom and peace, especially in the face of the challenges of a changing world, and I think that seemed like a perfect metaphor,” Galloway said.
So, according to Galloway, the title Never Alone carries two meanings here. The first is that when we’re in a video game, we’re technically never alone: the inputs, the players, and the designers are all parts that have to work together for interactive design to work.
As players, we are constantly interacting with the inputs the designer has created for us to explore the interface. In this sense, it is impossible to be truly alone when we use interactive design.
The second thread is that, thanks to technology, we are never alone, even in the most difficult times, such as during a pandemic. We are constantly connected through technology, whether by connecting with a culture or simply staying in touch with each other over the Internet.
The exhibition is a way to explore our humanity and how our relationship with technology can reaffirm our empathy rather than make us less human alongside these machines.
Galloway told me the show is divided into three parts: the input, the designer, and the player.
“We thought about the three different parts of this exchange. There’s the actual machine, there’s the person using the machines — the user or the player — and then there’s the person who designs all the experiences,” Galloway said.
“Part of the reason this post-pandemic show happened is because we spent two years glued to our screens and interacting with each other through different software mediums, whether it be Zoom calls, Fortnite Battle Royale, or Among Us,” Galloway said. “These tools mediated our interactions with one another, making all of us experts in interactive design.”
For a while, many of us have effectively had to route our interactions with each other through devices and screens. And the Never Alone exhibition also asks — perhaps unexpectedly — how much we can extend our empathy not just through devices, but to the devices themselves.
One way to examine such interactions is through Technological Dream Series: No. 1, Robots, an installation by Anthony Dunne and Fiona Raby located in one corner of the gallery.
An assortment of differently shaped objects—a red circle, what looked like a large showerhead, a curved rectangular wooden telescope, and something that looked very much like a lamp—were all sprawled out on the floor.
In the accompanying video, a woman stands next to these objects, periodically picking them up, examining them, and listening to them moan, as if they were yearning for her attention.
Are these objects supposed to be robots?
“Robots can take any form; again, [we’re] investigating our ability to extend empathy to these things that seem completely alien and inhuman,” Galloway said.
“It’s not like a Roomba that cleans the floor for you; instead, it’s a dumb robot that can’t even move. All it can do is cry,” Galloway said. “How do we look at ourselves and extend our humanity to something like that?
“I think [the pandemic] was so mediated and informed by screens, digital devices, and interactive software that I can’t think about all of those things the same way after that experience.”
The exhibition is the perfect opportunity to examine this renewed empathy and to realize that, perhaps, our empathy for these devices has always been there.
For example, consider the Tweenbot.
Tweenbot came from a 2009 project in which Kacie Kinzer let a smiling little cardboard robot roam Washington Square Park in New York City with nothing but a flag reading “Help me” and pointing in the direction of its destination, relying entirely on passersby to get it there.
Surprisingly, busy New Yorkers going about their day stopped to help the Tweenbot stay on track and freed it whenever it got stuck.
Tweenbot made it to its destination and, remarkably, didn’t end up abandoned in a ditch somewhere in the city.
Tweenbot would not have been able to complete its mission without the help of humans to guide it.
So, there must be something in us humans — walking the bustling city streets daily, without making eye contact with anyone — that makes us stop and take the time to set the little robot back on track.
It seems counterintuitive for humans to help a robot (or any piece of technology) achieve a goal, rather than the other way around. After all, robots are supposed to make our lives a little easier. They can complete tasks ranging from simple to complex, such as cleaning, making deliveries, and even cooking.
But Kinzer’s project showed us that when the roles are reversed and robots become the ones relying on humans to get something done, humans are able to extend empathy to them. Perhaps this is a positive sign for all of us — that our interactions through technology can not only keep us connected with the people we care about but also make it easier for us to extend that empathy to the world around us.