No. 211

Artificial Senses

By: Kim Albrecht

Entrant’s location: Berlin, Germany

Description

Artificial Senses visualizes sensor data from the machines that surround us, to develop an understanding of how they experience the world. Machine learning and artificial intelligence are the buzzwords of the moment, but they are more than that: they influence our behavior as well as our conception of the technologies themselves and the world they represent. A lack of understanding of how these systems operate on their own terms is dangerous. How can we live with, trust, and interact with this alien species, which we set forth into the world, if we know it only through interfaces designed to make the machine unnaturally akin to the world we already know? This project visualizes the raw sensor data that our phones and computers collect and process, to help us understand how these machines experience the world.

What did you create?

Artificial Senses is a project by Kim Albrecht in collaboration with metaLAB (at) Harvard, and supported by the Berkman Klein Center for Internet & Society. The project is part of a larger initiative researching the boundaries between artificial intelligence and society.

Why did you make it?

Contemporary culture is unimaginable without the machines that surround us every day. Our knowledge is shaped by Google search results, our musical taste by the mixes Spotify creates for us, and our shopping choices by Amazon recommendations. This strange new world became part of our reality in a very short time. Human-facing interface design makes these systems feel natural, as if they were really of our world. But if we want to live with these devices and understand them, we should not rely solely on the machines becoming something easily understandable to us. We need to develop an understanding of how these devices experience our world.

How did you make it?

The visualizations here explore a number of sensory domains: seeing, locating, orienting, hearing, moving, and touching. Rather than rendering the machine’s sensory data in ways that we intuitively grasp, however, these visualizations try to get closer to the machine’s experience. They show us a number of ways in which the machine’s reality departs from our own. With many of its sensors, for example, the machine operates on a timescale that is too fast for us to follow: the orientation sensor returns data up to 300 times per second, too quick to draw each value on the screen and too quick for us to comprehend. In most cases, to make these visualizations, the machine had to be tamed and slowed for us to perceive its “experience.”
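The project’s own code is not included in this entry, but as an illustrative sketch, a browser-based visualization could slow the orientation stream down roughly like this (the render function, the element id, and the 250 ms interval are hypothetical choices, not taken from the project):

// Minimal sketch: throttling the device orientation stream for display.
// The browser's DeviceOrientationEvent can fire far faster than a person
// can read; here we sample it down to a few updates per second.
// (Some browsers, notably iOS Safari, require a user-gesture permission
// request before orientation events are delivered.)

const DISPLAY_INTERVAL_MS = 250; // ~4 updates per second, slow enough to read
let lastDrawn = 0;

window.addEventListener('deviceorientation', (event: DeviceOrientationEvent) => {
  const now = performance.now();
  if (now - lastDrawn < DISPLAY_INTERVAL_MS) return; // drop intermediate readings
  lastDrawn = now;

  // alpha/beta/gamma are the device's rotation in degrees; null if unavailable.
  const { alpha, beta, gamma } = event;
  render(alpha ?? 0, beta ?? 0, gamma ?? 0);
});

// Hypothetical renderer: writes the raw values into the page.
function render(alpha: number, beta: number, gamma: number): void {
  const el = document.getElementById('orientation');
  if (el) {
    el.textContent =
      `alpha: ${alpha.toFixed(1)}°  beta: ${beta.toFixed(1)}°  gamma: ${gamma.toFixed(1)}°`;
  }
}

Dropping intermediate readings rather than averaging them preserves the raw character of the data while slowing it to a human pace, which matches the project’s stated aim of taming the machine without translating its experience into something else.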

Your entry’s specification

Digital and Installation. You can see images of the installation setup here: https://artificial-senses.kimalbrecht.com/volatile_truths_rainbow_unicorn.html
