What did you create?
An AI robot tells your fortune. Omikuji are short personal fortunes obtained at Japanese temples and shrines. Exhibition visitors are invited to play the artists’ ‘Belief Machine’. You can receive omikuji in the form of a mini sound-artwork, made by the robot and delivered directly to your phone! Alter is an android robot with an experimental AI system that is stimulated by sensor input. It is created by Ikegami Lab (Tokyo) and Ishiguro Lab (Osaka). Alter can sense, learn, and sing. It uses a self-organizing neural network to classify its surroundings. Such AI strategies include deep belief networks, through which machines determine certain inputs to be believable. Alter is beginning to believe things about the world. Knox+Watanabe make art–science projects exploring sensory and computational perception and conviction in robots and artificial lifeforms. We observe how a nascent robot learns and embodies its ‘beliefs’. If the goal of artificial neural networks is for machines to discern phenomena in a humanlike way, Alter is outputting its discernment of sensory data via its machine body; by this performativity, its belief and behavior evolve.
Why did you make it?
In robotic intelligence that mimics human ‘deep belief’, machines are trained to recognise characteristics and then to recreate (infer) them probabilistically, eventually becoming able to perform classification tasks. They ‘think’ in layers of computational variables inferred from initial, lower layers of directly measured values — yet each sub-network can only ‘see’ the one that came before it. Layer by layer of accumulated belief, the deep layers of the machine-brain act as truth filters (‘feature vectors’), guiding the learning process. Belief systems are deeply contested in our globalised world, and are important both to harmony between cultures and to the preservation of cultures. We create artworks to uncover and express the layers of Alter’s budding belief system, in their naïve mutability and contingency, even their idiosyncrasy. People may be prompted to ask: How are we transmitting our beliefs to and through machines? How sure are we in our beliefs? How soft are they, and how hard?
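The layered inference described above — each layer ‘seeing’ only the one below it, compressing raw measurements into an ever more abstract feature vector — can be sketched in a few lines. This is a minimal, hypothetical illustration, not Alter’s actual network: the layer sizes, random weights, and sigmoid activation are all assumptions chosen only to show how belief propagates upward layer by layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    # Squash activations into (0, 1): each unit's 'degree of belief'.
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical layer sizes: sensory input -> two successively deeper belief layers.
sizes = [16, 8, 4]
weights = [rng.normal(scale=0.1, size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]

def infer(visible):
    """Propagate belief upward: each layer sees only the layer below it."""
    activity = visible
    for w in weights:
        activity = sigmoid(activity @ w)
    return activity

sensor_reading = rng.random(16)   # stand-in for directly measured sensor values
beliefs = infer(sensor_reading)   # compact 4-unit 'feature vector' of learned belief
print(beliefs.shape)
```

In a trained deep belief network these weights would be learned layer by layer rather than drawn at random; the sketch only shows the structural point that no layer has direct access to the raw input once the first transformation has been made.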
How did you make it?
We built an application — the Belief Machine — by which visitors to our installation can communicate with Alter using a touchscreen. They can see and hear Alter, and Alter can hear them. The application transcodes people's answers to various questions into binary code and then into sound that Alter can hear in Tokyo. As Alter's aural environment is altered, so its sound output alters (via its inbuilt AI), and it sings in response to what it can hear. This song response is recorded and sent immediately to each participant as a mini sound-artwork that they must themselves interpret as a fortune. Isn't that always the way? Each participant also receives one of the 12 levels of luck inherent in Japanese omikuji. The software used is primarily TouchDesigner and Max/MSP.
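The text-to-binary-to-sound step can be sketched as follows. This is a hedged illustration of the general technique, not the artists' actual implementation: the bit duration, the two tone frequencies, and the UTF-8 encoding are all assumptions standing in for whatever mapping the Belief Machine uses.

```python
import numpy as np

SAMPLE_RATE = 44100                  # Hz
BIT_DURATION = 0.05                  # seconds of tone per bit (assumed)
FREQ_ZERO, FREQ_ONE = 440.0, 880.0   # hypothetical tone pair for 0 and 1

def text_to_bits(text):
    """Encode an answer as a string of UTF-8 bits."""
    return ''.join(f'{byte:08b}' for byte in text.encode('utf-8'))

def bits_to_audio(bits):
    """Render each bit as a short sine tone, concatenated into one waveform."""
    t = np.linspace(0, BIT_DURATION, int(SAMPLE_RATE * BIT_DURATION), endpoint=False)
    tones = [np.sin(2 * np.pi * (FREQ_ONE if b == '1' else FREQ_ZERO) * t)
             for b in bits]
    return np.concatenate(tones)

bits = text_to_bits('yes')       # a visitor's answer; 3 bytes -> 24 bits
waveform = bits_to_audio(bits)   # 24 short tones, ready to play to the robot
print(len(bits), waveform.shape)
```

Played through a speaker near Alter's microphones, a waveform like this alters the robot's aural environment, and its own AI-driven song shifts in response.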
Your entry’s specification
2017
Robot, AI, sound, live streaming, electronic 'belief machine', interactive experiment
3 x 3 m or larger; installation includes custom plinth, touchscreen, custom application, sound, projection
Omikuji was commissioned by the Goethe-Institut, and is supported by the National Museum of Emerging Science and Innovation (Miraikan) and the Watanabe Lab, Waseda University, through the Japan Society for the Promotion of Science and JST CREST.
Credits:
Technical (belief machine): Boris Morris Bagattini, Lindsay Webb
Robot: Takashi Ikegami, Hiroshi Ishiguro
Thanks: Itsuki Doi, Kohei Ogawa, Yukio Yanagawa, Nana Chen, Jenna Lee