No. 107

visible / hidden / re-visible

By : Hiroshi Yamato

Entrant’s location : Japan


Description

We created a performance work in which a human and an AI improvise together, each interpreting the other's playing. In a performance setting that values chance and unrepeatability, we tried to treat the AI as a partner for co-creating music. The AI is a Recurrent Neural Network trained on piano-and-guitar duos, and it generates MIDI data in real time in response to its partner's playing. With that MIDI data, the AI itself plays a toy piano that has been converted into a self-playing instrument.

Concept: "The AI interprets a human performance and plays along with it (visible). In that exchange, a musical communication arises that is not a communication between humans (hidden). This communication consists entirely of digital data, and because it is digital data it can be 'reinterpreted'. By playing that data back as a sound field, the work presents this contemporary form of communication in a way the audience can experience (re-visible)."

Performance and technical specifications:
- The performance is carried out on stage by three performers: a guitarist, an AI operator who also handles 2ch stereo mixing, and a sound operator.
- The AI plays its own interpretation of the performer's audio. The packet signals flowing behind the scenes along the performer-to-AI signal path are used as the source material for sound synthesis.
- The guitar and the AI's toy piano are reproduced over 2ch stereo; the sound generated from the packet signals is played back from 16 ceiling-mounted speakers.

Production period: 2018 to 2019
Credits: All performance, composition, and system integration by Signal compose (Hiroshi YAMATO, Yoshitaka OISHI, Kota TSUMAGARI, Ryo MORITA)
Performance: Sound Performance Platform 2019 (サウンドパフォーマンス・プラットフォーム 2019)
Venue: Aichi Prefectural Art Theater, Mini Theater
Organizer: Aichi Prefectural Art Theater
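The real-time exchange described above (guitar converted to MIDI, interpreted by the AI, played back on the toy piano) can be pictured as a simple routing loop. The sketch below is only an illustration of that flow, not the group's actual software: the port names and the respond() model interface are hypothetical, and it assumes the Python mido library.

```python
import mido

# Hypothetical port names; the real rig exposes its own device names.
guitar_in = mido.open_input("Guitar-to-MIDI")     # pitch-to-MIDI converter output
piano_out = mido.open_output("Toy Piano Driver")  # solenoid controller input

model = load_trained_rnn()  # placeholder for the duo-trained RNN (not shown here)

# Forward each human note to the model and play whatever it answers with.
# Only note-on strikes matter for a solenoid-driven toy piano, so note-offs are ignored.
for msg in guitar_in:
    if msg.type != "note_on" or msg.velocity == 0:
        continue
    for note, velocity in model.respond(msg.note, msg.velocity):
        piano_out.send(mido.Message("note_on", note=note, velocity=velocity))
```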

What did you create?

This is a performance work in which a human and an AI improvise and collaborate with each other. In the performance, we tried to bring in the AI as a partner for co-creating music. The AI is built on a Recurrent Neural Network trained in advance on piano-and-guitar duos. It generates MIDI data in real time in response to the human's playing, and uses that MIDI data to play the toy piano itself.
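The entry specifies only that a Recurrent Neural Network was used, not its architecture or framework. As a minimal sketch of what "generate the next note from the incoming duo material" could look like, here is a small PyTorch LSTM that embeds incoming MIDI pitches and samples a reply pitch; the layer sizes and sampling scheme are assumptions, not the work's actual model.

```python
import torch
import torch.nn as nn

class DuoRNN(nn.Module):
    """Illustrative next-note model; the real network's design is not published."""
    def __init__(self, n_pitches=128, hidden=256):
        super().__init__()
        self.embed = nn.Embedding(n_pitches, 64)   # embed incoming MIDI pitches
        self.lstm = nn.LSTM(64, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_pitches)   # distribution over response pitches

    def forward(self, pitches, state=None):
        x = self.embed(pitches)                    # (batch, time, 64)
        out, state = self.lstm(x, state)
        return self.head(out), state               # logits per step, recurrent state

# Inference step: feed the latest human note, sample the AI's reply.
model = DuoRNN()
state = None
human_note = torch.tensor([[60]])                  # e.g. middle C from the guitar
logits, state = model(human_note, state)
reply = torch.distributions.Categorical(logits=logits[0, -1]).sample()
print(int(reply))                                  # MIDI pitch the AI would play next
```

Carrying the LSTM state across calls is what lets the model "remember" the duet so far while still reacting note by note.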

Why did you make it?

Today we see the word "AI" everywhere. Yet we may still be using AI only as a convenient tool for fulfilling human desires, just as we have in the past. I therefore made a work with the following concept: "The AI interprets a human performance and plays along with it (this part of the performance is 'visible'). In that exchange, another musical communication arises that is not between humans ('hidden'). That communication is entirely digital data, and because it is digital data it can be 'reinterpreted'. Playing the data back as a sound field presents this contemporary form of communication in a way the audience can experience ('re-visible')." This performance is an attempt to bring "AI as a partner" and "AI as a co-creator" into a transient phenomenon like music.

How did you make it?

First, the AI needed a physical body to inhabit, a kind of vessel. At first we considered using a commercially available player piano, but its sound was too polished for a performance played by the AI. We therefore decided to add an automatic playing mechanism to an off-the-shelf toy piano using solenoids. The solenoid mounts were all 3D-printed to fit the dimensions of the toy piano. The AI was trained in advance on human piano-and-guitar duo data and on performance data from other pianos. During the actual performance, the human's guitar playing is converted to MIDI so that the AI can "hear" it, and the AI plays in response to the human performance.
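The entry does not describe the control electronics, so the following is only a guessed-at sketch of how the AI's MIDI notes might be turned into solenoid strikes: a host script (mido + pyserial) maps each note-on to a key index and sends a one-byte command to a microcontroller that pulses the matching solenoid. The port names, key range, and serial protocol are all assumptions.

```python
import mido
import serial

LOWEST_KEY = 60          # assumed lowest note of the toy piano's key range
NUM_KEYS = 25            # assumed number of keys fitted with solenoids

link = serial.Serial("/dev/ttyUSB0", 115200)      # hypothetical microcontroller link

with mido.open_input("AI Output") as ai_notes:    # hypothetical virtual MIDI port
    for msg in ai_notes:
        if msg.type != "note_on" or msg.velocity == 0:
            continue
        key = msg.note - LOWEST_KEY
        if 0 <= key < NUM_KEYS:
            # One byte per strike: the microcontroller is assumed to pulse
            # solenoid `key` for a fixed duration, so no note-off handling is needed.
            link.write(bytes([key]))
```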

Your entry’s specification

Equipment: automatic-performance toy piano, electric guitar, effects pedals, two computers (one for AI control, one for sound control)
Performers: guitar player, AI operator, sound operator
Playback: the AI-driven toy piano and the guitar are reproduced over 2ch stereo speakers; the packet data used during the performance is synthesized into sound and played back over 16 ceiling-mounted speakers.
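How the packet stream is turned into sound is not specified in the entry. As a loose illustration of the idea, the sketch below sniffs traffic on the performance network and forwards each packet's size plus a ceiling-speaker channel index to a synthesis engine over OSC. The network interface, OSC address, and channel mapping are all assumptions, and it relies on the scapy and python-osc libraries.

```python
from scapy.all import sniff                   # packet capture (needs root privileges)
from pythonosc.udp_client import SimpleUDPClient

NUM_CHANNELS = 16                             # ceiling speaker array
synth = SimpleUDPClient("127.0.0.1", 9000)    # hypothetical synthesis engine listening for OSC

def sonify(packet):
    size = len(packet)
    channel = size % NUM_CHANNELS             # crude mapping: packet size picks a speaker
    # The synth is assumed to interpret /grain as (channel, normalized amplitude).
    synth.send_message("/grain", [channel, min(size / 1500.0, 1.0)])

# Listen on the interface carrying the performer-to-AI traffic.
sniff(iface="eth0", prn=sonify, store=False)
```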
