{"title":"Sensing performance: from Balinese character to Japanese androids","authors":"Chris Salter, T. Ikegami","doi":"10.1080/23322551.2023.2207966","DOIUrl":null,"url":null,"abstract":"ABSTRACT This article examines recent work in machine performance in the context of an ‘Artificial Life’ research lab, linking the disciplines of visual anthropology, cybernetics, Artificial Life and Deep Learning-based artificial intelligence with the increasing interest in deploying ‘intelligent machines’ in artistic performance settings. As recent explorations of ‘machine vision’ in robotics demonstrate, cameras can be understood as instruments of capture and representation which no longer are simply recorders of images whose meanings are to be unlocked by human interpreters. Moving across a range of performative contexts, from anthropologists Gregory Bateson and Margaret Mead’s ethnographic work in Bali to experiments with an autonomous android in the Japanese lab, the article explores the camera as a cybernetically influenced sensing device that generates complex feedback loops between entities and their spatio-temporal environments. How does the camera as a sensing device enact a kind of visual performativity whereby technologically mediated subjects and selves are not recorded but, in effect, produced through interactive circuits, instruments and computational technologies? How might models of perception and observation that emerge from the social and natural sciences shape new ideas about the entangling of computational and real bodies spaces in the emerging practices of performance design?","PeriodicalId":37207,"journal":{"name":"Theatre and Performance Design","volume":"52 1","pages":"91 - 111"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Theatre and Performance Design","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/23322551.2023.2207966","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Arts and Humanities","Score":null,"Total":0}
Abstract
This article examines recent work in machine performance in the context of an ‘Artificial Life’ research lab, linking the disciplines of visual anthropology, cybernetics, Artificial Life and Deep Learning-based artificial intelligence with the increasing interest in deploying ‘intelligent machines’ in artistic performance settings. As recent explorations of ‘machine vision’ in robotics demonstrate, cameras can be understood as instruments of capture and representation that are no longer simply recorders of images whose meanings are to be unlocked by human interpreters. Moving across a range of performative contexts, from anthropologists Gregory Bateson and Margaret Mead’s ethnographic work in Bali to experiments with an autonomous android in the Japanese lab, the article explores the camera as a cybernetically influenced sensing device that generates complex feedback loops between entities and their spatio-temporal environments. How does the camera as a sensing device enact a kind of visual performativity whereby technologically mediated subjects and selves are not recorded but, in effect, produced through interactive circuits, instruments and computational technologies? How might models of perception and observation that emerge from the social and natural sciences shape new ideas about the entangling of computational and real bodies and spaces in the emerging practices of performance design?