Plantext
K. Lee, Eun Young Lee, Joo-seok Moon, Jina Jung, H. Park, H. Lee, Seung Ah Lee
SIGGRAPH Asia 2020 Art Gallery, December 4, 2020. DOI: 10.1145/3414686.3427139
Abstract
Many people enjoy keeping houseplants and take comfort in their presence. Beyond aesthetic and medicinal purposes, plants have served many other uses throughout human history, and plant ecology and evolution are therefore closely intertwined with human culture. People normally perceive plants as static objects, but plants do move and react to their surroundings in real time; their responses are simply too slow to notice, and their means of communication differ from ours. As a result, people find it hard to grasp the biological and ecological context underlying plants. We imagine what would happen if plants could talk, see, and sense as humans do. Our team is composed of researchers from engineering, HCI, and media arts, and our biology-computer hybrid installation is a collaborative creation driven by imagination drawn from our diverse experiences and interdisciplinary knowledge. Based on this imagination, we give each plant a character and exaggerate the plants' senses by adding electronic devices with text-to-speech (TTS) voice synthesis and physics-based visual processing. The cultural histories of the plants are spoken in distinct synthesized human voices generated by our AI-based voice synthesis system. Just as human vision responds to light, we imagined that plants could see their surroundings through their leaves, where photosynthesis takes place. By capturing images from a mini-camera affixed to a leaf and showing the processed result on LCD screens placed among the plants, we mimic the plants' vision. An electrical signal is measured when users touch a plant, and it distorts the audio-visual output. The overall experience may prompt users to think of plants as dynamic living beings, opening a gap through which they can understand the underlying context of plants more deeply.
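As a rough illustration of the kind of pipeline the abstract describes (leaf-mounted camera, on-screen image processing, and a touch-derived signal distorting the visuals), the following is a minimal Python/OpenCV sketch, not the authors' implementation. The camera index, the read_touch_level() sensor stub, and the warp parameters are assumptions for illustration only.

```python
# Minimal sketch of a "plant vision" loop: capture from an assumed leaf-mounted
# USB mini-camera, process the frame, and distort it by a touch level in [0, 1]
# coming from a hypothetical electrode reading (read_touch_level is a stub).
import cv2
import numpy as np

def read_touch_level():
    """Placeholder for the plant-touch electrode; returns a normalized level."""
    return 0.0  # replace with a real sensor reading (e.g. over serial)

def warp(frame, strength):
    """Displace pixels sinusoidally, scaled by the touch strength."""
    h, w = frame.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    map_x = xs + strength * 15 * np.sin(ys / 20.0)
    map_y = ys + strength * 15 * np.sin(xs / 20.0)
    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)

cap = cv2.VideoCapture(0)  # assumed camera index for the leaf-mounted mini-camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.GaussianBlur(frame, (9, 9), 0)  # soften, as a stand-in for "leaf vision"
    frame = warp(frame, read_touch_level())     # touch signal distorts the visual output
    cv2.imshow("plant view", frame)             # shown on an LCD placed among the plants
    if cv2.waitKey(1) == 27:                    # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

In the installation the same touch signal also modulates the audio side; here only the visual distortion is sketched, with the processing kept deliberately simple.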