{"id":1445,"date":"2021-02-15T09:15:03","date_gmt":"2021-02-15T09:15:03","guid":{"rendered":"https:\/\/tei.acm.org\/2021\/?page_id=1445"},"modified":"2021-02-15T09:15:05","modified_gmt":"2021-02-15T09:15:05","slug":"proceedings","status":"publish","type":"page","link":"https:\/\/tei.acm.org\/2021\/proceedings\/","title":{"rendered":"Proceedings"},"content":{"rendered":"\n<html xmlns:bkstg=\"http:\/\/www.atypon.com\/backstage-ns\" xmlns:urlutil=\"java:com.atypon.literatum.customization.UrlUtil\" xmlns:pxje=\"java:com.atypon.frontend.services.impl.PassportXslJavaExtentions\">\n   <head>\n      <meta http-equiv=\"Content-Type\" content=\"text\/html; charset=UTF-8\">\n      <meta http-equiv=\"Content-Style-Type\" content=\"text\/css\">\n      <style type=\"text\/css\">\n            #DLtoc {\n            font: normal 12px\/1.5em Arial, Helvetica, sans-serif;\n            }\n\n            #DLheader {\n            }\n            #DLcontent h1, #DLcontent h2,#DLcontent h3,#DLcontent h4,#DLcontent h5,#DLcontent h6{\n            text-transform: none;\n            }\n\n            #DLcontent {\n            font-size:1em;\n            }\n            #DLcontent ul{\n               margin-top:0px;\n               margin-left:0px;\n            }\n\n            .DLauthors li{\n            display: inline;\n            list-style-type: none;\n            padding-right: 5px;\n            margin-left:0;\n\n            }\n\n            .DLauthors li:after{\n            content:\",\";\n            }\n            .DLauthors li.nameList.Last:after{\n            content:\"\";\n            }\n\n            .DLabstract {\n            padding-left:40px;\n            padding-right:20px;\n            display:block;\n            }\n\n            .DLformats li{\n            display: inline;\n            list-style-type: none;\n            padding-right: 5px;\n            }\n\n            .DLformats li:after{\n            content:\",\";\n            }\n            .DLformats 
li.formatList.Last:after{\n            content:\"\";\n            }\n\n            .DLlogo {\n            vertical-align:middle;\n            padding-right:5px;\n            border:none;\n            }\n\n            .DLcitLink {\n            \/*margin-left:20px;*\/\n            }\n\n            .DLtitleLink {\n            \/*margin-left:20px;*\/\n            }\n\n            .DLotherLink {\n            margin-left:0px;\n            }\n\n              input[id^=\"spoiler\"]{\n display: none;\n}\ninput[id^=\"spoiler\"] + label {\n  \/*display: block;*\/\n  \/*width: 200px;\n  margin: 0 auto;*\/\n  padding: 5px 20px;\n  background: #d22c58;\n  color: #fff;\n  text-align: center;\n  cursor: pointer;\n  transition: all .6s;\n  margin-left:0;\n}\ninput[id^=\"spoiler\"]:checked + label {\n  color: #333;\n  background: #ccc;\n}\ninput[id^=\"spoiler\"] ~ .spoiler {\n  width: 100%;\n  height: 0;\n  overflow: hidden;\n  opacity: 0;\n  margin: 10px auto 0; \n  padding: 10px; \n  background: #eee;\n  border: 1px solid #ccc;\n  transition: all .6s;\n}\ninput[id^=\"spoiler\"]:checked + label + .spoiler{\n  height: auto;\n  opacity: 1;\n  padding: 10px;\n}\n\n        <\/style>\n      <title>Proceedings of TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction<\/title>\n   <\/head>\n   <body>\n      <div id=\"DLtoc\">\n         <div id=\"DLheader\">\n            <h1>TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction<\/h1><a class=\"DLcitLink\" title=\"Go to the ACM Digital Library for additional information about this proceeding\" href=\"https:\/\/dl.acm.org\/doi\/proceedings\/10.1145\/3430524\"><img decoding=\"async\" class=\"DLlogo\" alt=\"Digital Library logo\" height=\"30\" src=\"https:\/\/dl.acm.org\/specs\/products\/acm\/releasedAssets\/images\/footer-logo1.png\">\n               Full Citation in the ACM Digital Library\n               <\/a><\/div>\n         <div id=\"DLcontent\">\n     
       <h2>SESSION: Full Papers<\/h2>\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440620\">UltraPower: Powering Tangible &amp; Wearable Devices with Focused Ultrasound<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Rafael Morales Gonz\u00e1lez<\/li>\n               <li class=\"nameList\">Asier Marzo<\/li>\n               <li class=\"nameList\">Euan Freeman<\/li>\n               <li class=\"nameList\">William Frier<\/li>\n               <li class=\"nameList Last\">Orestis Georgiou<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler2\" \/> \n            <label for=\"spoiler2\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Wireless power transfer creates new opportunities for interaction with tangible and\n                     wearable devices, by freeing designers from the constraints of an integrated power\n                     source. We explore the use of focused ultrasound as a means of transferring power\n                     to a distal device, transforming passive props into dynamic active objects. We analyse\n                     the ability to transfer power from an ultrasound array commonly used for mid-air haptic\n                     feedback and investigate the practical challenges of ultrasonic power transfer (e.g.,\n                     receiving and rectifying energy from sound waves). We also explore the ability to\n                     power electronic components and multimodal actuators such as lights, speakers and\n                     motors. 
Finally, we describe exemplar wearable and tangible device prototypes that\n                     are activated by UltraPower, illustrating the potential applications of this novel\n                     technology.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440621\">Connected Layers: Evaluating Visualizations of Embodiment in Contemporary Dance Performances<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Nuno N. Correia<\/li>\n               <li class=\"nameList\">Raul Masu<\/li>\n               <li class=\"nameList\">An Hoang Dieu Pham<\/li>\n               <li class=\"nameList Last\">Jochen Feitsch<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler1\" \/> \n            <label for=\"spoiler1\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> There has been a growing interest in interactive visuals in contemporary dance performances.\n                     These visuals often rely on embodied interaction techniques, such as motion capture\n                     or biosignal sensors. However, there is a lack of research into how audience members\n                     experience these interactive visuals, and how to enhance that experience. We conducted\n                     an audience study, involving four different dance performances. Each of the performances\n                     explored a different approach for interaction involving visuals. We collected data\n                     from audience members, regarding their experience of the performances, using questionnaires\n                     and interviews. 
The analysis of this data allows us to identify implications for design:\n                     balancing trade-offs within a mapping clarity spectrum; connecting layers; visuals\n                     as co-creative mediator; defined territory and individuality of visuals; exploration\n                     of perspective shift and abstract fragmentation. We argue that these considerations\n                     are relevant for designers of systems visualizing embodied interaction, not just for\n                     dance, but also for other related applications. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440622\">YOU BETTA WERK: Using Wearable Technology Performance Driven Inclusive Transdisciplinary\n                  Collaboration to Facilitate Authentic Learning<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Clint Zeagler<\/li>\n               <li class=\"nameList\">Jaye Lish<\/li>\n               <li class=\"nameList\">Edie Cheezburger<\/li>\n               <li class=\"nameList\">Max Woo<\/li>\n               <li class=\"nameList\">Kathleen L Tynan<\/li>\n               <li class=\"nameList\">Elise Morton<\/li>\n               <li class=\"nameList\">Simrun Mannan<\/li>\n               <li class=\"nameList\">Eva L Christensen<\/li>\n               <li class=\"nameList\">Jordan Eggleston<\/li>\n               <li class=\"nameList\">Paige Greenfield<\/li>\n               <li class=\"nameList\">Chloe Lynne Choi<\/li>\n               <li class=\"nameList\">Axel Gustafsson<\/li>\n               <li class=\"nameList\">Jonatan Holmgren<\/li>\n               <li class=\"nameList\">Aparna Iyer<\/li>\n               <li class=\"nameList\">Michael Chi<\/li>\n               <li class=\"nameList\">Maribeth 
Gandy<\/li>\n               <li class=\"nameList\">Laura Levy<\/li>\n               <li class=\"nameList Last\">Jay David Bolter<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler3\" \/> \n            <label for=\"spoiler3\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Working or WERKing on a wearable technology project in a transdisciplinary group\n                     can be an effective way of learning new skills and collaboration techniques. This\n                     paper describes a case study of running a wearable technology group project within\n                     an undergraduate course entitled Wearable Technology and Society. The computational\n                     media students in the class collaborated with outside performance artists (drag queens\n                     and a street dancer) to create interactive performance garments. Design methods such\n                     as the use of boundary objects aided in communication of ideas and cooperation across\n                     disciplines and cultural barriers. The requirement that the interactive garment function\n                     appropriately in a real performance lent urgency and gravity to the experience, motivating\n                     cohesive and expedited problem solving in the transdisciplinary group. 
The use of\n                     these methods on a project with real world outcomes and consequences facilitated an\n                     authentic learning experience for the students involved.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440623\">VirtualWire: Supporting Rapid Prototyping with Instant Reconfigurations of Wires in\n                  Breadboarded Circuits<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Woojin Lee<\/li>\n               <li class=\"nameList\">Ramkrishna Prasad<\/li>\n               <li class=\"nameList\">Seungwoo Je<\/li>\n               <li class=\"nameList\">Yoonji Kim<\/li>\n               <li class=\"nameList\">Ian Oakley<\/li>\n               <li class=\"nameList\">Daniel Ashbrook<\/li>\n               <li class=\"nameList Last\">Andrea Bianchi<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler4\" \/> \n            <label for=\"spoiler4\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Assembling circuits is a challenging and time consuming activity for novice makers,\n                     frequently resulting in incorrect placements of wires and components into breadboards.\n                     This results in errors that are difficult to identify and debug, and delays that hinder\n                     creating, exploring or reconfiguring circuit layouts. 
This paper presents VirtualWire,\n                     a tool that allows users to rapidly design and modify circuits in software and have\n                     these changes instantiated in real-time as electrical connections on a physical breadboard.\n                     To achieve this, VirtualWire dynamically translates circuit design files into physical\n                     connections inside a hardware switching matrix, which handles wiring across breadboard\n                     rows and to\/from an embedded Arduino. The user can interactively test, tune, and share\n                     different circuit layouts for an Arduino shield, and once satisfied, can fabricate\n                     the circuit on a permanent substrate. Quantitative and qualitative user studies demonstrate\n                     that VirtualWire significantly reduces the time taken for (by 37%), and the number\n                     of errors made during (by 53%) circuit assembly, while also supporting users in creating\n                     readable, space-efficient and flexible layouts.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440624\">The Body Beyond Movement: (Missed) Opportunities to Engage with Contemporary Dance\n                  in HCI<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Stephan J\u00fcrgens<\/li>\n               <li class=\"nameList\">Nuno N. 
Correia<\/li>\n               <li class=\"nameList Last\">Raul Masu<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler5\" \/> \n            <label for=\"spoiler5\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> This paper argues that a significant paradigm change in contemporary dance can offer\n                     further opportunities for HCI researchers interested in embodied interaction and interactive\n                     system design. Based on the analysis of 42 HCI papers in our data set, resulting from\n                     searches in two computing research libraries, we suggest seven thematic categories\n                     that reflect how HCI researchers have been engaging with contemporary dance. Moreover,\n                     we propose a standardized usage of contemporary dance terminology in HCI literature,\n                     and discuss the current state of engagement with publications from the field of performance\n                     theory. We identify three opportunities for HCI, which can arise through further engagement\n                     with the knowledge produced in contemporary dance and performance: to engage with\n                     the field of embodied interaction from the perspective of performance research and\n                     theory; to employ contemporary dance methods and practices in HCI research; and to\n                     integrate contemporary dance choreographers and performers as researchers in interdisciplinary\n                     projects. 
<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440625\">Shaping Concrete for Interaction<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Linda Hirsch<\/li>\n               <li class=\"nameList\">Beat Rossmy<\/li>\n               <li class=\"nameList Last\">Andreas Butz<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler6\" \/> \n            <label for=\"spoiler6\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Concrete is a ubiquitous material in urban environments and increasingly used by\n                     industry and the maker movement. However, there is little research about its affordance\n                     and its potential for embedded User Interfaces (UI). In our ongoing work, we investigate\n                     different manufacturing processes and design strategies to change and adapt the affordances\n                     of concrete to make it appear interactive. We tested three interface elements, a button,\n                     a scroll wheel, and a slider, with 33 participants in a lab elicitation study. Each\n                     was created in two versions following two design strategies, one with a more natural\n                     look, the other more abstract. Five participants then bodystormed ideas with the prototypes\n                     in an outdoor environment. 
Based on our explorations, we discuss design considerations\n                     for creating concrete interfaces including the potential of both design strategies\n                     and present different application scenarios.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440626\">Up Close &amp; Personal: Exploring User-preferred Image Schemas for Intuitive Privacy\n                  Awareness and Control<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Vikram Mehta<\/li>\n               <li class=\"nameList\">Arosha K. Bandara<\/li>\n               <li class=\"nameList\">Blaine A. Price<\/li>\n               <li class=\"nameList\">Bashar Nuseibeh<\/li>\n               <li class=\"nameList Last\">Daniel Gooch<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler7\" \/> \n            <label for=\"spoiler7\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Effective end-user privacy management in everyday ubiquitous computing environments\n                     requires giving users complex, contextual information about potential privacy breaches\n                     and enabling management of these breaches in a timely, engaging and intuitive manner.\n                     In this paper, we propose using empirically grounded image schema-based metaphors\n                     to help design these interactions. Results from our exploratory user study (N=22)\n                     demonstrate end users\u2019 preferences for changes in physical attributes and spatial\n                     properties of objects for privacy awareness. 
For privacy control, end users prefer\n                     to exert force and create spatial movement. The study also explores user preferences\n                     for wearable vs. ambient form-factors for managing privacy and concludes that a hybrid\n                     solution would work for more users across more contexts. We thus provide a combination\n                     of form factor preferences, and a focused set of image schemas for designers to use\n                     when designing metaphor-based tangible privacy management tools.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440627\">Scaling Data Physicalization \u2013 How Does Size Influence Experience?<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Irene L\u00f3pez Garc\u00eda<\/li>\n               <li class=\"nameList Last\">Eva Hornecker<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler8\" \/> \n            <label for=\"spoiler8\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Given the material nature of data physicalization, their creators need to make many\n                     design decisions, including material choices and scale. Our study explores the impact\n                     of scale in physicalization, motivated by the assumption that size can affect user\n                     experience. We created two different physicalizations (for the same dataset) in three\n                     sizes each, and evaluated the resulting six objects with a questionnaire approach\n                     and interviews. 
Our findings highlight that scale needs to be chosen wisely given\n                     its impact on representation legibility (ease of viewing and understanding) and affordances\n                     for interaction. We discuss factors to take into account when designing large-scale\n                     physicalizations and in further research on the potential role of scale in physicalization.\n                     In particular, we argue that for large-scale physicalizations, scale should matter\n                     and communicate meaning, for instance, supporting an intuitive understanding of magnitudes,\n                     or a specific experience. Thus, scale needs to be an explicit design decision that\n                     interacts with other design parameters.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440628\">StringTouch &#8211; From String Instruments towards new Interface Morphologies<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Beat Rossmy<\/li>\n               <li class=\"nameList\">Sonja R\u00fcmelin<\/li>\n               <li class=\"nameList Last\">Alexander Wiethoff<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler9\" \/> \n            <label for=\"spoiler9\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> We present StringTouch, a user interface design exploration translating the expressive\n                     resource of string instruments to a new interface morphology. 
StringTouch transfers\n                     the string as a tactile element of interaction to the touch surface, resulting in\n                     a tactilely experienceable interface. In this paper we discuss our research through\n                     design centered approach, which focused on the exploration of musical string instruments\n                     and their translation to the UI context. To investigate this specific design space,\n                     we analyzed the systematic and handling of string instruments as well as common HCI\n                     principles to develop the interaction concept. The resulting experience prototype\n                     demonstrates the idea\u2019s potential for haptic UI design and provides insights into\n                     the prototyping process. We present: (1)&nbsp;the investigation of string instruments as\n                     a resource for TUI design and (2)&nbsp;the transfer to a generic UI context to inform new\n                     hybrid interface morphologies that combine features of tangible, touch, and flexible\n                     interaction.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440629\">Knit Stretch Sensor Placement for Body Movement Sensing<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">An Liang<\/li>\n               <li class=\"nameList\">Rebecca Stewart<\/li>\n               <li class=\"nameList\">Rachel Freire<\/li>\n               <li class=\"nameList Last\">Nick Bryan-Kinns<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler10\" \/> \n            <label for=\"spoiler10\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n  
                \t\t\n                  <p> Motion capture technology is widely used in movement-related Human-Computer Interaction,\n                     especially in digital arts such as digital dance performance. This paper presents\n                     a knit stretch sensor-based dance leotard design to evaluate the locations where the\n                     sensors best capture the movement on the body. Two studies are undertaken: (1) interviews\n                     to determine user requirements of a dance movement sensing system; (2) evaluation\n                     of sensor placement on the body. Ten interviewees including dancers, choreographers,\n                     and technologists describe their requirements and expectations for a body movement\n                     sensing system. The centre of the body (the torso) is determined to be the area of\n                     primary interest for dancers and choreographers to sense movement, and technologists\n                     find the robustness of textile sensors the most challenging for textile sensing system\n                     design. A dance leotard toile is then designed with sensor groupings on the torso\n                     along the direction of major muscles, based on the interviewees\u2019 preferred movements\n                     to be captured. Each group of sensors is evaluated by comparing its signal\n                     output with that of a Vicon motion capture system. The evaluation shows that sensors which are constantly\n                     under tension perform better. For example, sensors on the upper back have a higher\n                     success rate than the sensors on the lower back. 
The dance leotard design was found\n                     to capture the movements of standing lean back and standing waist twists the best.\n                     <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440630\">Soft Speakers: Digital Embroidering of DIY Customizable Fabric Actuators<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Sara Nabil<\/li>\n               <li class=\"nameList\">Lee Jones<\/li>\n               <li class=\"nameList Last\">Audrey Girouard<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler11\" \/> \n            <label for=\"spoiler11\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> We introduce Soft Speakers, a systematic approach for designing custom fabric actuators\n                     that can be used as audio speakers and vibro-haptic actuators. Digitally-embroidered\n                     with e-textiles, we implement Soft Speakers as tactile, malleable and aesthetic designs\n                     to be part of wearables, soft furnishing and fabric objects. We present a rapid technique\n                     for the DIY fabrication of audio feedback into soft interfaces. We also discuss and\n                     evaluate 7 factors for their parametric design in additive and constructive methods.\n                     To demonstrate the feasibility of our approach and the breadth of new designs that\n                     it enables, we developed 5 prototypes: 3 wearables, a piece of furniture and a soft\n                     toy. 
Studying Soft Speakers with maker-users expanded the design space, empowering\n                     users and supporting inclusive design. Our study includes insights on user experience\n                     of real-world interactive applications for remote communication, e-learning, entertainment,\n                     navigation and gaming, enabled by Soft Speakers\u2019 customizable and scalable form factor.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440631\">Exploring Tangible Algorithmic Imaginaries in Movie Recommendations<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Oscar Alvarado<\/li>\n               <li class=\"nameList\">Vero Vanden Abeele<\/li>\n               <li class=\"nameList\">David Geerts<\/li>\n               <li class=\"nameList\">Francisco Guti\u00e9rrez<\/li>\n               <li class=\"nameList Last\">Katrien Verbert<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler12\" \/> \n            <label for=\"spoiler12\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Recommender algorithms play an active role in many everyday activities. However,\n                     personalized recommendations often produce negative experiences due to a lack of awareness,\n                     control, or transparency. Allowing users to materialize their algorithmic imaginaries\n                     exposes how they experience, perceive, and imagine recommender algorithms. Moreover,\n                     it can unearth novel and previously unattended design opportunities for tangible interactions\n                     with algorithms. 
Therefore, we explored how 15 users of a famous movie recommender\n                     system materialized tangible designs to reflect and discuss their algorithmic imaginaries\n                     during co-design workshops and interviews. Using thematic analysis, we identified\n                     two forms of algorithmic imaginaries that can inspire tangible interactions with recommender\n                     algorithms: metaphoric and datafied representations. Complementary themes exposed\n                     the influence of contextual factors and diverse negative attitudes towards personalized\n                     movie recommendations. Based on these findings, we suggest design opportunities and\n                     suggestions for improving the algorithmic experience of movie recommendations and\n                     similar systems through tangible user interfaces.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440632\">Electrolysis Bubble Display based Art Installations<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Ayaka Ishii<\/li>\n               <li class=\"nameList\">Manaka Fukushima<\/li>\n               <li class=\"nameList\">Namiki Tanaka<\/li>\n               <li class=\"nameList\">Yasushi Matoba<\/li>\n               <li class=\"nameList\">Kaori Ikematsu<\/li>\n               <li class=\"nameList Last\">Itiro Siio<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler13\" \/> \n            <label for=\"spoiler13\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Research was conducted on a digital information display using electrolysis 
bubbles.\n                     Although the research mainly focused on information displays in daily life, the ephemerality\n                     of bubbles also makes them promising for dynamic art installations. In this paper,\n                     we present two novel artworks using this electrolysis bubble display mechanism. First,\n                     we present \u201cUTAKATA,\u201d a ticker-like bubble display, using a running-water channel.\n                     Seven electrodes are placed linearly on the channel bed, and they generate text messages\n                     using bubble dots that drift toward the lower end of the channel. Second, we present\n                     the \u201cBubble Mirror,\u201d which is a water pan with a camera that captures a visitor\u2019s\n                     face and displays it using electrolysis bubbles as pixels. Facial images with six\n                     levels of grayscale are displayed on the water surface using 32 \u00d7 32 electrodes.\n                     We evaluated the output properties of these configurations and discuss the results\n                     obtained.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440633\">Shape Changing Fabric Samples for Interactive Fashion Design<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Daniela Ghanbari Vahid<\/li>\n               <li class=\"nameList\">Lee Jones<\/li>\n               <li class=\"nameList\">Audrey Girouard<\/li>\n               <li class=\"nameList Last\">Lois Frankel<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler14\" \/> \n            <label for=\"spoiler14\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div 
style=\"display:inline\">\n                  \t\t\n                  <p> As technology is integrated into all aspects of our lives, researchers are exploring\n                     emerging technologies in the field of fashion design. The substantial growth in the\n                     field of functional apparel design, such as the use of smart textiles, encourages\n                     researchers and fashion designers to incorporate technology within their designs.\n                     In much previous work, e-textiles have required interdisciplinary knowledge in areas such as\n                     electrical engineering and computer science to be successful. To help with this, we\n                     created ready-to-use shape changing fabric samples for fashion designers. We explored\n                     this research gap through a preliminary case study with seven experienced designers.\n                     Our results suggest design approaches for shape changing fabric samples that would\n                     assist non-technically skilled designers in incorporating technology in their designs.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440634\">OmniSoft: A Design Tool for Soft Objects by Example<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Jeeeun Kim<\/li>\n               <li class=\"nameList\">Qingnan Zhou<\/li>\n               <li class=\"nameList\">Amanda Ghassaei<\/li>\n               <li class=\"nameList Last\">Xiang &#8216;Anthony&#8217; Chen<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler15\" \/> \n            <label for=\"spoiler15\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n        
          \t\t\n                  <p>Softness is one of the most important factors in human tactile perception. With recent\n                     advances in 3D printing, there has been significant progress in fabricating compliant\n                     objects. However, existing methods typically remain inaccessible to end-users, mainly\n                     due to the separation between designing shapes and setting printing parameters to\n                     achieve desired softness, resulting in the exclusion of its customization in early\n                     design processes. In this work, we contribute an end-to-end design tool that takes\n                     a design-by-example approach: given a 3D model, a user can specify the region of interest\n                     and a level of softness, by shopping for everyday objects as a reference. The tool then\n                     generates both geometry and 3D printing parameters to reproduce the desired softness,\n                     which can be fabricated using low-cost FDM 3D printers and materials. We also\n                     provide a data-driven pipeline to enable other compliance modeling methods to be generalized\n                     within our design tool. In two user studies, we demonstrated that users could easily\n                     match existing reference objects\u2019 softness to a 3D printed object. In a design session,\n                     end-users successfully used OmniSoft to design augmented functions. 
<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440635\">Playing by Ear: Designing for the Physical in a Sound-Based Virtual Reality Narrative<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Daniel Harley<\/li>\n               <li class=\"nameList\">Aneesh P. Tarun<\/li>\n               <li class=\"nameList\">Bonnie J. Stinson<\/li>\n               <li class=\"nameList\">Tudor Tibu<\/li>\n               <li class=\"nameList Last\">Ali Mazalek<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler16\" \/> \n            <label for=\"spoiler16\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>We present two proof-of-concept narrative VR experiences with a focus on sound-based\n                     physical interactions. Responding to a call to expand upon current design conceptualizations,\n                     we draw on tangible sound-based design in order to develop considerations for the\n                     body and physical environments within VR narratives. 
We propose that a focus on the\n                     actions the player is asked to perform (e.g., touch, stand, kneel, grasp, walk, listen,\n                     reach, dance) can contribute to an understanding of VR as a sensory, embodied medium\n                     that offers ways to playfully engage with physical reality rather than simulate it\n                     entirely.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440636\">Can Physical Tools that Adapt their Shape based on a Learner\u2019s Performance Help in\n                  Motor Skill Training?<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Dishita G Turakhia<\/li>\n               <li class=\"nameList\">Yini Qi<\/li>\n               <li class=\"nameList\">Lotta-Gili Blumberg<\/li>\n               <li class=\"nameList\">Andrew Wong<\/li>\n               <li class=\"nameList Last\">Stefanie Mueller<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler17\" \/> \n            <label for=\"spoiler17\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Adaptive tools that can change their shape to support users with motor tasks have\n                     been used in a variety of applications, such as to improve ergonomics and support\n                     muscle memory. In this paper, we investigate whether shape-adapting tools can also\n                     help in motor skill training. 
In contrast to static training tools that maintain task\n                     difficulty at a fixed level during training, shape-adapting tools can vary task difficulty\n                     and thus keep learners\u2019 training at the optimal challenge point, where the task is\n                     neither too easy nor too difficult. <\/p> \n                  <p>To investigate whether shape adaptation helps in motor skill training, we built a\n                     study prototype in the form of an adaptive basketball stand that works in three conditions:\n                     (1) static, (2) manually adaptive, and (3) auto-adaptive. For the auto-adaptive condition,\n                     the tool adjusts automatically to keep learners at the optimal challenge point.\n                     Results from our two user studies show that training\n                     in the auto-adaptive condition leads to statistically significant learning gains when\n                     compared to the static (F(1, 11) = 1.856, p &lt; 0.05) and manually adaptive conditions\n                     (F(1, 11) = 2.386, p &lt; 0.05). 
<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440637\">Understanding the First Person Experience of Walking Mindfulness Meditation Facilitated\n                  by EEG Modulated Interactive Soundscape<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Karen Cochrane<\/li>\n               <li class=\"nameList\">Lian Loke<\/li>\n               <li class=\"nameList\">Matthew Leete<\/li>\n               <li class=\"nameList\">Andrew Campbell<\/li>\n               <li class=\"nameList Last\">Naseem Ahmadpour<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler18\" \/> \n            <label for=\"spoiler18\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Walking meditation is a form of mindfulness training, where the act of walking provides\n                     a rhythmic meter for attentional focus. Whilst digital technologies to support sitting\n                     meditation and walking practices exist, less explored is the first person in-the-moment\n                     experience of technology-mediated walking meditation. We present a study of group\n                     walking meditation, with and without an interactive rhythmic soundscape modulated\n                     by one practitioner\u2019s brainwave data. Six workshops were conducted with novice and\n                     advanced practitioners, involving a guided walking meditation with body scan, writing\n                     and drawing exercises and a group interview. 
The analysis yielded themes of shifting\n                     state, attention, self-regulation strategy, and immersion and reflection, and insights\n                     into how practitioners use sound to synchronize both walking and breathing. We contribute\n                     a method for eliciting, and a novel description of, the first person experience of\n                     walking meditation, as resources for the design of interactive technologies to support\n                     mindfulness practices of walking meditation.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440638\">The Machine Learning Machine: A Tangible User Interface for Teaching Machine Learning<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Magnus H\u00f8holt Kaspersen<\/li>\n               <li class=\"nameList\">Karl-Emil Kj\u00e6r Bilstrup<\/li>\n               <li class=\"nameList Last\">Marianne Graves Petersen<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler19\" \/> \n            <label for=\"spoiler19\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Machine Learning (ML) is often used invisibly in everyday applications with little\n                     opportunity for consumers to investigate how it works. In this paper, we expand recent\n                     efforts to unfold what students should know about ML and how to design tools and activities\n                     allowing them to engage with ML. 
To do so, we explore how to make processes and aspects\n                     of ML tangible through the design of the Machine Learning Machine (MLM): a tangible\n                     user interface that enables students to create their own datasets using pen and\n                     paper and to iteratively build and test ML models using this data. Based on insights\n                     from the design process and a preliminary pilot study with the MLM, we discuss how\n                     a tangible approach to engaging with ML can spur curiosity in students and how the\n                     iterative process of improving ML models can encourage students to reflect on the\n                     relation between data, model, and predictions. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440639\">Scaffolding shared imagination with tangible design<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Maarten L. Smith<\/li>\n               <li class=\"nameList\">Sander van der Zwan<\/li>\n               <li class=\"nameList\">Jelle P. Bruineberg<\/li>\n               <li class=\"nameList\">Pierre D. L\u00e9vy<\/li>\n               <li class=\"nameList Last\">Caroline C. M. 
Hummels<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler20\" \/> \n            <label for=\"spoiler20\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>We follow up on a prominent line of work in which principles of embodied cognition\n                     are employed to account not only for skilled coping but also for more intellectual\n                     activities such as remembering and imagination. Imagination, then, is not a reflective\n                     activity an individual does by herself, but a shared and embodied activity scaffolded\n                     by tangible design. We present a case study in which we designed a toolkit to facilitate\n                     imagining the Netherlands in 2050. We wrote speculative stories of people living in\n                     2050 and designed an assortment of objects. We held several workshops to use the toolkit\n                     for shared imagination for our client, Rijkswaterstaat. 
We analyze how, in the context\n                     of the workshops, the stories and objects provided affordances for shared imagination.\n                     We thereby hope to have demonstrated that it is possible to design for more intellectual\n                     activities in a tangible and embodied way.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440640\">Punch-Sketching E-textiles: Exploring Punch Needle as a Technique for Sustainable, Accessible, and Iterative Physical\n                  Prototyping with E-textiles<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Lee Jones<\/li>\n               <li class=\"nameList\">Miriam Sturdee<\/li>\n               <li class=\"nameList\">Sara Nabil<\/li>\n               <li class=\"nameList Last\">Audrey Girouard<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler21\" \/> \n            <label for=\"spoiler21\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Tangible toolkits enable individuals to explore concepts through combining components\n                     together and taking them apart. The strength and limitation of many e-textile toolkits\n                     is that threads hold them in place, and once put together they need destructive methods\n                     to take them apart. In this paper, we propose Punch-Sketching e-textiles, a drawing\n                     technique that uses a punch needle to iteratively prototype soft circuits. 
The benefits\n                     of this approach are sustainability and reusability: users can easily pull out\n                     circuits without damaging the materials or creating waste, while also testing out\n                     concepts using the actual threads that will be used in the final prototype. To validate\n                     our technique, we ran three studies comparing sewing and punching e-textiles through:\n                     1) Understanding the process with two fiber artists; 2) Exploring the potential with\n                     four beginner users; and 3) Utilizing our methods further with 10 occupational therapists.\n                     Insights from these three studies include when and how to use each method, toolkit\n                     recommendations, considerations for iterative physical prototyping, sustainability,\n                     and accessibility.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440641\">Surface Electromyography for Sensing Performance Intention and Musical Imagery in\n                  Vocalists<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Courtney N. Reed<\/li>\n               <li class=\"nameList Last\">Andrew P. 
McPherson<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler22\" \/> \n            <label for=\"spoiler22\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Through experience, the techniques used by professional vocalists become highly ingrained\n                     and much of the fine muscular control needed for healthy singing is executed using\n                     well-refined mental imagery. In this paper, we provide a method for observing intention\n                     and embodied practice using surface electromyography (sEMG) to detect muscular activation,\n                     in particular with the laryngeal muscles. Through sensing the electrical neural impulses\n                     causing muscular contraction, sEMG provides a unique measurement of user intention,\n                     where other sensors reflect the results of movement. In this way, we are able to measure\n                     movement in preparation, vocalised singing, and in the use of imagery during mental\n                     rehearsal where no sound is produced. 
We present a circuit developed for use with\n                     the low voltage activations of the laryngeal muscles; in sonification of these activations,\n                     we further provide feedback for vocalists to investigate and experiment with their\n                     own intuitive movements and intentions for creative vocal practice.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440642\">The ThreadBoard: Designing an E-Textile Rapid Prototyping Board<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Chris Hill<\/li>\n               <li class=\"nameList\">Michael Schneider<\/li>\n               <li class=\"nameList\">Ann Eisenberg<\/li>\n               <li class=\"nameList Last\">Mark D. Gross<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler23\" \/> \n            <label for=\"spoiler23\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> E-textiles, which embed circuitry into textile fabrics, blend art and creative expression\n                     with engineering, making it a popular choice for STEAM classrooms [6, 12]. Currently,\n                     e-textile development relies on tools intended for traditional embedded systems, which\n                     utilize printed circuit boards and insulated wires. These tools do not translate well\n                     to e-textiles, which utilize fabric and uninsulated conductive thread. 
This mismatch\n                     of tools and materials can lead to an overly complicated development process for novices.\n                     In particular, rapid prototyping tools for traditional embedded systems are poorly\n                     matched for e-textile prototyping. This paper presents the ThreadBoard, a tool that\n                     supports rapid prototyping of e-textile circuits. With rapid prototyping, students\n                     can test circuit designs and identify circuitry errors prior to their sewn project.\n                     We present the design process used to iteratively create the ThreadBoard\u2019s layout,\n                     with the goal of improving its usability for e-textile creators.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440643\">PPCards: Toward Enhancing Electronic Prototyping with Editions of a Card-based Platform<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Alexandre Gomes de Siqueira<\/li>\n               <li class=\"nameList\">Ayush Bhargava<\/li>\n               <li class=\"nameList\">Rohith Venkatakrishnan<\/li>\n               <li class=\"nameList Last\">Roshan Venkatakrishnan<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler24\" \/> \n            <label for=\"spoiler24\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Prototyping electronic circuits is often facilitated by web-based tutorials and breadboards.\n                     Several virtual and hybrid platforms do exist, each carrying their own limitations.\n                     Some of these platforms fall prey to split-attention 
effects, wherein users are required\n                     to split their attention to integrate multiple sources of spatially separated information.\n                     This hinders the learning and prototyping processes. Other platforms provide a single\n                     source of information, but lack tangible interaction with electronic components or\n                     suffer from the absence of active feedback, which can also hinder these processes.\n                     There is hence a need for prototyping platforms that mitigate split-attention effects,\n                     while continuing to provide other desirable aspects such as tangible interaction.\n                     To address this, we present three editions of PPCards, a card-based platform for prototyping\n                     electronic circuits, towards overcoming limitations of existing paradigms. Through\n                     a comparative study, it was determined that the first edition of PPCards outperformed the\n                     conventional breadboard web-based tutorial paradigm in aspects of split attention,\n                     usability, and user experience. The second and third editions build upon successful\n                     characteristics of the first, additionally providing support for multimedia content\n                     and real-time feedback during the prototyping process. Based on quantitative data\n                     and qualitative feedback, we go on to discuss design considerations for future tangible\n                     card-based tools. 
<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440644\">Evoking Empathy: A Framework for Describing Empathy Tools<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Sydney Pratte<\/li>\n               <li class=\"nameList\">Anthony Tang<\/li>\n               <li class=\"nameList Last\">Lora Oehlberg<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler25\" \/> \n            <label for=\"spoiler25\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Empathy tools are experiences designed to evoke empathetic responses by placing the\n                     user in another\u2019s lived and felt experience. The problem is that designers do not\n                     have a common vocabulary to describe empathy tool experiences; consequently, it is\n                     difficult to compare\/contrast empathy tool designs or to think about their efficacy.\n                     To address this problem, we analyzed 26 publications on empathy tools to develop a\n                     descriptive framework for designers of empathy tools. Based on our analysis, we found\n                     that empathy tools can be described along three dimensions: (i) the amount of agency\n                     the tool allows, (ii) the user\u2019s perspective while using the tool, and (iii) the type\n                     of sensations that are experienced. 
We show that this framework can be used to describe\n                     a wide variety of empathy tools and provide recommendations for empathy tool designers,\n                     as well as techniques for measuring the efficacy of an empathy tool experience. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440645\">Seedmarkers: Embeddable Markers for Physical Objects<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Christopher Getschmann<\/li>\n               <li class=\"nameList Last\">Florian Echtler<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler26\" \/> \n            <label for=\"spoiler26\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>We present Seedmarkers, shape-independent topological markers that can be embedded\n                     in physical objects manufactured with common rapid-prototyping techniques. Many markers\n                     are optimized for technical performance while visual appearance or the feasibility\n                     of permanently merging marker and physical object is not considered. We give an overview\n                     of the aesthetic properties of a wide range of existing markers and conducted a short\n                     online survey to assess the perception of popular marker designs. Based on our findings\n                     we introduce our generation algorithm making use of weighted Voronoi diagrams for\n                     topological optimization. 
With our generator, Seedmarkers can be created from technical\n                     drawings during the design process to fill arbitrary shapes on any surface. Given\n                     dimensions and manufacturing constraints, different configurations for 3 or 6 degrees\n                     of freedom tracking are possible. We propose a set of application examples for shape-independent\n                     markers, including 3D printed tangibles, laser cut plates and functional markers on\n                     printed circuit boards. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440646\">\u201cCan you help me move this over there?\u201d: training children with ASD to joint action\n                  through tangible interaction and virtual agent<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Tom Giraud<\/li>\n               <li class=\"nameList\">Brian Ravenet<\/li>\n               <li class=\"nameList\">Chi Tai Dang<\/li>\n               <li class=\"nameList\">Jacqueline Nadel<\/li>\n               <li class=\"nameList\">Elise Prigent<\/li>\n               <li class=\"nameList\">Gael Poli<\/li>\n               <li class=\"nameList\">Elisabeth Andre<\/li>\n               <li class=\"nameList Last\">Jean-claude Martin<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler27\" \/> \n            <label for=\"spoiler27\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>New technologies for autism focus on the training of either social skills or motor\n                     skills, but not both. 
Such a dichotomy omits a wide range of joint action tasks that\n                     require the coordination of two persons (e.g. moving a heavy piece of furniture). The training\n                     of these physical tasks performed in dyads has great potential to foster inclusiveness\n                     while having an impact on both social and motor skills. In this paper, we present\n                     the design of a tangible and virtual interactive system for the training of children\n                     with Autism Spectrum Disorder (ASD) in performing joint actions. The proposed system\n                     is composed of a virtual character projected onto a surface on which a tangible object\n                     is magnetized: both the user and the virtual character hold the object, thus simulating\n                     a joint action. We report and discuss preliminary results of a field training study,\n                     which shows the potential of the interactive system.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440647\">Using Robotics and A.I. 
to Physically Explore a Space of Aesthetic Possibilities: Defining a Physical Aesthetic Experience by the Targeted EEG Feedback of the Perceiver<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Emanuel Gollob<\/li>\n               <li class=\"nameList\">Magdalena Mayer<\/li>\n               <li class=\"nameList Last\">Johannes Braumann<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler28\" \/> \n            <label for=\"spoiler28\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Aesthetic perception and cognition processes are highly individual dynamic processes\n                     due to their dependency on the emotional affective state, perceptual analysis, memory,\n                     context, and cognitive mastering. Therefore, an aesthetic experience will always be\n                     perceived differently by every person and at every point in time. With our approach, we research the\n                     potential to define an adaptive physical aesthetic experience by the targeted EEG\n                     feedback of the perceiver. 
<\/p> \n                  <p>In a series of three distinct projects, we use generative robotic control (KUKA|prc),\n                     a Deep Convolutional Generative Adversarial Network, and electroencephalography (EEG)\n                     to create an aesthetic adaptation strategy within a physical parametric output space.\n                     <\/p> \n                  <p>Though certain physical limitations apply, the resulting artefact interaction offers\n                     the potential to make the aesthetic definition to a certain extent relational to individual\n                     perception and cognition processes and, therefore, to some extent adaptive to emotional,\n                     contextual, and cultural change over time.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440648\">Art Digital Jewellery: Practitioners\u2019 Perspectives<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Nantia Koulidou<\/li>\n               <li class=\"nameList Last\">Robb Mitchell<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler29\" \/> \n            <label for=\"spoiler29\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>We introduce the term &#8216;Art Digital Jewellery&#8217; as a label for craft-oriented, bespoke\n                     approaches to embedding electronics in jewellery. These unconventional digital-physical\n                     jewellery practices struggle for attention compared with higher profile, often more\n                     mass-production oriented wearables. 
This is partly because discourses articulating\n                     and critiquing these experimental practices are scarce and obscure to HCI researchers.\n                     To address this, we describe how these artistic practices arose from earlier fashion\n                     movements and we engaged six leading creative practitioners in a structured and iterative\n                     dialogue. Analysis of our adapted Delphi survey suggests that core to Art Digital\n                     Jewellery are very individualised design processes and creating artefacts which are\n                     highly personal in terms of their form, their materials, their narratives and their\n                     interactivity. An appreciation of these unique practices may enrich perspectives on\n                     designing wearables, marrying craft with technology, and personalisation of experiences.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440649\">Tactile Human-Quadrotor Interaction: MetroDrone<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Marc Lieser<\/li>\n               <li class=\"nameList\">Ulrich Schwanecke<\/li>\n               <li class=\"nameList Last\">J\u00f6rg Berdux<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler30\" \/> \n            <label for=\"spoiler30\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Aerial robots such as quadrotors enjoy ever-increasing popularity and emerge in everyday\n                     applications that require user interaction. 
At immediate proximity, physical control\n                     of the quadrotor by touch may be desired or even necessary. In this paper we present\n                     a tactile 3D touch interaction scenario with a quadrotor by introducing virtual buttons\n                     whose operation is detected in the accelerometer data of the built-in Inertial Measurement\n                     Unit (IMU) of the quadrotor. By dispensing with additional sensors, we are able to\n                     keep the size of the used quadrotor to a minimum and thus address the problem of users\n                     being discouraged from interaction with quadrotors at immediate proximity. As an example\n                     of the proposed interaction scenario, we introduce MetroDrone, a quadrotor responding\n                     to repeated user taps on virtually defined buttons by flying trajectories according\n                     to the beat and operated button. This introduces a minimalist interaction technique\n                     that requires no intermediary devices and strengthens human-robot connections through\n                     shared musical experience.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440650\">Flexi Card Game: A Design Toolkit for Unconventional Communication Systems for Long-Distance\n                  Relationships<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Hong Li<\/li>\n               <li class=\"nameList\">Awais Hameed Khan<\/li>\n               <li class=\"nameList\">Kuisma Martin Hurtig<\/li>\n               <li class=\"nameList\">Pradthana Jarusriboonchai<\/li>\n               <li class=\"nameList Last\">Jonna H\u00e4kkil\u00e4<\/li>\n            <\/ul>\n            
<input type=\"checkbox\"  id=\"spoiler31\" \/> \n            <label for=\"spoiler31\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Mainstream communication technology today enables us to connect with loved ones across\n                     long distances. However, even with the proliferation of easy-to-use technology, there\n                     are reports on increasing loneliness and isolation amongst people. The technologies\n                     that enable these interactions are not necessarily designed to support meaningful\n                     emotional communication for users in long-distance relationships (LDRs). Furthermore,\n                     there is a lack of participatory tools that are designed with the focus of supporting\n                     end-user involvement in the design of LDR communication systems. We developed the\n                     Flexi Card Game (FCG), a card-based generative design toolkit to support designers\n                     and non-designers within participatory structures that can help develop unconventional\n                     communication systems to support LDRs. The FCG was developed using an iterative design\n                     process, involving end-users, designers, and researchers across five workshops with\n                     56 participants. The paper makes three main contributions: (1) presents FCG\u2014the novel\n                     card-based design toolkit itself, (2) describes the process of developing an LDR framework\n                     into a participatory toolkit, and (3) offers lessons and insights that can help researchers\n                     who are developing participatory tools in similar contexts. 
<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440651\">The Bodies of TEI \u2013 Investigating Norms and Assumptions in the Design of Embodied\n                  Interaction<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Katta Spiel<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler32\" \/> \n            <label for=\"spoiler32\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> In the few decades since the first mainframe computers, computing technologies have\n                     grown smaller and more pervasive, moving onto and even inside human bodies. Even\n                     as those bodies have received increased attention from scholars, designers, and technologists,\n                     the bodily expectations and understandings articulated by these technological artefacts\n                     have not been a focus of inquiry in the field. I conducted a feminist content analysis\n                     on select papers in the proceedings of the ACM International Conference on Tangible,\n                     Embedded and Embodied Interaction (TEI) since its inception in 2007. 
My analysis illustrates\n                     how artefacts are implicitly oriented towards unmarked bodily norms, while technologies\n                     designed for non-normative bodies treat those as deviant and in need of correction.\n                     Subsequently, I derive a range of provocations focused on material bodies in embodied\n                     interaction which offer a point of reflection and identify potentials for future work\n                     in the field.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440652\">Skill-Sleeves: Designing Electrode Garments for Wearability<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Jarrod Knibbe<\/li>\n               <li class=\"nameList\">Rachel Freire<\/li>\n               <li class=\"nameList\">Marion Koelle<\/li>\n               <li class=\"nameList Last\">Paul Strohmeier<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler33\" \/> \n            <label for=\"spoiler33\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Many existing explorations of wearables for HCI consider functionality first and\n                     wearability second. Typically, as the technologies, designs, and experiential understandings\n                     develop, attention can shift towards questions of deployment and wearability. 
To support\n                     this shift of focus we present a case study of the iterative design of electrode sleeves.\n                     We consider the design motivations and background that led to the existing, prototype\n                     EMS sleeves, and the resultant challenges around their wearability. Through our own\n                     design research practice, we seek to reveal design criteria towards the wearability\n                     of such a sleeve, and provide designs that optimise for those criteria. We contribute\n                     (1) new electrode sleeve designs, which begin to make it practicable to take EMS beyond\n                     the lab, (2) new fabrication processes that support rapid production and personalisation,\n                     and (3) reflections on criteria for wearability across new eTextile garments. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440653\">Boiling Mind: Amplifying the Audience-Performer Connection through Sonification and\n                  Visualization of Heart and Electrodermal Activities<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Moe Sugawa<\/li>\n               <li class=\"nameList\">Taichi Furukawa<\/li>\n               <li class=\"nameList\">George Chernyshov<\/li>\n               <li class=\"nameList\">Danny Hynds<\/li>\n               <li class=\"nameList\">Jiawen Han<\/li>\n               <li class=\"nameList\">Marcelo Padovani<\/li>\n               <li class=\"nameList\">Dingding Zheng<\/li>\n               <li class=\"nameList\">Karola Marky<\/li>\n               <li class=\"nameList\">Kai Kunze<\/li>\n               <li class=\"nameList Last\">Kouta Minamizawa<\/li>\n            <\/ul>\n            <input 
type=\"checkbox\"  id=\"spoiler34\" \/> \n            <label for=\"spoiler34\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> In stage performances, an invisible wall in front of the stage often weakens the\n                     connections between the audience and performers. To amplify this performative connection,\n                     we present the concept \u201cBoiling Mind\u201d. Our design concept is based on streaming sensor\n                     data related to heart and electrodermal activities from audience members and integrating\n                     this data into staging elements, such as visual projections, music, and lighting.\n                     Thus, the internal states of the audience directly influence the staging. Artists\n                     can have a more direct perception of the inner reactions of audience members and can\n                     create physical expressions in response to them. In this paper, we present the wearable\n                     sensing system as well as design considerations of mapping heart and electrodermal\n                     activity to changes in the staging elements. We evaluated our design and setup over\n                     three live performances. 
<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440654\">Flowcuits: Crafting Tangible and Interactive Electrical Components with Liquid Metal\n                  Circuits<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Yutaka Tokuda<\/li>\n               <li class=\"nameList\">Deepak Ranjan Sahoo<\/li>\n               <li class=\"nameList\">Matt Jones<\/li>\n               <li class=\"nameList\">Sriram Subramanian<\/li>\n               <li class=\"nameList Last\">Anusha Withana<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler35\" \/> \n            <label for=\"spoiler35\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> We present Flowcuits, a DIY fabrication method to prototype tangible, interactive\n                     and functional electrical components by manipulating liquid metals. The prototypes\n                     afford both physical and visual interactions to demonstrate the inner working mechanics\n                     of fundamental electronic elements, which enables tangible and playful learning. The\n                     fabrication process follows simple imprinting and sealing of fluidic circuits with\n                     a 3D-printed stamp on accessible moldable substrates such as \u2018Blu Tack\u2019. Utilizing\n                     conductive gallium indium liquid metal, we demonstrated interactive and re-configurable\n                     electronic components such as switches, variable resistors, variable capacitors, logic\n                     gates and pressure sensors. 
In this paper, we present the design analogy of Flowcuits,\n                     its DIY fabrication approach including a parametric 3D stamp design toolkit, and results\n                     from a technical evaluation. The stamps are printed with a low-cost 3D printer and\n                     all the materials are inexpensive and reusable, enabling Flowcuits to be easily used\n                     without any advanced lab facilities. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440655\">Expanded Boundaries: Art and Non-human Essence as Playful Inspirations for Children&#8217;s Prosthetics<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Juliana Maria Moreira Soares<\/li>\n               <li class=\"nameList\">Jo\u00e3o Pedro Farinha Nunes da Costa<\/li>\n               <li class=\"nameList\">Pedro \u00c2ngelo<\/li>\n               <li class=\"nameList Last\">M\u00f3nica Mendes<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler36\" \/> \n            <label for=\"spoiler36\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>There is an important gap in Assistive Technology related to the multiple capabilities\n                     of exploratory creativity, as with multi-aesthetic and inter-mediatic surfaces, especially\n                     in the design and development of child prosthetics. This in-progress research explores\n                     creative interactions and combinations of art and digital fabrication to design playful\n                     prosthetics for children&#8217;s upper limbs. 
In order to do so, we are using feline animal\n                     life as inspiration and as a ludic premise for engagement through playfulness. With\n                     this artistic exploration in the scope of tangible and embodied interfaces, we seek\n                     to contribute to improving and incorporating engagement and fantasy through playful interfaces,\n                     ultimately transforming a perceived difference into empowerment of a child&#8217;s self-esteem,\n                     awareness and imagination.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440656\">Dothraki: Tracking Tangibles Atop Tabletops Through De-Bruijn Tori<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Dennis Sch\u00fcsselbauer<\/li>\n               <li class=\"nameList\">Andreas Schmid<\/li>\n               <li class=\"nameList Last\">Raphael Wimmer<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler37\" \/> \n            <label for=\"spoiler37\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Tangibles are small, graspable objects that act as input devices or physical representations\n                     of digital data. Oftentimes, it is desirable to track the position of tangibles on\n                     a surface and their relation to each other. However, outside-in tracking techniques\n                     &#8211; such as capacitive touchscreens or cameras &#8211; require setting up elaborate infrastructure\n                     and are prone to occlusion or interference. 
We propose Dothraki, an inside-out tracking\n                     technique for tangibles on flat surfaces. An optical mouse sensor embedded in the\n                     tangible captures a small (36\u00d736 pixel \/ 1\u00d71 mm), unique section of a black-and-white\n                     De-Bruijn dot pattern printed on the surface. Our system efficiently searches the\n                     pattern space in order to determine the precise location of the tangible with sub-millimeter\n                     accuracy. Our proof-of-concept implementation offers a recognition rate of up to 95%,\n                     robust error detection, an update rate of 14 Hz, and a low-latency relative tracking\n                     mode.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440657\">Arpilleras Parlantes: Designing Educational Material for the Creation of Interactive\n                  Textile Art Based on a Traditional Chilean Craft.<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Sofia Guridi<\/li>\n               <li class=\"nameList\">Tamara Vicencio<\/li>\n               <li class=\"nameList Last\">Rodrigo Gajardo<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler38\" \/> \n            <label for=\"spoiler38\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Electronic Textiles (eTextiles) and open source material have been entangled with\n                     \u201cDIY\u201d movements and hands-on work to create new interactive interfaces. 
Studies have\n                     shown the potential these have to bring computational knowledge closer to new audiences.\n                     We address how eTextiles can approach a traditional textile art, specifically \u201cArpilleras\u201d\n                     from Chile, to understand how each field contributes from a symbolic, material and\n                     technical perspective. Following participatory approach methods and STEAM guidelines\n                     to create an educational program, we observed how eTextiles can influence the creation\n                     of soft interactive interfaces that enhance the communicative character and cultural\n                     heritage of a craft, and the potential of their use in a pedagogical context by using\n                     a specially designed kit. Our study suggests that connecting technology with a craft of strong\n                     cultural identity can help to reach new audiences, revitalize the traditional\n                     technique, and create new tools for expression and creativity. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440658\">Real-Time Capture of Holistic Tangible Interactions<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Virag Varga<\/li>\n               <li class=\"nameList\">Gergely Vakulya<\/li>\n               <li class=\"nameList\">Benjamin Buergisser<\/li>\n               <li class=\"nameList\">Nathan Riopelle<\/li>\n               <li class=\"nameList\">Fabio Zund<\/li>\n               <li class=\"nameList\">Robert W. Sumner<\/li>\n               <li class=\"nameList\">Thomas R. 
Gross<\/li>\n               <li class=\"nameList Last\">Alanson Sample<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler39\" \/> \n            <label for=\"spoiler39\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> When digital applications aim to blend virtual and real worlds, understanding the\n                     actual physical actions of users becomes an important task; the precise timing of\n                     these tangible interaction events is needed, along with the identity, and possibly\n                     location and history, of all involved actors\/objects. With multiple actors or objects,\n                     it is difficult to identify who touches which object and when. Instrumenting objects\n                     for Body Channel Communication (BCC) allows message exchange around the human body\n                     between instrumented objects and the user themselves. In this paper we show how BCC\n                     can be utilized to perform under real-time conditions so that we can directly notice\n                     touch events (and the identity of actors). TangibleID is a framework that unifies\n                     tangible interaction capture for objects and users based on wearable BCC. TangibleID\n                     provides identification and communication with tagged objects\/users in less than 120&nbsp;ms\n                     and supports a variety of tangible interactions, without the need to restrict user\n                     (hand) movements or to maintain line-of-sight connection to cameras. When an AR application\n                     is combined with TangibleID, a new tangible mixed reality experience is achieved,\n                     as demonstrated in the \u201cHaunted Castle\u201d showcase. 
The paper presents an end-to-end\n                     technical evaluation including trade-offs regarding robustness and speed of touch\n                     recognition, outlines the breadth of interaction modalities, and reports on an initial\n                     user assessment.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3440659\">Tangible Lighting Proxies: Brokering the Transition from Classroom to Stage<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Rebecca Nicholson<\/li>\n               <li class=\"nameList\">David Kirk<\/li>\n               <li class=\"nameList Last\">Tom Bartindale<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler40\" \/> \n            <label for=\"spoiler40\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Supporting practice-based vocational skills development in the classroom can be logistically\n                     challenging; however, tangible interfaces present natural affordances for supporting\n                     such skill development. In the context of teaching stage lighting specifically, we\n                     present an open source toolkit based around tangible hardware proxies to support the\n                     teaching of practice-based skills. Our tangible proxies embody key configuration,\n                     interaction and optical properties of real stage equipment. 
Drawing on notions of\n                     representation and bi-directional digital-to-physical transformation, we design a\n                     toolkit that specifically supports a gradual transition between virtual, simulated\n                     and real equipment during the learning journey, and open the door to embedding stage\n                     craft into schools. Through reporting on deployments in two schools, we discuss the\n                     affordances of such proxies and their potential for supporting the teaching and learning\n                     of practice-based skills in the classroom. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            <h2>SESSION: Work in Progress<\/h2>\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442446\">Comparing Understanding and Memorization in Physicalization and VR Visualization<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">He Ren<\/li>\n               <li class=\"nameList Last\">Eva Hornecker<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler41\" \/> \n            <label for=\"spoiler41\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> We investigate whether presenting data in a VR visualization or as a physicalization\n                     impacts understanding and recollection. Two equivalent representations of the same\n                     data set, one in physical form and one in VR, were created. Participants answered\n                     understanding questions while they had access to the model, and were subsequently\n                     asked about the data after the model was removed. 
We recorded time needed to answer\n                     understanding questions and correctness rates for recollection questions. The results\n                     favour the conclusion that the virtual representation and the technical VR setup significantly\n                     inhibit participants\u2019 ability to work with the data set. Reflecting on our study setup\n                     and participants\u2019 comments, we discuss recommendations for future studies aiming at\n                     a systematic and comprehensive comparison of the differences in interacting with purely\n                     virtual and with physical data representations.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442447\">inDepth: Force-based Interaction with Objects beyond A Physical Barrier<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Takatoshi Yoshida<\/li>\n               <li class=\"nameList\">Junichi Ogawa<\/li>\n               <li class=\"nameList\">Kyung Yun Choi<\/li>\n               <li class=\"nameList\">Sanad Bushnaq<\/li>\n               <li class=\"nameList\">Ken Nakagaki<\/li>\n               <li class=\"nameList Last\">Hiroshi Ishii<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler42\" \/> \n            <label for=\"spoiler42\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>We propose inDepth, a novel system that enables force-based interaction with objects\n                     beyond a physical barrier by using scalable force sensor modules. inDepth transforms\n                     a physical barrier (e.g. 
glass showcase or 3D display) into a tangible input interface\n                     that enables users to interact with objects out of reach by applying finger pressure\n                     on the barrier\u2019s surface. To achieve this interaction, our system tracks the applied\n                     force as a directional vector by using three force sensors installed underneath the\n                     barrier. Meanwhile, our force-to-depth conversion algorithm translates force intensity\n                     into a spatial position along its direction beyond the barrier. Finally, the system\n                     executes various operations on objects in that position based on the type of application.\n                     In this paper, we introduce the inDepth concept and its design space. We also demonstrate\n                     example applications, including selecting items in showcases and manipulating 3D rendered\n                     models.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442448\">NOW YOU SEE ME, NOW YOU DON&#8217;T: revealing personality and narratives from playful interactions with machines being\n                  watched<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">RAY LC<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler43\" \/> \n            <label for=\"spoiler43\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>The complexity of tasks and computations undertaken by machines has grown exponentially.\n                     In order to communicate these complexities to humans working with machines, these\n                     
machines must be programmed to express themselves in forms that humans can understand,\n                     not merely with procedures and numbers, but with intentions, expressive gestures that\n                     communicate emotion, and interactions that form stories, all of which can be more\n                     intuitively grasped by humans. I explored machine gestures for communicating personality\n                     and narratives by building a \u201cshy\u201d lamp that looks away when humans gaze closely,\n                     and follows human faces when they&#8217;re far away. I followed this by building a group of machines\n                     that all direct their gaze at the human unless she looks away, at which time they\n                     continue performing a skit. Artistic interventions with audiences show playful interactions\n                     that depend on placement of the camera, showing a way of communicating machine personality\n                     using playful face-detection-based interactions.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442449\">Muffidgets: Detecting and Identifying Edible Pastry Tangibles on Capacitive Touchscreens<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Florian Heller<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler44\" \/> \n            <label for=\"spoiler44\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Detecting tangibles on capacitive touchscreens has seen vast attention over the past\n                     decade. 
The current state of the art allows a capacitive touchscreen to detect and identify\n                     a number of tangibles based on a unique footprint of interconnected conductive elements\n                     on the base of the tangible. In most cases, this conductive material is either metal\n                     or some carbon-based conductor, possibly integrated into a 3D-printing process. The\n                     choice of conductive material is, however, not limited to these technical elements.\n                     In this paper, we showcase how the concept of tangible detection on capacitive touchscreens\n                     can be transferred to pastry, creating an ephemeral, edible user interface. The detection\n                     and identification of specific pieces of pastry open applications in the area of entertainment,\n                     but also in food safety.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442450\">Tangible Signals &#8211; Prototyping Interactive Physical Sound Displays<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Jens Vetter<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler45\" \/> \n            <label for=\"spoiler45\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> This work-in-progress paper presents three tangible user interface prototypes that\n                     use sensor technology in combination with mechanical actuators to provide an interactive\n                     physical display of sound and music data for people with visual impairments. 
The prototypes\n                     can be used either separately or combined as input and output devices utilizing pin-based,\n                     string-based and wheel-based interaction elements. They were developed as part of\n                     the research project Tangible Signals. In the paper, each prototype will be presented\n                     separately, including discussion of concept, interaction modalities, hardware design\n                     and possible use cases.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442451\">An Interactive Garment for Orchestra Conducting: IoT-enabled Textile &amp; Machine Learning\n                  to Direct Musical Performance<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Berit Greinke<\/li>\n               <li class=\"nameList\">Giorgia Petri<\/li>\n               <li class=\"nameList\">Pauline Vierne<\/li>\n               <li class=\"nameList\">Paul Biessmann<\/li>\n               <li class=\"nameList\">Alexandra B\u00f6rner<\/li>\n               <li class=\"nameList\">Kaspar Schleiser<\/li>\n               <li class=\"nameList\">Emmanuel Baccelli<\/li>\n               <li class=\"nameList\">Claas Krause<\/li>\n               <li class=\"nameList\">Christopher Verworner<\/li>\n               <li class=\"nameList Last\">Felix Biessmann<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler46\" \/> \n            <label for=\"spoiler46\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> We present an overview and initial results from a project bringing together orchestra\n                     conducting, e-textile 
material studies, costume tailoring, low-power computing and\n                     machine learning (ML). We describe a wearable interactive system comprising textile\n                     sensors embedded into a suit, low-power transmission and gesture recognition using\n                     creative computing tools. We introduce first observations made during the semi-participatory\n                     approach, which placed the conductor\u2019s movements and personal performative expressiveness\n                     at the centre of technical and conceptual development. The project, which is still running,\n                     is a two-month collaboration between the Verworner-Krause Kammerorchester (VKKO) and\n                     technical and design researchers. Preliminary analyses of the data recorded while\n                     the conductor is wearing the prototype demonstrate that the developed system can be\n                     used to robustly decode a large number of conducting and performative movements. 
In\n                     particular, the user interface of the ML system is designed such that the training\n                     of the algorithms can be intuitively controlled by the conductor, in sync with the\n                     MIDI clock.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442452\">Exploring Axisymmetric Shape-Change\u2019s Purposes and Allure for Ambient Display: 16\n                  Potential Use Cases and a Two-Month Preliminary Study on Daily Notifications<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Maxime Daniel<\/li>\n               <li class=\"nameList Last\">Guillaume Rivi\u00e8re<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler47\" \/> \n            <label for=\"spoiler47\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> In the last decade, HCI research has proposed promising technologies for shape-changing\n                     interfaces. The usefulness and the user experience of shape-change are, however, still\n                     to be explored and understood. This paper extends the understanding of the potential\n                     utility and usability of axisymmetric shape-change. First, we present 16 potential\n                     use cases for a cylindrical shape-changing display. Second, we present a two-month\n                     comparative field study in the workplace. Six participants had to shift energy consumption\n                     by using energy storage. 
To do so, they were notified about local energy forecasts.\n                     Early results show that, compared with flat-screen animations, cylindrical shape-change\n                     animations remain more attractive over time.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442453\">LayerPump: Rapid Prototyping of Functional 3D Objects with Built-in Electrohydrodynamics\n                  Pumps Based on Layered Plates<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Juri Fujii<\/li>\n               <li class=\"nameList\">Satoshi Nakamaru<\/li>\n               <li class=\"nameList Last\">Yasuaki Kakehi<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler48\" \/> \n            <label for=\"spoiler48\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> The development and widespread use of digital fabrication techniques have accelerated\n                     the fabrication of three-dimensional (3D) objects, making it possible not only to design their\n                     shapes but also to customize their internal structures and materials to realize various\n                     functions. In this research, we propose a novel design method to embed internal pumps\n                     and liquid paths inside 3D objects. We utilized a pump driven by electrohydrodynamics\n                     (EHD) of fluid migrating in response to an applied electric field. 
To arrange the\n                     EHD pumps and fluid paths in appropriate positions inside 3D objects, we stacked layered\n                     acrylic plates cut with digital fabrication tools such as laser cutters and cutting\n                     plotters. Using these embedded pumps, we can control the flow of fluid to change the color,\n                     shape, and motion of the objects. In this paper, we detail our fabrication technique,\n                     named LayerPump, and demonstrate its design space and application scenarios.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442455\">Paint with Your Mind: Designing EEG-based Interactive Installation for Traditional\n                  Chinese Artworks<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Zitong Chen<\/li>\n               <li class=\"nameList\">Jing Liao<\/li>\n               <li class=\"nameList\">Jianqiao Chen<\/li>\n               <li class=\"nameList\">Chuyi Zhou<\/li>\n               <li class=\"nameList\">Fangbing Chai<\/li>\n               <li class=\"nameList\">Yang Wu<\/li>\n               <li class=\"nameList Last\">Preben Hansen<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler49\" \/> \n            <label for=\"spoiler49\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Museum exhibitions on traditional Chinese paintings are gaining popularity for their educational\n                     and cultural value. 
Chinese paintings are characterized by a long history and implicit\n                     emotional expression, making them challenging for non-professional and non-Chinese visitors\n                     to understand. To enhance museum visitors\u2019 interest in and comprehension of Chinese artworks,\n                     we design an EEG-based interactive installation. The installation simulates the process\n                     of creating a work of art, in this case a painting. Visitors can control the generation\n                     of lines, colors, and movements of characters by wearing a commercial EEG headset.\n                     Our interactive design contributes a novel experience of \u2018painting with your mind\u2019\n                     and at the same time transforms the exhibition into an enjoyable game experience.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442456\">Kinesthetic Empathy in Remote Interactive Performance: Research into Platforms and\n                  Strategies for Performing Online<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Ryan Ingebritsen<\/li>\n               <li class=\"nameList\">Christopher Knowlton<\/li>\n               <li class=\"nameList Last\">John Toenjes<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler50\" \/> \n            <label for=\"spoiler50\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> In interactive performance, kinesthetic empathy refers to the ability of performers\n                     to read, decode, and react to one another\u2019s physical input. 
To develop kinesthetic\n                     empathy, performers rely on sonic, visual, and other sensory cues. Remote and interactive\n                     performance often limits these cues due to deficiencies of the virtual environment.\n                     In this case, performers must develop connections between alternate sensory modes\n                     to achieve kinesthetic empathy. This paper explores alternative systems for remote\n                     performance, investigates the ways in which human players creatively exploit these\n                     platforms, and defines the prerequisite sensory connections needed to achieve\n                     kinesthetic empathy between remote participants. We present three examples of technologies\n                     and performance techniques used to achieve this connection. We then present a new\n                     system modified for remote performance and propose a strategy for demonstrating it\n                     to peers in order to discern its effectiveness in facilitating kinesthetic empathy\n                     between multiple players, as well as between players and audience. We use current research\n                     data in cognitive psychology as a baseline for our own inquiry and hope our experiences\n                     will inspire future research in that field. 
<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442457\">Therms-Up!: DIY Inflatables and Interactive Materials by Upcycling Wasted Thermoplastic\n                  Bags<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Kyung Yun Choi<\/li>\n               <li class=\"nameList Last\">Hiroshi Ishii<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler51\" \/> \n            <label for=\"spoiler51\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>We introduce a DIY method of creating inflatables and prototyping interactive materials\n                     from wasted thermoplastic bags that are easily found at home. We used an inexpensive FFF\n                     3D printer, without any customization of the printer, to heat-seal and pattern\n                     different types of mono- and multilayered thermoplastic bags. We characterized 8 different\n                     types of commonly used product packaging films, which are mostly made of polypropylene\n                     and polyethylene, and provided 3D printer settings for re-purposing each material.\n                     In addition to heat-sealing, we explored a new design space of using a 3D printer\n                     to create embossing, origami creases, and textures on thermoplastic bags, and demonstrate\n                     examples of applying this technique to create various materials for rapid design and\n                     prototyping. 
To validate the durability of the inflatables, we evaluated the heat-sealed bonding\n                     strength of 9 different thermoplastic air pouches. Lastly, we show use-case\n                     scenarios of prototyping products and interfaces, and creating playful experiences at\n                     home. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442458\">Towards Atmospheric Interfaces<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Jessica Broscheit<\/li>\n               <li class=\"nameList\">Qi Wang<\/li>\n               <li class=\"nameList\">Susanne Draheim<\/li>\n               <li class=\"nameList Last\">Kai von Luck<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler52\" \/> \n            <label for=\"spoiler52\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> This paper introduces a preliminary taxonomy to bring the condition of air into the\n                     foreground of human perception. To create this taxonomy, we drew on the foundations\n                     of atmospheric research and studies in the field of human-computer interaction to\n                     provide an overview of different inputs and outputs that enable an interaction with\n                     the air. 
In addition, we present a potential use case that could benefit from a taxonomy\n                     to allow the development of atmospheric interfaces and empower the transfer of knowledge.\n                     We discuss our findings and conclude with challenges that can be addressed in future\n                     research.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442459\">Impressions at First Touch: Insights on how visually impaired persons form their first impressions of technology<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Robin Abendschein<\/li>\n               <li class=\"nameList\">Selina Pauli<\/li>\n               <li class=\"nameList\">Lukas Schmid<\/li>\n               <li class=\"nameList Last\">Stephan Huber<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler53\" \/> \n            <label for=\"spoiler53\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Human judgements are substantially influenced by first impressions. In previous studies,\n                     researchers contributing to people&#8217;s first impressions of technical artifacts focused\n                     mainly on visual attributes. However, their findings do not apply to visually impaired\n                     people who cannot visually explore technology. Hence, we assume that visually impaired\n                     people rather rely on their haptic perception to get a first impression. 
To examine\n                     how visually impaired people form their first impressions of technological products,\n                     we conducted an explorative study with three visually impaired participants. We asked\n                     them to evaluate haptic features of mobile phones and speakers using the repertory\n                     grid (RepGrid) method. This method can be applied in research fields at an early stage\n                     when no findings are available yet. To empower the participants to autonomously rate\n                     items, we used a haptic scale. We complemented qualitative results of the RepGrid\n                     technique with observations of how long participants explored the technology, as well as\n                     a follow-up interview on first impressions. We found eight constructs that can serve\n                     as a basis for a quantitative evaluation of how devices make a haptic first impression.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442460\">Turning everyday objects into passive tangible controllers<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Adam Drogemuller<\/li>\n               <li class=\"nameList\">James Walsh<\/li>\n               <li class=\"nameList\">Ross T. 
Smith<\/li>\n               <li class=\"nameList\">Matt Adcock<\/li>\n               <li class=\"nameList Last\">Bruce H Thomas<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler54\" \/> \n            <label for=\"spoiler54\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> In augmented reality (AR), gesture\/hand-based interactions are becoming more commonplace\n                     than tangible user interfaces (TUIs) and physical controllers. Major concerns\n                     with TUIs and physical controllers in an AR environment regard portability and battery life.\n                     For a TUI or physical controller to be usable \u201con the go\u201d and in the field, it needs\n                     to be compact so that it can be easily deployed. It is also desirable\n                     for the device to be low-energy or powerless, so that it remains functional over long\n                     periods. In this paper, we present our initial design towards a powerless, passive\n                     controller that leverages the optical camera of an AR head-mounted display (such as\n                     the Hololens or Magic Leap) for tracking. 
We discuss related work, design motivations,\n                     high and low fidelity designs, opportunistic\/adaptable input modalities, and use cases.\n                     We conclude with propositions for future work.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442461\">TempoWatch: a Wearable Music Control Interface for Dance Instructors<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Josef Roth<\/li>\n               <li class=\"nameList\">Jan Ehlers<\/li>\n               <li class=\"nameList\">Christopher Getschmann<\/li>\n               <li class=\"nameList Last\">Florian Echtler<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler55\" \/> \n            <label for=\"spoiler55\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> We present TempoWatch, a wearable smartwatch-based interface designed to allow dance\n                     instructors to control music playback and tempo directly on their wrist via touch\n                     gestures using a circular watch display. Dance instructors have unique requirements\n                     with respect to music playback in their classes, in particular the ability to stay\n                     in position while controlling the playback, and to change speed via time-stretching.\n                     However, common stereo decks and mobile music player apps do not support these requirements\n                     well. We present the design and architecture of our system, and a qualitative evaluation\n                     performed with 9 semi-professional instructors in their own dance classes. 
Dance instructors\n                     were involved in this project from the very beginning to match the system and interface\n                     design to its prospective use cases. Results show that instructors are able to use\n                     TempoWatch productively after only a short learning phase.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442464\">Wearable Crazy Eights: Wearable Ideation Methods for Encouraging Divergent Design Concepts<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Lee Jones<\/li>\n               <li class=\"nameList\">Sara Nabil<\/li>\n               <li class=\"nameList Last\">Audrey Girouard<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler56\" \/> \n            <label for=\"spoiler56\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Participatory design with wearable users entails engaging people in the design process\n                     from the early ideation phases. However, user-generated wearable concepts are often\n                     limited by the narrow design space of commercially available wearables. This paper\n                     presents an ideation scaffolding method we developed for eliciting wearable concepts,\n                     called Wearable Crazy Eights, where participants used an ideation deck and sketched\n                     up to 8 concepts in 8 minutes. Herein, we discuss the artifacts produced from our\n                     ideation method in a study with 46 participants comparing 3 groups. 
By comparing the\n                     3 groups, we were able to parse the effects of each activity on the resulting ideas.\n                     Our contribution is a replicable and customizable ideation method for encouraging\n                     outside-the-box thinking in wearable studies.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442465\">Raising People\u2019s Awareness of Hearing Health with Pleasant and Interactive Hearing\n                  Test Designs Installed in Public Spaces<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Yaliang Chuang<\/li>\n               <li class=\"nameList Last\">Tomas Gecevi\u010dius<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler57\" \/> \n            <label for=\"spoiler57\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Hearing loss is an emerging health challenge, but very few people check their hearing\n                     periodically. Unidentified hearing loss can lead to depression and social isolation.\n                     To improve public awareness of hearing health, we design an interactive testing system\n                     for use in public spaces. It aims to create unexpected encounters with users\n                     and engage them in examining their hearing health regularly. We conducted two evaluations\n                     and report the results in this paper. Firstly, we compared the hearing test results\n                     of our design with those of conventional audiometry testing. The correlation is 0.78 on\n                     average. 
Secondly, we evaluated the user experience with 12 visitors in an art museum.\n                     The nature sounds and interactive design provided users with a feeling of calm and made\n                     the testing an engaging experience. Meanwhile, the visualization triggered users\u2019\n                     discussions and reflections on the testing results. Overall, our design shows\n                     potential for raising people\u2019s awareness of hearing health.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442466\">The Jam Station: Gamifying Collaborative Musical Experiences Through Algorithmic Assessment<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Torin Hopkins<\/li>\n               <li class=\"nameList\">Peter Pascente<\/li>\n               <li class=\"nameList\">Wayne Seltzer<\/li>\n               <li class=\"nameList\">Kellie Masterson<\/li>\n               <li class=\"nameList Last\">Ellen Yi-Luen Do<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler58\" \/> \n            <label for=\"spoiler58\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> The Jam Station is a collaborative musical experience (CME), with embedded sensors\n                     to detect, evaluate, and reinforce collaboration among players of all skill levels.\n                     The system is also gamified, consisting of four instruments that face a centrally\n                     located, vertically oriented display that visualizes various game states. 
Playing\n                     any of the instruments affects the game as musical collaboration is assessed and visualized\n                     on the display. The display also includes a \u201cprogress bar\u201d which fills up the more\n                     players collaborate musically, and to \u201cwin\u201d the game, the progress bar must be full,\n                     initiating a multi-color light show for the players. By creating an environment focused\n                     primarily on collaboration, we aim to support collaborative music-making for musicians\n                     of all skill levels.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442467\">Plantimus \u2013 A Plant Stethoscope<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Ofer Asaf<\/li>\n               <li class=\"nameList\">Ronnie Oren<\/li>\n               <li class=\"nameList\">Shachar Geiger<\/li>\n               <li class=\"nameList Last\">Michal Rinott<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler59\" \/> \n            <label for=\"spoiler59\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>We present \u201cPlantimus\u201d, an external sensory enhancement device that enables children\n                     to \u2018hear\u2019 plants. By embedding a digital layer on top of the physical world, we encourage\n                     children to step out and reflect upon our inherent connection to nature as well as\n                     our responsibility towards it. 
Aimed at developing abstract thinking and self-exploration,\n                     the Plantimus is a two-piece device resembling a stethoscope that generates different\n                     sounds when directed towards plants. The design process, as well as the technical\n                     implementation, are described in detail. Insights from the initial user testing are\n                     presented, showing that children react positively to the experience of \u2018hearing\u2019 plants,\n                     and that the experience is also intriguing to adults. Future work involves advancing\n                     the plant recognition method and the specificity of the plant-sound relationship and\n                     testing the device within new scenarios of use.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442468\">Asynchronous Co-Dining: Enhancing the Intimacy in Remote Co-Dining Experience Through\n                  Audio Recordings<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Huizhong Ye<\/li>\n               <li class=\"nameList\">Zengrong Guo<\/li>\n               <li class=\"nameList Last\">Rong-Hao Liang<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler60\" \/> \n            <label for=\"spoiler60\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Co-dining fosters closeness and connectedness among family members. However,\n                     it is hard for those in different locations to enjoy a meal together due to geographical\n                     barriers and inconsistent life schedules. 
Widely applied video technology causes\n                     excessive visual interference when used to solve these problems, while the effect of sound in\n                     a meal on intimacy has not been studied. In this paper, we explored how recorded\n                     audio can create an asynchronous co-dining experience and support intimacy. We conducted\n                     a field study with 24 participants, and the results show that the system can potentially enhance\n                     intimacy in the remote co-dining experience. In the follow-up workshop with 6 participants,\n                     we further discussed the future landscape of designing a sound interface for remote\n                     meals.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442469\">Soundyssey: Hybrid Enrichment System for Elephants in Managed Care<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Yating Wu<\/li>\n               <li class=\"nameList\">Harpreet Sareen<\/li>\n               <li class=\"nameList Last\">Gabriel A. Miller<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler61\" \/> \n            <label for=\"spoiler61\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Animals in managed care are ideally provided with environmental stimuli for their\n                     psychological and physiological well-being. Most commonly, food-based enrichment methods\n                     are used to mimic wild conditions, allowing animals to search and forage. Other sensory\n                     stimulation devices may be employed to reduce stereotypical behaviors. 
However, many\n                     of these devices fall short in providing choice and control to animals, an important\n                     factor in cognitive engagement. In this paper, we present a case study conducted at\n                     the San Diego Zoo that explores such design for two elephants. The resultant system,\n                     called Soundyssey, is an active auditory enrichment device for elephants that encourages\n                     play behaviors through embodied interaction. Our preliminary results indicate that\n                     elephants understood the system and interacted with the interface significantly longer\n                     than with passive objects. During the interactions, elephants showed more positive\n                     behaviors such as focused exploration, reducing the possibility of negative stereotypical\n                     behaviors. We suggest that such technologically-enhanced objects and embodied design\n                     can enhance standards of managed animal care.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442470\">The Rhythm of the Robot: A Prolegomenon to Posthuman Somaesthetics<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Anne-Sofie Belling<\/li>\n               <li class=\"nameList Last\">Daniel Buzzo<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler62\" \/> \n            <label for=\"spoiler62\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> This work-in-progress paper describes an investigation into the somaesthetic qualities\n                     of exploring posthuman 
strategies and relations with technology. This was done through\n                     bodystorming and physical prototyping, casting soft robots directly onto the body.\n                     Through a Research-through-Design process, a soft robotic prototype, Nautica, was\n                     developed. This prototype is based on embodied design ideation, autoethnographic bodystorming\n                     and prototyping in soft robotics. Nautica explores the concept of an embodied navigation\n                     system that plays with re-embodying the disembodied virtual body found in digital\n                     navigation systems. The Nautica prototype was tested directly on the designer-researcher\u2019s\n                     body, which led to the discernment of preliminary emergent properties. Over a short-term adoption\n                     study, it was discovered that several \u201cmaterial\u201d commonalities between the wearer and\n                     robot emerged due to the materiality and behavior of soft robots. These insights may\n                     suggest new ways of experiencing hybrid somatic relations with technology. 
This work-in-progress\n                     ends by suggesting further work, such as connecting the soft robot to a real, functioning\n                     navigation system and expanding the testing of the Nautica prototype.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442471\">KNOBIE: A Design Intervention for Supporting Chefs\u2019 Sustainable Recipe Planning Practices<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">H\u00fcseyin Ugur Gen\u00e7<\/li>\n               <li class=\"nameList\">Hakan Yilmazer<\/li>\n               <li class=\"nameList Last\">Aykut Coskun<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler63\" \/> \n            <label for=\"spoiler63\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Reducing the environmental impact of current food production and consumption practices\n                     is a significant challenge for a more sustainable future. Even though previous HCI\n                     studies illustrated that design interventions could support more sustainable food\n                     practices in domestic contexts, little attention has been given to hospitality contexts\n                     (e.g., restaurants). Addressing this gap, we first investigated food preparation practices\n                     in restaurants (i.e., how recipes, menus and meals are prepared) through interviews\n                     with 10 chefs and instructors of culinary arts. 
Then, we designed KNOBIE, a design\n                     intervention aimed at supporting chefs\u2019 sustainable food preparation practices through\n                     better recipe and menu planning. In this paper, we present the results of these interviews,\n                     KNOBIE as a concept to support sustainable recipe planning, and how the interviews guided\n                     its design.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            <h2>SESSION: Studios<\/h2>\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442699\">TactJam: a collaborative playground for composing spatial tactons<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Dennis Wittchen<\/li>\n               <li class=\"nameList\">Bruno Fruchard<\/li>\n               <li class=\"nameList\">Paul Strohmeier<\/li>\n               <li class=\"nameList Last\">Georg Freitag<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler64\" \/> \n            <label for=\"spoiler64\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Tactons are vibrotactile patterns used to convey information. Conventionally, a small\n                     set of tactons is used on mobile phones or wearables to notify users of messages or\n                     emails received. However, tactons have multiple characteristics that one can vary\n                     to design a multitude of vibrotactile patterns. In addition, one can place vibrotactile\n                     actuators on different body areas to leverage the spatial dimension and the varying\n                     skin sensitivity across the body. 
Prior work proposed methods to design tactons based\n                     on musical or engineering knowledge, but hands-on methods remain scarce. For this\n                     studio, we adopt a hands-on approach for composing tactons. We leverage a simple instrument-like\n                     device that consists of vibrotactile actuators connected to dedicated buttons for\n                     designing tactons. While pressing a button, the corresponding actuator vibrates. Users\n                     can vary several characteristics of the tactons (e.g., duration and amplitude), and\n                     experience them in real time during the design process by placing the actuators on\n                     the body. These tactons can then be shared with other participants of the studio.\n                     Our goal with this studio is to observe users compose tactons collaboratively using\n                     a hands-on device, and better understand how they lay out the vibrotactile actuators\n                     on their body and what differences these layouts make in the tactile experience.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442700\">Tangible Body Maps of Felt-Sensing Experience<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Claudia N\u00fa\u00f1ez-Pacheco<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler65\" \/> \n            <label for=\"spoiler65\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>There is a growing interest amongst the HCI community to access and articulate the\n                     core of experiences for design 
use. The premise is that, by accessing detailed accounts\n                     of everyday experiences, we can obtain refined material for the design of interactive\n                     systems more connected with our bodies and emotions. This TEI studio aims to introduce\n                     participants to the basis of phenomenologically grounded techniques in combination\n                     with the use of tangible materials as a way to articulate experience from the inner\n                     self, applied to the evaluation of existing technologies. This studio offers an alternative\n                     to assessment tools that rely on a predefined repertoire of feelings, to instead focus\n                     on emergent, complex and unclear aspects of our emotions. These strange collections\n                     of emotions, or felt senses, will be further explored through self-reporting tools\n                     and group exercises.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442701\">Material Meets the City: Exploring Novel and Advanced Materials for the Smart Urban\n                  Built Environment<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Linda Hirsch<\/li>\n               <li class=\"nameList\">Eleni Economidou<\/li>\n               <li class=\"nameList\">Irina Paraschivoiu<\/li>\n               <li class=\"nameList\">Tanja D\u00f6ring<\/li>\n               <li class=\"nameList Last\">Andreas Butz<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler66\" \/> \n            <label for=\"spoiler66\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n               
   <p> The urban realm is currently undergoing a transformation in which cities are being\n                     laced with sensors and networks of ever-connected devices. At the same time, more\n                     and more novel and advanced materials are finding their way into interaction design\n                     and Human-Computer Interaction (HCI) research, offering us new interaction possibilities.\n                     In this half-day online studio, attendees will have the opportunity to exchange and\n                     reflect on urban interaction designs for user engagement based on a set of novel,\n                     unconventional, or omnipresent materials in urban environments. We will further collect\n                     historical, public, and community locations of different social and societal meaning.\n                     Participants will be asked to develop hands-on concepts and use cases for the selected\n                     locations through a material-centered approach and think about new levels of user-environment\n                     engagement to extend and explore the current design space. 
The formulated concepts\n                     and use cases will be recorded in an online collaboration tool with the prospect of\n                     publishing them as enhancements for the new generation of media architecture.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442702\">Design for Playfulness with Interactive Soft Materials: Description document<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Se\u00e7il U\u011fur Yavuz<\/li>\n               <li class=\"nameList\">Paula Veske<\/li>\n               <li class=\"nameList\">Barbro Scholz<\/li>\n               <li class=\"nameList\">Michaela Honauer<\/li>\n               <li class=\"nameList Last\">Kristi Kuusk<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler67\" \/> \n            <label for=\"spoiler67\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Designing for interactive play brings new dimensions to the play experience, blending\n                     digital and physical characteristics. Starting from the insights of two interactive\n                     soft toy projects, the authors explore interactive soft materials in designing for\n                     playfulness through a material-driven design approach. This can bring a new dimension\n                     to designing interaction for ludic experiences and reveal specific characteristics\n                     of soft materials. By combining somaesthetic and embodied design methods, the studio\n                     aims at exploring what makes soft materials playful. 
Moreover, it addresses how technology\n                     can enhance the playfulness of these materials and what kind of playful interaction\n                     scenarios can be imagined with interactive soft materials by involving the whole-body\n                     in the design process. The learnings from the studio will help the authors build a\n                     framework which shows the potential and describes the characteristics of interactive\n                     soft materials to create new tangible, embedded and embodied interactions for play.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442703\">Interactive Machine Learning for Embodied Interaction Design: A tool and methodology<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Nicola Plant<\/li>\n               <li class=\"nameList\">Clarice Hilton<\/li>\n               <li class=\"nameList\">Marco Gillies<\/li>\n               <li class=\"nameList\">Rebecca Fiebrink<\/li>\n               <li class=\"nameList\">Phoenix Perry<\/li>\n               <li class=\"nameList\">Carlos Gonz\u00e1lez D\u00edaz<\/li>\n               <li class=\"nameList\">Ruth Gibson<\/li>\n               <li class=\"nameList\">Bruno Martelli<\/li>\n               <li class=\"nameList Last\">Michael Zbyszynski<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler68\" \/> \n            <label for=\"spoiler68\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> As immersive technologies are increasingly being adopted by artists, dancers and\n                     developers in their creative work, there is a demand for tools and 
methods to design\n                     compelling ways of embodied interaction within virtual environments. Interactive Machine\n                     Learning allows creators to quickly and easily implement movement interaction in their\n                     applications by performing examples of movement to train a machine learning model.\n                     A key aspect of this training is providing appropriate movement data features for\n                     a machine learning model to accurately characterise the movement and then recognise it\n                     from incoming data. We explore methodologies that aim to support creators\u2019 understanding\n                     of movement feature data in relation to machine learning models and ask how these\n                     models hold the potential to inform creators\u2019 understanding of their own movement.\n                     We propose a 5-day hackathon, bringing together artists, dancers and designers, to\n                     explore designing movement interaction and create prototypes using the new interactive\n                     machine learning tool InteractML.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3442704\">From \u201dExplainable AI\u201d to \u201dGraspable AI\u201d<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Maliheh Ghajargar<\/li>\n               <li class=\"nameList\">Jeffrey Bardzell<\/li>\n               <li class=\"nameList\">Alison Smith Renner<\/li>\n               <li class=\"nameList\">Peter Gall Krogh<\/li>\n               <li class=\"nameList\">Kristina H\u00f6\u00f6k<\/li>\n               <li class=\"nameList\">David Cuartielles<\/li>\n               <li class=\"nameList\">Laurens Boer<\/li>\n               
<li class=\"nameList Last\">Mikael Wiberg<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler69\" \/> \n            <label for=\"spoiler69\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Since the advent of Artificial Intelligence (AI) and Machine Learning (ML), researchers\n                     have asked how intelligent computing systems could interact with and relate to their\n                     users and their surroundings, leading to debates around issues of biased AI systems,\n                     ML black boxes, user trust, users\u2019 perception of control over the system, and system\n                     transparency, to name a few. All of these issues are related to how humans interact\n                     with AI or ML systems, through an interface which uses different interaction modalities.\n                     Prior studies address these issues from a variety of perspectives, spanning from understanding\n                     and framing the problems through ethics and Science and Technology Studies (STS) perspectives\n                     to finding effective technical solutions to the problems. But what is shared among\n                     almost all those efforts is an assumption that if systems can explain the how and\n                     why of their predictions, people will have a better perception of control and therefore\n                     will trust such systems more, and can even correct their shortcomings. This research\n                     field has been called Explainable AI (XAI). In this studio, we take stock of prior\n                     efforts in this area; however, we focus on using Tangible and Embodied Interaction\n                     (TEI) as an interaction modality for understanding ML. 
We note that the affordances\n                     of physical forms and their behaviors can potentially contribute not only to the explainability\n                     of ML systems, but also to an open environment for criticism. This\n                     studio seeks both to critique explainable ML terminology and to map the opportunities\n                     that TEI can offer HCI for designing more sustainable, graspable and just intelligent\n                     systems.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            <h2>SESSION: Graduate Student Consortium<\/h2>\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3443688\">Tactile Design Principles<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Philipp Ballin<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler70\" \/> \n            <label for=\"spoiler70\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Vibrotactile feedback has found its way into human-machine interaction in recent\n                     years. However, there are no easily accessible and recognized rules for designing\n                     this feedback. For example, designers of graphical user interfaces can make use of\n                     visual design laws to improve the design of human-machine interfaces. These laws\n                     identify a meta-level that allows designers to work with simple design principles rather than\n                     having to deal with the physiology or the psychological and cognitive stimulus processing\n                     of the visual sense. 
<\/p> \n                  <p>The goal of my future research is to explore and identify a meta-level for the design\n                     of tactile feedback in order to provide tactile feedback designers with a tool that\n                     they can work with easily and effectively. A pilot study has already been carried\n                     out to this end, which has yielded illuminating results. Based on these results, I\n                     would like to sharpen the focus of my research work in the coming months and then\n                     work on my doctoral thesis. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3443689\">Sleep-Mode: On Sleeping With Wearable Technology<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Anna Nolda Nagele<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler71\" \/> \n            <label for=\"spoiler71\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> This paper introduces a Ph.D. research project on the affective relationships between\n                     users and wearable sleep-trackers. Sleep monitoring is a recent addition to wearable\n                     technology, computing in the intimate sphere of the body. Wearable devices interact\n                     with the human body, often bypassing conscious thought and manipulating behaviours.\n                     The acceptance of sleep-trackers transforms the discourse of sleep from a passive,\n                     non-reflexive experience into an active, measurable performance. 
The sleeping body\n                     becomes part of a network of sensing devices where human behaviour becomes operationalised.\n                     This exploratory research project aims to untangle the effects of sleep-tracking on\n                     individual and social levels. Through analysis of language and discourse a first study\n                     aims to identify the human and non-human subjectivities such technology produces and\n                     the affective relationships between them. The findings of this project will provide\n                     a starting point for a posthuman approach to designing wearable technology.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3443690\">Developing Multisensory Augmented Reality As A Medium For Computational Artists<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Sam Bilbow<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler72\" \/> \n            <label for=\"spoiler72\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> This paper resituates multisensory augmented reality (MSAR) as an artistic medium\n                     for the creation of interactive and expressive works by computational artists. 
If\n                     an AR system can be thought of as one that combines real and virtual processes, is\n                     interactive in real time, and is registered in three dimensions, why do we witness\n                     the majority of AR applications utilising primarily visual displays of information?\n                     In this paper, I propose a practice-led compositional approach for developing \u2018MSAR\n                     Experiences\u2019, arguing that, as a medium that combines real and virtual multisensory\n                     processes, it must be explored with a multisensory approach. The paper further outlines\n                     the study methods that I will use to evaluate the developed experiences. The outcome\n                     of this project is the practice-led method as well as MSAR hardware, software and\n                     experiences that are developed and evaluated.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3443691\">Touching the Past: Developing and Evaluating A Heritage kit for Visualizing and Analyzing Historical\n                  Artefacts Using Tangible Augmented Reality<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Suzanne Kobeisse<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler73\" \/> \n            <label for=\"spoiler73\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>My research explores the use of tangible user interfaces and augmented reality to\n                     interact with virtual representations of historical artefacts within the cultural\n                     heritage domain. 
As practice-led research, this project aims to design,\n                     develop and evaluate tangible user interfaces combined with augmented reality to provide\n                     cultural heritage professionals and archaeology experts with an accessible and intuitive\n                     tangible augmented reality interface for the visualization, analysis and interpretation\n                     of historical artefacts. This research is interdisciplinary in nature, combining the\n                     fields of design, HCI and heritage, and the newly explored interaction methods will\n                     suggest new possibilities for the future of Tangible User Interfaces and augmented\n                     reality research in general and cultural heritage in particular. In the first year\n                     of my PhD, I developed a prototype that showcased a tangible interface to interact\n                     with augmented historical artefacts, and preliminary testing with experts yielded\n                     very positive results. 
In the second year, the research will investigate how different\n                     tangible interaction methods in augmented reality can promote realism and immersion\n                     and enhance the user experience.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3443692\">Designing Interactive Technological Interventions for Menopausal Women: Designing and developing Interactive Technology tools to help aging women navigate\n                  information about stages of menopause to increase self-awareness of biopsychosocial\n                  changes and manage lifestyle for an improved Quality of Life<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Bhairavi Warke<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler74\" \/> \n            <label for=\"spoiler74\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> The experience of aging differs vastly for men and women. For women, the often-slow\n                     transition that ends their reproductive capacity \u2013 menopause \u2013 closely relates to\n                     their experiences of aging. Menopause has a significant impact on the biological,\n                     psychological and social aspects of a woman\u2019s life. 
While research suggests that education,\n                     guidance, and self-management techniques are beneficial for improving menopausal women\u2019s\n                     health, longevity and quality of life, most women lack access to credible resources.\n                     Self-tracking technology offers women promising alternatives, but little has been\n                     designed to support menopausal women\u2019s health. This doctoral research focuses on developing\n                     interactive self-tracking tools to help women access information about their menopausal\n                     journey, increase self-awareness of biological and psychological changes, and provide\n                     techniques to help adapt and manage their lifestyle. This paper outlines the process\n                     and plan for developing an ethnographic web application and an interactive wearable\n                     tracking device to help women better navigate their menopausal journey.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3443693\">Personal Wearable Light Space<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Barbro Scholz<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler75\" \/> \n            <label for=\"spoiler75\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> This paper introduces a doctoral project on personal wearable light spaces. 
The topic\n                     combines textile material design with somaesthetic interaction design, investigating\n                     possible socio-cultural implications when humans become light sources. The project\n                     applies material speculation to explore future scenarios through physical prototypes.\n                     Objectives and the research question are stated, and the methodological framework\n                     is described. The project timeline and expected research contributions are listed.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            <h2>SESSION: Arts and Performance<\/h2>\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444636\">AuxeticBreath: Changing Perception of Respiration&nbsp;<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Hye Jun Youn<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler76\" \/> \n            <label for=\"spoiler76\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>AuxeticBreath is an interactive new-media installation that visualizes the rhythmic\n                     respiratory rate, as well as tidal volume &#8211; the amount of air displaced or exchanged\n                     in a single breath &#8211; of collective human breaths using soft robotics covered with\n                     auxetic structures (i.e. structures with a negative Poisson&#8217;s ratio, exhibiting the\n                     property of becoming thicker when stretched and narrower when compressed) [1]. 
The\n                     goals of this artwork are 1) to encourage audience interaction with collective breaths\n                     and user contemplation of the changing perception of respiration during the COVID-19\n                     pandemic; and 2) to explore a new artistic approach using a combination of auxetic\n                     structures and soft robotics. The metaphors and artistic expressions of continuous\n                     inflation and deflation of elastomers, and the emission of light from the expansion\n                     of auxetic structures invite an individual&#8217;s presence to become part of the larger\n                     collective installation, and to take a moment to consider underlying changing perceptions\n                     of breath during the pandemic. By employing an emerging technology, we want to encourage\n                     other artists to explore and modify techniques and methods generally only used among\n                     engineers, and to embrace them as new artistic approaches for realizing their own\n                     ideas.&nbsp;<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444637\">\u2018Orrery Arcana\u2019: An Esoteric System for Improvisational Performance<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Nicole L. 
Carroll<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler77\" \/> \n            <label for=\"spoiler77\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Orrery Arcana&nbsp;is a system for real-time improvisational performance designed to facilitate\n                     a process analogous to automatic writing. The system includes custom software for\n                     real-time signal processing written in Max\/MSP and Python and a self-made hardware\n                     controller that is integrated with a planetary gear train, which gives the performer\n                     control over timing and sequenced events through manual gear rotations. Each gear\n                     is equipped with a sensor plate with embedded light, magnetic, and capacitive-touch\n                     sensors. The sensors are primarily manipulated via tactile modular control objects\n                     in the form of concentric rings of colored acrylic and inset magnets that correspond\n                     to Tarot cards.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444638\">Roads in You: Matching and Visualizing Veins and Roads in 3D<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Yoon Chung Han<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler78\" \/> \n            <label for=\"spoiler78\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Roads in You is an interactive biometric-data art and physical 3D data visualization\n         
            that matches veins to roads. Veins are beautifully complicated forms\n                     hidden under the skin; they include many intersections that resemble the roads\n                     and paths surrounding us. In this artwork, participants scan their veins, capture\n                     the aesthetically formed vein lines, and observe existing roads on the earth that\n                     match the veins inside their bodies. The surprising results discovered through\n                     this process are visualized both in an interactive 3D map visualization and in 3D printed\n                     sculptures, offered as personal souvenirs as part of the artistic experience. Unique aspects\n                     such as artistic expression and situated metaphors in this artwork invite audiences\n                     to experience a unique matching process and discover roads newly meaningful to them.\n                     Enhanced 3D data visualization and physical data sculptures are new additions to this\n                     updated version. As a research question, this data-driven artwork aims to explore\n                     correlations between biometric data and map-based data.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444639\">\u201cFootsie\u201d: Exploring Physical Human-Machine-Interaction through Flirtatious Furniture<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Young Suk Lee<\/li>\n               <li class=\"nameList Last\">Daniel Saakes<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler79\" \/> \n            <label for=\"spoiler79\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>\u201cFootsie\u201d is an interactive kinetic art installation. It aspires to connect people\n                     through counter-intuitive physical interaction mediated by flirtatious\n                     machines. It consists of a table and four robotic chairs; the chair legs perform the\n                     motion of \u201cfootsie\u201d and the arms caress the participant\u2019s back. While it tackles\n                     controversial cultural and social norms on touching others\u2019 bodies or being touched,\n                     the experimental bodily experience explores a new aesthetic interaction between a lonely\n                     human and a mischievous digital entity. The interaction is provocative in a slightly\n                     uncomfortable way, and the paradoxical feelings and stimulating sensations open simultaneous\n                     interpretations of the meaning of Human-Machine-Interaction. 
Drawing on nature-inspired\n                     design, such as the human spine, the flexible musculature of mollusca and a caterpillar\u2019s undulating\n                     wave motion, these pliable structures were fabricated with 3D printed meta-materials\n                     to create the sensual gestures. <\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444640\">is a rose \u2013 A Performative Installation between the Tangible and the Digital<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Charlotte Triebus<\/li>\n               <li class=\"nameList\">Ivana Druzetic<\/li>\n               <li class=\"nameList\">Bastian Dewitz<\/li>\n               <li class=\"nameList\">Calvin Huhn<\/li>\n               <li class=\"nameList\">Paul Kretschel<\/li>\n               <li class=\"nameList Last\">Christian Geiger<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler80\" \/> \n            <label for=\"spoiler80\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>The performative art installation is a rose uses plants as natural interfaces in order\n                     to establish a direct relation between natural and technological systems. The installation\n                     is used to visualize digital-physical interaction that is not necessarily explicit\n                     \u2013 triggered by touch or air movement, by direct and non-direct manipulation, depicting\n                     the sum of all interactions during the course of the day. 
The technical realization\n                     detects the movement of plants caused by motion in their immediate\n                     vicinity and subsequently deforms a computer-generated sphere. The paper\n                     explains the several layers of meaning that motivated the artist when\n                     developing the artwork.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444641\">Topographie Digitale<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Audrey Briot<\/li>\n               <li class=\"nameList\">Martin De De Bie<\/li>\n               <li class=\"nameList\">Alice Giordani<\/li>\n               <li class=\"nameList\">Leon Denise<\/li>\n               <li class=\"nameList Last\">Cedric Honnet<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler81\" \/> \n            <label for=\"spoiler81\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Topographie Digitale is an interactive installation that illustrates a hybridization\n                     between science and traditional textile craftsmanship. It uses electrically functionalized\n                     and pleated textiles as touch sensitive surfaces for interacting with a video-projected\n                     visualization. The pleated fabric, augmented by our custom chemical process, and the\n                     electronic sensing system give birth to a material with a mixed heritage that is both\n                     technological and traditional, and prefigure an emerging craft. 
<\/p> \n                  <p>The combination of craft and technology, which yields a creole technique, is an alternative,\n                     more resilient way of thinking about the place of digitalization in our\n                     society.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            <h2>SESSION: Student Design Challenge<\/h2>\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444700\">The E-darning Sampler: Exploring E-textile Repair with Darning Looms<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Lee Jones<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler82\" \/> \n            <label for=\"spoiler82\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Extending the life of our clothes is the most effective intervention of all current\n                     sustainable textile practices. Taking this into consideration, this paper explores\n                     how we can upcycle and repair our clothes with e-textiles, and how to share these\n                     techniques with other makers and crafters. To do so, the author interviewed four visible\n                     mending educators about their teaching practices and personal experiences with mending.\n                     These interviews were then used to inform the design of the E-Darning Sampler for\n                     the TEI 2021 swatchbook, which includes examples of different e-textile darning patterns\n                     and functionalities made with darning looms. 
This paper contributes insights on how\n                     to design educational samples for encouraging sustainable making practices.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444701\">Slider Swatch: an inconspicuous switch design solution for electronic textiles: Development of a slider switch from a simple on\/off mechanism to different outputs<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Pia Obermaier<\/li>\n               <li class=\"nameList\">Stela Blagova<\/li>\n               <li class=\"nameList\">Joana Breyton<\/li>\n               <li class=\"nameList Last\">Anna Blumenkranz<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler83\" \/> \n            <label for=\"spoiler83\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> In this paper we would like to explain our approach to the development of this swatch,\n                     from the idea to the possible applications of the final product. We will explain how\n                     we came up with the basic idea, which questions we asked ourselves and what kind of\n                     materials we used to achieve the desired result. Furthermore, we will also explain\n                     what problems we had to deal with during the development process and how we found\n                     a way to solve them. Moreover, we will have a look at how our Swatch can be integrated\n                     into everyday life and what possibilities its use could offer. 
It is important to\n                     us to convey our way of thinking and working during the process, so we have also added\n                     a short step-by-step guide that shows how the swatch is made.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444702\">Using the power of pixels in the physical world: Bridging digital visual tools to control the world of physical computing<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Roumen, G.J.<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler84\" \/> \n            <label for=\"spoiler84\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Mapping values is part of the craftsmanship of tangible interaction design. It is easy\n                     to imagine but hard to put in words and even harder to code. This swatch presents\n                     ways that might allow us to move faster from idea to experience. 
Through direct manipulation\n                     the experience becomes some sort of \u2018clay\u2019 that we can experience and manipulate simultaneously.\n                     This is done by using pixels to control the physical world and allows the use of software\n                     in the graphical domain to be used in the physical computing domain.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444703\">Fiber-sense: the exploration of the craft and material of fiberglass as a medium for\n                  tangible user interfaces.: Towards the development of embedded circuits in fiberglass-based composites and designs<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Aaron, a r, Chooi<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler85\" \/> \n            <label for=\"spoiler85\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>In this investigation we present a material driven inquiry into fibreglass and the\n                     hand lay-up technique of composite lamination, as well as its affordances for the\n                     development of a tangible user interface. This was achieved by taking an in-depth\n                     look at woven fibreglass and its propensities as a craft material through its varying\n                     states of pliable fabric, wetted mesh and hardened composite. Through this, we propose\n                     the potential design of a pressure activated fibreglass skin for rigid and semi rigid\n                     fibreglass structures. 
This is achieved primarily through a tri-layer laminated fibreglass\n                     structure, with a polypyrrole-treated fibreglass core sandwiched between two embroidered\n                     conductive fibreglass layers.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444704\">The Undyeing Swatch<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Fiona Bell<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler86\" \/> \n            <label for=\"spoiler86\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> The Undyeing Swatch utilizes a combination of visible Light Emitting Diodes (LEDs)\n                     and photocatalytic nanoparticles to diminish the color of organically dyed textiles.\n                     As such, this swatch explores \u2019undyeing\u2019 as a design process that utilizes light and\n                     dye as materials for controlled interaction. The swatch itself consists of a knitted\n                     cotton I-cord coated with silver-doped titanium dioxide nanoparticles (TiO2\/Ag) and\n                     dyed with hibiscus. The I-cord is used to encase a strand of LEDs and was then continuously\n                     woven into a swatch. When the LEDs are turned on, the light activates the nanoparticles,\n                     which in turn break down organic matter (the dye). 
This swatch provides a proof of\n                     concept for the undyeing process, which I believe could be an interesting area of\n                     future exploration for HCI researchers and artists alike.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444705\">Cyborg Crafts: Second SKIN (Soft Keen INteraction)<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">S. Sandra Bae<\/li>\n               <li class=\"nameList Last\">Mary Etta West<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler87\" \/> \n            <label for=\"spoiler87\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> Traditional handcraft and modern cyborg culture share a common goal: democratize\n                     creation through demonstrations and education. Cyborg Crafts blends techniques from\n                     the fiber arts with cyborg-inspired technologies (e.g., open-source biosensing EEG\n                     headsets and RFID implants). Second SKIN (Soft Keen INteraction), intended to support\n                     this practice, is a handmade collection of four modular soft wearable sensors with\n                     a temperature-dependent dynamic display. Each sensor has unique sensor-specific outer\n                     shell textures based on non-woven textile techniques, and each supports a different\n                     sense: momentary switch, pressure sensor, pinch sensor, and a gesture-detecting, capacitive\n                     touch sensor. 
The interactions of pressing,\n                     pinching, and touching are encouraged by sensor-specific extruded designs that guide\n                     finger placement. The outer shell textures are made from a mixture of flaxseed mucilage\n                     and silicone rubber. Thermochromic pigment additives endow these passive devices with\n                     display functionality when heated in excess of 86\u00b0F.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3444706\">Polymerized Tape<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Narjes Pourjafarian<\/li>\n               <li class=\"nameList Last\">Paul Strohmeier<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler88\" \/> \n            <label for=\"spoiler88\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p> We present polymerized sports tape as a general purpose prototyping and sensor-design\n                     resource. We use it in a somewhat similar manner to how conductive copper tape is\n                     often used for fast production of lo-fi electrical prototypes. However, copper tape\n                     has a number of drawbacks if one is prototyping with soft materials, textiles, or\n                     even directly on the human body. Because it is not elastic, it does not adhere well\n                     to such materials. Sports tape, however, has the desired elastic qualities, and additionally\n                     is designed to adhere not only to arbitrary objects, but also to human skin. 
We polymerize\n                     sports tape to make it conductive. The resulting conductive tape has a series of electrical\n                     properties which make it an exciting sensor-material. It is piezo-resistive. This\n                     means its resistance changes with applied forces. It can also be used to create a\n                     voltage gradient, which enables simple and precise position sensing. This unique combination\n                     of mechanical and electrical properties makes it an exciting and unique prototyping\n                     resource.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            <h2>SESSION: Pictorials<\/h2>\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446065\">Plant-Human Embodied Biofeedback (pheB): A Soft Robotic Surface for Emotion Regulation\n                  in Confined Physical Space<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Elena Sabinson<\/li>\n               <li class=\"nameList\">Isha Pradhan<\/li>\n               <li class=\"nameList Last\">Keith Evan Green<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler89\" \/> \n            <label for=\"spoiler89\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>\u201cpheB,\u201d is a robotic \u201cplant-human, embodied, biofeedback\u201d system to support the wellbeing\n                     of human inhabitants in confined, physical spaces. This surface aims to increase users\u2019\n                     emotion regulation and foster connections with nature by visualizing the internal\n                     states of plants through tactile, expressive movement. 
Unlike 2D biofeedback visualization\n                     models currently in use, our research explores mindfulness practices through immersive,\n                     tangible interactions to increase therapeutic effectiveness. This pictorial traces\n                     the development of our design (to date) and presents results from an early user study\n                     conducted to (a) assess the prototype\u2019s ability to lead breathing exercises, (b) evaluate\n                     preferences for different design features, and (c) refine the design of a questionnaire\n                     for future user testing. Findings suggest pheB was perceived positively and as an\n                     embodied extension of self during guided breathing exercises. This work contributes\n                     knowledge toward developing novel biofeedback modalities and offers a design exemplar\n                     for interactive artifacts that nurture meaningful relationships with nature.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446066\">Magnetform: a Shape-change Display Toolkit for Material-oriented Designers<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Iddo Yehoshua Wald<\/li>\n               <li class=\"nameList Last\">Oren Zuckerman<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler90\" \/> \n            <label for=\"spoiler90\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>This work presents Magnetform, a shape-change display toolkit designed to enable exploration\n                     of movement in soft materials. 
The toolkit allows designers with no technical knowledge\n                     to leverage their material expertise to experiment with shape-change. We present the\n                     toolkit design, and case studies with two design studios who used the toolkit for\n                     15 days each. Through the presentation of their process, we reflect on two main themes:\n                     empowering designers to participate in shape-change exploration; and the developing\n                     practice of designing objects which integrate motion. We situate this work as part\n                     of the growing efforts in the TEI community to involve designers in the evolution\n                     of shape-changing interfaces, and demonstrate how material-oriented designers could\n                     contribute to this field, in which materiality plays a major role.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446067\">Designing Gaming Wearables: From Participatory Design to Concept Creation<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Sangwon Jung<\/li>\n               <li class=\"nameList\">Ruowei Xiao<\/li>\n               <li class=\"nameList\">O\u011fuz &#8216;Oz&#8217; Buruk<\/li>\n               <li class=\"nameList Last\">Juho Hamari<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler91\" \/> \n            <label for=\"spoiler91\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>In this pictorial, we depict our design process on gaming wearables starting from\n                     participatory design workshops to concept creation. 
Wearables possess strong qualities\n                     for gaming such as performativity, sociality and interactivity. However, it is an\n                     emergent field and there is a dearth of design knowledge especially when it comes\n                     to designing wearables for mainstream gaming platforms such as game consoles. Our\n                     aim is to explore this field elaborately with a research through design approach and\n                     also clearly exemplify how our design process progressed through different phases.\n                     Our results, apart from helping wearables designers to understand critical features\n                     for mainstream gaming, will also demonstrate the techniques and methods for extracting\n                     knowledge from PD workshops and incorporating it in a conceptual design phase.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446068\">What&#8217;s Going \u00d6n?: An Experiential Approach to Perspective Taking in Urban Planning through Virtual Reality<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Sam van der Horst<\/li>\n               <li class=\"nameList Last\">Jeroen Peeters<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler92\" \/> \n            <label for=\"spoiler92\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Citizen participation in urban planning has become more important in the past decades.\n                     In recent years, the practice of co-creation has become a popular tool to structure\n                     this participatory process. 
Co-creation is a process that actively involves stakeholders\n                     in cooperatively sharing and developing opinions and perspectives towards an outcome\n                     that contributes to the overall development. In particular in large and slow processes\n                     such as urban planning, co-creation risks becoming abstract and difficult to engage\n                     with for stakeholders. This study investigates an experiential approach to participation\n                     in urban planning. By building on embodiment to engage participants physically and\n                     expressively, this study aims to elicit an understanding between different points\n                     of view. With the design of the \u201cWhat&#8217;s going \u00d6n?\u201d experience we explore how Virtual\n                     Reality (VR) can play a role in the participation of citizens in the urban planning\n                     process of the \u00d6n island in Ume\u00e5, Sweden. 
We present the designed experience and the\n                     qualities that play a role in using VR as a tool for participation in Urban Planning.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446069\">Textile Game Controllers: Exploring Affordances of E-Textile Techniques as Applied\n                  to Alternative Game Controllers<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Kate Hartman<\/li>\n               <li class=\"nameList\">Emma Westecott<\/li>\n               <li class=\"nameList\">Izzie Colpitts-Campbell<\/li>\n               <li class=\"nameList\">Jennie Robinson Faber<\/li>\n               <li class=\"nameList\">Yiyi Shao<\/li>\n               <li class=\"nameList\">Chris Luginbuhl<\/li>\n               <li class=\"nameList\">Olivia Prior<\/li>\n               <li class=\"nameList Last\">Manisha Laroia<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler93\" \/> \n            <label for=\"spoiler93\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Invested in increasing access to computational literacy, this paper explores the development\n                     of a series of free public workshops in partnership with an equity-seeking group.\n                     These workshops cover e-textile techniques that lend themselves to making alternative\n                     game controllers leading up to a concept-led game jam. 
We use research-creation approaches\n                     to prioritize creative exploration within a community group for marginalized makers.\n                     The goal of the research is to explore and elucidate the overlap between e-textiles\n                     and experimental game making. We discuss our playful use of workshops as a research\n                     method to iterate on the embodied experience of making on behalf of our participants.\n                     Our contribution maps the connections between workshop design and development, learning\n                     materials generated, through to application within an online game jam setting.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446070\">Designing Physical Artifacts for Tangible Narratives: Lessons Learned from Letters\n                  to Jos\u00e9<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Daniel Echeverri<\/li>\n               <li class=\"nameList Last\">Huaxin Wei<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler94\" \/> \n            <label for=\"spoiler94\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Digital tangible interfaces have been used to present or embed interactive stories\n                     for decades. While many researchers explored different ways of creating engaging interactive\n                     experiences, they focused more on the novelty of the interface and the new experience\n                     it brought for users, rather than the exploration of tangible interfaces as a language,\n                     or a medium, for storytelling. 
Through a visual journey of our creative explorations,\n                     this pictorial reflects upon the design and making of Letters to Jos\u00e9, a tangible\n                     narrative built during a three-year practice-led research project. Located in the\n                     intersection of tangible interaction design and interactive storytelling, our work\n                     characterizes the relationships between the artifact, other artifacts, the body, and\n                     the space with respect to supporting the narrative experience. The outcome is an annotated\n                     visual typology of artifacts for storytelling in the context of tangible narrative,\n                     as a design category and toolbox for researchers and creators of tangible stories.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446071\">Experiencing Distance: Wearable Engagements with Remote Relationships<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Janne Mascha Beuthel<\/li>\n               <li class=\"nameList\">Philippe Bentegeac<\/li>\n               <li class=\"nameList\">Verena Fuchsberger<\/li>\n               <li class=\"nameList\">Bernhard Maurer<\/li>\n               <li class=\"nameList Last\">Manfred Tscheligi<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler95\" \/> \n            <label for=\"spoiler95\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Living dislocated from family, friends or partners can result in negative emotions\n                     and a lack of physical, bodily closeness. 
We focus on materialising the negative consequences\n                     associated with living far apart in a textile form, and manifest those in two pairs\n                     of wearable artefacts: first, WARMTH, which simulates the bodily distance between\n                     remote people through a decrease in felt temperature, and second, BREATH, which embodies\n                     the bodily absence of a remote other through a decline in mechanical movement of textiles.\n                     The wearables are \u2018discussion artefacts\u2019 that enable conversation about, reflection\n                     on and exchange of personal, negative, vulnerable and challenging emotions connected\n                     to living far apart. The stance taken in this pictorial emphasises the necessity to\n                     focus not only on overcoming and bridging the distance between remote people through\n                     interactive artefacts; but also, to consider and manifest melancholic and possibly\n                     negative personal experiences.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446072\">Crafting a Leather Self-tracking Device for Pollen Allergies<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Minna Pakanen<\/li>\n               <li class=\"nameList\">Kasper Heiselberg<\/li>\n               <li class=\"nameList\">Troy Robert Nachtigall<\/li>\n               <li class=\"nameList\">Marie Broe<\/li>\n               <li class=\"nameList Last\">Peter Gall Krogh<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler96\" \/> \n            <label for=\"spoiler96\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div 
style=\"display:inline\">\n                  \t\t\n                  <p>In this pictorial, we offer an overview of the design and step-by-step guidance on\n                     crafting a leather self-tracking device for pollen allergies. We designed the device\n                     to support patients\u2019 daily lives allowing them to track symptoms and their severity\n                     and medicine intake while on the go, multiple times per day. The self-tracker sends\n                     the data to an accompanying smartphone application to support patients in understanding\n                     their illness, be used in doctors\u2019 consultation, and collect data for allergy research.\n                     We chose natural leather as a material for this domestic medical device due to its\n                     associations with everyday artifacts, so it would be easier to integrate it into allergy\n                     patients\u2019 everyday lives.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446073\">BYOD &#8211; Bringing Your Own Device into Single-Surface Interaction Models<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Magdalena Boucher<\/li>\n               <li class=\"nameList\">Kerstin Blumenstein<\/li>\n               <li class=\"nameList\">Victor Adriel de Jesus Oliveira<\/li>\n               <li class=\"nameList Last\">Markus Seidl<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler97\" \/> \n            <label for=\"spoiler97\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Interactive installations face several challenges when used in 
public or semi-public\n                     contexts, such as users not noticing interactivity, or refraining from interacting\n                     due to the fear of social embarrassment. Several interaction models proposing solutions\n                     to these issues have been established over the years. However, most of these models\n                     focus on single-surface installations, which have to be approached for interaction.\n                     When the users\u2019 own devices (ODs) are included in the installation, though, the interaction\n                     focus shifts. To determine the influence of the ODs on the applicability of existing\n                     interaction models, we developed a multi-device quiz game, the \u201cWeisskunig Quiz\u201d,\n                     according to these models. The installation was deployed in a real museum exhibition\n                     and tested in two qualitative evaluations with random museum visitors and a school\n                     class. 
We describe our design approach, reflect on the differences we found, and propose\n                     an extension for single-surface interaction models by incorporating the users\u2019 own\n                     devices.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446074\">Decoraction: a Catalogue for Interactive Home Decor of the Nearest-Future<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Sara Nabil<\/li>\n               <li class=\"nameList Last\">David Kirk<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler98\" \/> \n            <label for=\"spoiler98\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Home decor defines how people experience and share spaces, with the decorative elements\n                     forming the \u2018interface\u2019 to the home. Despite the opportunities of embedding technology\n                     within these elements, research to date has not explored this fully. This paper brings\n                     home decor to interaction design, utilizing decorative elements as a vehicle to incorporate\n                     tangible interaction in domestic spaces. In an IKEA-like format, we designed a product\n                     catalogue of our own prototypes that illustrate the possibilities of the nearest future.\n                     These design illustrations should offer inspiration to those who wish to work with\n                     interactive materials (e.g. appearance-changing and soft-sensing), particularly in\n                     the context of interactive spaces. 
Through making, situating, and speculating, we\n                     show how designing interactive decor can be a promising area in Research-through-Design.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446075\">Always On: Unpacking the challenges of living with insulin pumps, to design novel solutions<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Andy Harper<\/li>\n               <li class=\"nameList Last\">Leila Aflatoony<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler99\" \/> \n            <label for=\"spoiler99\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Insulin pumps are effective tools for the precise control of glucose levels for type\n                     1 diabetes (T1D) patients. Unfortunately, many design and usability challenges still\n                     exist with these technologies. We investigated current shortcomings and limitations\n                     through survey (N=105), interview (N=7), and participatory workshop (N=3) data collection\n                     methods. Our findings revealed issues with current technology including wearability\n                     and accessibility in public, operation while performing demanding tasks, interruptions\n                     during social activities, continuity of maintenance, and interface operations. Using\n                     the data from our investigative work, we produced design criteria to develop a novel\n                     wrist-worn interface and separate pump design for a closed loop system. 
We then evaluated\n                     the design through remote usability testing sessions (N=7) with insulin pump users.\n                     Our study aspires to inform the future design of novel insulin pumps that enable people\n                     with T1D to maintain better control of their glucose through consistent interactions\n                     with these tools, during their everyday activities.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446076\">Crafting Stories: Smart and Electronic Textile Craftsmanship for Interactive Books<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList Last\">Irene Posch<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler100\" \/> \n            <label for=\"spoiler100\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>\u2018Crafting Stories\u2019 investigates the potentials of smart and electronic textile craftsmanship\n                     in the context of interactive books. A handmade interactive eTextile book, copying\n                     a handmade interactive textile book my late grandmother made, is presented as a prototypical\n                     artefact. The presentation of this practice-based research at the intersection of\n                     textile crafts, electronic and computational technology, and storytelling explicitly\n                     focuses on the interactive qualities, materials used, and the story experience. 
It\n                     introduces and contextualizes \u2018The Book My Grandmother Might Have Made\u2019, annotates\n                     design decisions, and pictures the making of the book. I report on early explorations\n                     of users interacting with the crafted stories and close with discussing the possibilities\n                     and perspectives of crafting tangible, embedded, and embodied interactions within\n                     (eTextile) books.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446077\">Sm\u00f6rg\u00e5sbords for Physical Computing<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Laurens Boer<\/li>\n               <li class=\"nameList Last\">Harvey Bewley<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler101\" \/> \n            <label for=\"spoiler101\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Physical computing concerns the design of systems that can sense and respond to the\n                     world around them, which is why it is often used in interaction design projects in\n                     educational settings. However, students who encounter physical computing for the first\n                     time are typically not aware of the form factors and the potential for interaction\n                     of the various sensing and actuating possibilities. 
To complement existing touchpoints\n                     that these students have with physical computing, we present electronic sm\u00f6rg\u00e5sbords:\n                     boards that display collections of physical computing components that are available\n                     in-house in an organised and interactive way to support the initiation of interaction\n                     design projects. The development of the boards allowed us to articulate four principles\n                     for their design, which are intended to inspire the development of future educational\n                     material.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446078\">KINEIN: A Making Guide on Indefinitely Deploying a Kinetic Display as a Research Product<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Eleni Economidou<\/li>\n               <li class=\"nameList\">Moritz Kubesch<\/li>\n               <li class=\"nameList\">Alina Krischkowsky<\/li>\n               <li class=\"nameList\">Martin Murer<\/li>\n               <li class=\"nameList Last\">Manfred Tscheligi<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler102\" \/> \n            <label for=\"spoiler102\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>Large-scale kinetic displays have been present for decades, both as mesmerising art\n                     installations and as information displays. In this paper we present KINEIN, a kinetic\n                     display whose large scale and indefinite deployment time distinguish it from the prototypical\n                     level of HCI research artefacts. 
It consists of 625 \u201ctiles\u201d or \u201cpixels\u201d mounted on\n                     a 25 x 25 matrix and operates through mechanical movement. Instead of a motor rotating\n                     each tile individually, KINEIN has a vertical column system that moves tiles sequentially.\n                     Situated at the entrance of a learning environment, KINEIN is not only a playful artefact\n                     for children to physically interact with, but also the first encounter for the school&#8217;s\n                     visitors.<\/p> \n                  <p>We make use of the pictorial format to illustrate and contribute our Research-through-Design\n                     (RtD) process as well as its implications regarding large physical scale and indefinite\n                     deployment.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t\n            \t\t\t\t\t\t\n            <h3><a class=\"DLtitleLink\" title=\"Full Citation in the ACM Digital Library\" href=\"https:\/\/dl.acm.org\/doi\/10.1145\/3430524.3446079\">Distributed Collaborative Sensemaking: Tracing a Gradual Process<\/a><\/h3>\n            <ul class=\"DLauthors\">\n               <li class=\"nameList\">Doroth\u00e9 Smit<\/li>\n               <li class=\"nameList\">Ruth Neubauer<\/li>\n               <li class=\"nameList Last\">Verena Fuchsberger<\/li>\n            <\/ul>\n            <input type=\"checkbox\"  id=\"spoiler103\" \/> \n            <label for=\"spoiler103\" >Abstract<\/label>\n            <div class=\"DLabstract spoiler\">\n               <div style=\"display:inline\">\n                  \t\t\n                  <p>This Pictorial discusses the outcomes of a distributed embodied ideation workshop.\n                     In this workshop, students of a Bachelor program on Management by Design explained\n                     their thesis by visualizing and externalizing the problem space using everyday objects.\n             
        The use of objects to represent and understand complex design problems is well documented\n                     in the fields of Design and Human-Computer Interaction, but is often researched in\n                     collocated settings. What happens when the design tools used to externalize and represent\n                     complex design problems are everyday household objects, and the discussion happens\n                     in a distributed, asynchronous manner? How is shared meaning negotiated and established?\n                     In this Pictorial, we discuss six pictures that were taken of the visual representations\n                     that the students constructed, and reflect upon the distributed, asynchronous process\n                     that was employed to create, develop, and agree upon meaning.<\/p>\n                  \t<\/div>\n            <\/div>\n            \t\t\t\t\t\t\n            \t\t\t\t\t<\/div>\n      <\/div>\n   <\/body>\n<\/html>\n","protected":false},"excerpt":{"rendered":"<p>Proceedings of TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction Full Citation in the ACM Digital Library SESSION: Full Papers UltraPower: Powering Tangible &amp; Wearable Devices with Focused Ultrasound Rafael Morales Gonz\u00e1lez Asier Marzo Euan Freeman William Frier Orestis Georgiou Abstract [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-1445","page","type-page","status-publish","hentry"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.5 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Proceedings &#8211; TEI 2021<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, 
max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/tei.acm.org\/2021\/proceedings\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Proceedings &#8211; TEI 2021\" \/>\n<meta property=\"og:description\" content=\"Proceedings of TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction Full Citation in the ACM Digital Library SESSION: Full Papers UltraPower: Powering Tangible &amp; Wearable Devices with Focused Ultrasound Rafael Morales Gonz\u00e1lez Asier Marzo Euan Freeman William Frier Orestis Georgiou Abstract [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/tei.acm.org\/2021\/proceedings\/\" \/>\n<meta property=\"og:site_name\" content=\"TEI 2021\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/teiconf\/\" \/>\n<meta property=\"article:modified_time\" content=\"2021-02-15T09:15:05+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dl.acm.org\/specs\/products\/acm\/releasedAssets\/images\/footer-logo1.png\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:site\" content=\"@tei_conf\" \/>\n<meta name=\"twitter:label1\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"75 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/\",\"url\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/\",\"name\":\"Proceedings &#8211; TEI 2021\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/dl.acm.org\\\/specs\\\/products\\\/acm\\\/releasedAssets\\\/images\\\/footer-logo1.png\",\"datePublished\":\"2021-02-15T09:15:03+00:00\",\"dateModified\":\"2021-02-15T09:15:05+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/#primaryimage\",\"url\":\"https:\\\/\\\/dl.acm.org\\\/specs\\\/products\\\/acm\\\/releasedAssets\\\/images\\\/footer-logo1.png\",\"contentUrl\":\"https:\\\/\\\/dl.acm.org\\\/specs\\\/products\\\/acm\\\/releasedAssets\\\/images\\\/footer-logo1.png\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/proceedings\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Proceedings\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/#website\",\"url\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/\",\"name\":\"TEI 2021\",\"description\":\"The 15th ACM Conference on Tangible, 
Embedded and Embodied Interaction\",\"publisher\":{\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/#organization\",\"name\":\"TEI 2021 - The 15th ACM Conference on Tangible, Embedded and Embodied Interaction\",\"url\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/wp-content\\\/uploads\\\/2020\\\/08\\\/TEI-logo_variations_CC_finals-09.png\",\"contentUrl\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/wp-content\\\/uploads\\\/2020\\\/08\\\/TEI-logo_variations_CC_finals-09.png\",\"width\":960,\"height\":334,\"caption\":\"TEI 2021 - The 15th ACM Conference on Tangible, Embedded and Embodied Interaction\"},\"image\":{\"@id\":\"https:\\\/\\\/tei.acm.org\\\/2021\\\/#\\\/schema\\\/logo\\\/image\\\/\"},\"sameAs\":[\"https:\\\/\\\/www.facebook.com\\\/teiconf\\\/\",\"https:\\\/\\\/x.com\\\/tei_conf\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Proceedings &#8211; TEI 2021","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/tei.acm.org\/2021\/proceedings\/","og_locale":"en_US","og_type":"article","og_title":"Proceedings &#8211; TEI 2021","og_description":"Proceedings of TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction TEI &#8217;21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction Full Citation in the ACM Digital Library SESSION: Full Papers UltraPower: Powering Tangible &amp; Wearable Devices with Focused Ultrasound Rafael Morales Gonz\u00e1lez Asier Marzo Euan Freeman William Frier Orestis Georgiou Abstract [&hellip;]","og_url":"https:\/\/tei.acm.org\/2021\/proceedings\/","og_site_name":"TEI 2021","article_publisher":"https:\/\/www.facebook.com\/teiconf\/","article_modified_time":"2021-02-15T09:15:05+00:00","og_image":[{"url":"https:\/\/dl.acm.org\/specs\/products\/acm\/releasedAssets\/images\/footer-logo1.png","type":"","width":"","height":""}],"twitter_card":"summary_large_image","twitter_site":"@tei_conf","twitter_misc":{"Est. 
reading time":"75 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/tei.acm.org\/2021\/proceedings\/","url":"https:\/\/tei.acm.org\/2021\/proceedings\/","name":"Proceedings &#8211; TEI 2021","isPartOf":{"@id":"https:\/\/tei.acm.org\/2021\/#website"},"primaryImageOfPage":{"@id":"https:\/\/tei.acm.org\/2021\/proceedings\/#primaryimage"},"image":{"@id":"https:\/\/tei.acm.org\/2021\/proceedings\/#primaryimage"},"thumbnailUrl":"https:\/\/dl.acm.org\/specs\/products\/acm\/releasedAssets\/images\/footer-logo1.png","datePublished":"2021-02-15T09:15:03+00:00","dateModified":"2021-02-15T09:15:05+00:00","breadcrumb":{"@id":"https:\/\/tei.acm.org\/2021\/proceedings\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/tei.acm.org\/2021\/proceedings\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/tei.acm.org\/2021\/proceedings\/#primaryimage","url":"https:\/\/dl.acm.org\/specs\/products\/acm\/releasedAssets\/images\/footer-logo1.png","contentUrl":"https:\/\/dl.acm.org\/specs\/products\/acm\/releasedAssets\/images\/footer-logo1.png"},{"@type":"BreadcrumbList","@id":"https:\/\/tei.acm.org\/2021\/proceedings\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/tei.acm.org\/2021\/"},{"@type":"ListItem","position":2,"name":"Proceedings"}]},{"@type":"WebSite","@id":"https:\/\/tei.acm.org\/2021\/#website","url":"https:\/\/tei.acm.org\/2021\/","name":"TEI 2021","description":"The 15th ACM Conference on Tangible, Embedded and Embodied 
Interaction","publisher":{"@id":"https:\/\/tei.acm.org\/2021\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/tei.acm.org\/2021\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/tei.acm.org\/2021\/#organization","name":"TEI 2021 - The 15th ACM Conference on Tangible, Embedded and Embodied Interaction","url":"https:\/\/tei.acm.org\/2021\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/tei.acm.org\/2021\/#\/schema\/logo\/image\/","url":"https:\/\/tei.acm.org\/2021\/wp-content\/uploads\/2020\/08\/TEI-logo_variations_CC_finals-09.png","contentUrl":"https:\/\/tei.acm.org\/2021\/wp-content\/uploads\/2020\/08\/TEI-logo_variations_CC_finals-09.png","width":960,"height":334,"caption":"TEI 2021 - The 15th ACM Conference on Tangible, Embedded and Embodied Interaction"},"image":{"@id":"https:\/\/tei.acm.org\/2021\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/teiconf\/","https:\/\/x.com\/tei_conf"]}]}},"_links":{"self":[{"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/pages\/1445","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/comments?post=1445"}],"version-history":[{"count":19,"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/pages\/1445\/revisions"}],"predecessor-version":[{"id":1464,"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/pages\/1445\/revisions\/1464"}],"wp:attachment":[{"href":"https:\/\/tei.acm.org\/2021\/wp-json\/wp\/v2\/media?parent=1445"}],"curies":[{"name":"wp","href":"https
:\/\/api.w.org\/{rel}","templated":true}]}}