TEI ’21: Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction
SESSION: Full Papers
Wireless power transfer creates new opportunities for interaction with tangible and wearable devices, by freeing designers from the constraints of an integrated power source. We explore the use of focused ultrasound as a means of transferring power to a distal device, transforming passive props into dynamic active objects. We analyse the ability to transfer power from an ultrasound array commonly used for mid-air haptic feedback and investigate the practical challenges of ultrasonic power transfer (e.g., receiving and rectifying energy from sound waves). We also explore the ability to power electronic components and multimodal actuators such as lights, speakers and motors. Finally, we describe exemplar wearable and tangible device prototypes that are activated by UltraPower, illustrating the potential applications of this novel technology.
There has been a growing interest in interactive visuals in contemporary dance performances. These visuals often rely on embodied interaction techniques, such as motion capture or biosignal sensors. However, there is a lack of research into how audience members experience these interactive visuals, and how to enhance that experience. We conducted an audience study, involving four different dance performances. Each of the performances explored a different approach for interaction involving visuals. We collected data from audience members, regarding their experience of the performances, using questionnaires and interviews. The analysis of this data allows us to identify implications for design: balancing trade-offs within a mapping clarity spectrum; connecting layers; visuals as co-creative mediator; defined territory and individuality of visuals; exploration of perspective shift and abstract fragmentation. We argue that these considerations are relevant for designers of systems visualizing embodied interaction, not just for dance, but also for other related applications.
YOU BETTA WERK: Using Wearable Technology Performance Driven Inclusive Transdisciplinary Collaboration to Facilitate Authentic Learning
Working or WERKing on a wearable technology project in a transdisciplinary group can be an effective way of learning new skills and collaboration techniques. This paper describes a case study of running a wearable technology group project within an undergraduate course entitled Wearable Technology and Society. The computational media students in the class collaborated with outside performance artists (drag queens and a street dancer) to create interactive performance garments. Design methods such as the use of boundary objects aided in communication of ideas and cooperation across disciplines and cultural barriers. The requirement that the interactive garment function appropriately in a real performance lent urgency and gravity to the experience, motivating cohesive and expedited problem solving in the transdisciplinary group. The use of these methods on a project with real world outcomes and consequences facilitated an authentic learning experience for the students involved.
VirtualWire: Supporting Rapid Prototyping with Instant Reconfigurations of Wires in Breadboarded Circuits
Assembling circuits is a challenging and time-consuming activity for novice makers, frequently resulting in incorrect placements of wires and components on breadboards. This produces errors that are difficult to identify and debug, and delays that hinder creating, exploring, or reconfiguring circuit layouts. This paper presents VirtualWire, a tool that allows users to rapidly design and modify circuits in software and have those changes instantiated in real time as electrical connections on a physical breadboard. To achieve this, VirtualWire dynamically translates circuit design files into physical connections inside a hardware switching matrix, which handles wiring across breadboard rows and to/from an embedded Arduino. The user can interactively test, tune, and share different circuit layouts for an Arduino shield and, once satisfied, can fabricate the circuit on a permanent substrate. Quantitative and qualitative user studies demonstrate that VirtualWire significantly reduces the time taken for circuit assembly (by 37%) and the number of errors made during it (by 53%), while also supporting users in creating readable, space-efficient, and flexible layouts.
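The switching-matrix idea behind VirtualWire can be illustrated with a toy routing function. This is a hedged sketch under assumptions, not VirtualWire's actual implementation: the row-per-net scheme, the function name, and the pin numbering are invented for illustration only.

```python
# Illustrative sketch (NOT VirtualWire's real algorithm): map a netlist onto
# crosspoint closures of a hypothetical switching matrix. Each net (a set of
# pins that must be electrically joined) is assigned one matrix row; every pin
# on that net closes the switch at (row, pin_column).

def route_netlist(nets, num_rows):
    """Return a set of (row, column) crosspoints to close, or raise if the
    matrix has too few rows for the design."""
    if len(nets) > num_rows:
        raise ValueError("not enough matrix rows for this netlist")
    closures = set()
    for row, pins in enumerate(nets):
        for pin in pins:
            closures.add((row, pin))
    return closures

# Example: join Arduino pin D2 (column 2) to an LED anode (column 10), and
# GND (column 0) to the LED cathode (column 11).
nets = [{2, 10}, {0, 11}]
print(sorted(route_netlist(nets, num_rows=8)))
```

A real tool must additionally respect the matrix topology and avoid shorting nets that share hardware rows; this sketch only shows the net-to-switch bookkeeping.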
This paper argues that a significant paradigm change in contemporary dance can offer further opportunities for HCI researchers interested in embodied interaction and interactive system design. Based on the analysis of 42 HCI papers in our data set, resulting from searches in two computing research libraries, we suggest seven thematic categories that reflect how HCI researchers have been engaging with contemporary dance. Moreover, we propose a standardized usage of contemporary dance terminology in HCI literature, and discuss the current state of engagement with publications from the field of performance theory. We identify three opportunities for HCI, which can arise through further engagement with the knowledge produced in contemporary dance and performance: to engage with the field of embodied interaction from the perspective of performance research and theory; to employ contemporary dance methods and practices in HCI research; and to integrate contemporary dance choreographers and performers as researchers in interdisciplinary projects.
Concrete is a ubiquitous material in urban environments and is increasingly used by industry and the maker movement. However, there is little research on its affordances and its potential for embedded user interfaces (UIs). In our ongoing work, we investigate different manufacturing processes and design strategies to change and adapt the affordances of concrete to make it appear interactive. We tested three interface elements, a button, a scroll wheel, and a slider, with 33 participants in a lab elicitation study. Each was created in two versions following two design strategies: one with a more natural look, the other more abstract. Five participants then bodystormed ideas with the prototypes in an outdoor environment. Based on our explorations, we discuss design considerations for creating concrete interfaces, including the potential of both design strategies, and present different application scenarios.
Up Close & Personal: Exploring User-preferred Image Schemas for Intuitive Privacy Awareness and Control
Effective end-user privacy management in everyday ubiquitous computing environments requires giving users complex, contextual information about potential privacy breaches and enabling management of these breaches in a timely, engaging and intuitive manner. In this paper, we propose using empirically grounded image schema-based metaphors to help design these interactions. Results from our exploratory user study (N=22) demonstrate end users’ preferences for changes in physical attributes and spatial properties of objects for privacy awareness. For privacy control, end users prefer to exert force and create spatial movement. The study also explores user preferences for wearable vs. ambient form-factors for managing privacy and concludes that a hybrid solution would work for more users across more contexts. We thus provide a combination of form factor preferences, and a focused set of image schemas for designers to use when designing metaphor-based tangible privacy management tools.
Given the material nature of data physicalizations, their creators need to make many design decisions, including material choices and scale. Our study explores the impact of scale in physicalization, motivated by the assumption that size can affect user experience. We created two different physicalizations (for the same dataset) in three sizes each, and evaluated the resulting six objects with a questionnaire approach and interviews. Our findings highlight that scale needs to be chosen wisely given its impact on representation legibility (ease of viewing and understanding) and affordances for interaction. We discuss factors to take into account when designing large-scale physicalizations and in further research on the potential role of scale in physicalization. In particular, we argue that for large-scale physicalizations, scale should matter and communicate meaning, for instance by supporting an intuitive understanding of magnitudes or a specific experience. Thus, scale needs to be an explicit design decision that interacts with other design parameters.
We present StringTouch, a user interface design exploration translating the expressive resource of string instruments to a new interface morphology. StringTouch transfers the string as a tactile element of interaction to the touch surface, resulting in a tactilely experienceable interface. In this paper we discuss our research-through-design approach, which focused on the exploration of musical string instruments and their translation to the UI context. To investigate this specific design space, we analyzed the systematics and handling of string instruments as well as common HCI principles to develop the interaction concept. The resulting experience prototype demonstrates the idea’s potential for haptic UI design and provides insights into the prototyping process. We present: (1) the investigation of string instruments as a resource for TUI design and (2) the transfer to a generic UI context to inform new hybrid interface morphologies that combine features of tangible, touch, and flexible interaction.
Motion capture technology is widely used in movement-related human-computer interaction, especially in digital arts such as digital dance performance. This paper presents a knit stretch-sensor-based dance leotard designed to evaluate the locations on the body where sensors best capture movement. Two studies are undertaken: (1) interviews to determine user requirements for a dance movement sensing system; and (2) an evaluation of sensor placement on the body. Ten interviewees, including dancers, choreographers, and technologists, describe their requirements and expectations for a body movement sensing system. The centre of the body (the torso) is determined to be the area of primary interest for dancers and choreographers to sense movement, and technologists find the robustness of textile sensors the most challenging aspect of textile sensing system design. A dance leotard toile is then designed with sensor groupings on the torso along the direction of major muscles, based on the interviewees’ preferred movements to be captured. Each group of sensors is evaluated by comparing its signal output with that of a Vicon motion capture system. The evaluation shows that sensors which are constantly under tension perform better; for example, sensors on the upper back have a higher success rate than sensors on the lower back. The dance leotard design was found to capture the movements of standing lean-backs and standing waist twists best.
We introduce Soft Speakers, a systematic approach for designing custom fabric actuators that can be used as audio speakers and vibro-haptic actuators. Digitally embroidered with e-textiles, Soft Speakers are implemented as tactile, malleable and aesthetic designs to be part of wearables, soft furnishings and fabric objects. We present a rapid technique for the DIY fabrication of audio feedback into soft interfaces. We also discuss and evaluate 7 factors for their parametric design in additive and constructive methods. To demonstrate the feasibility of our approach and the breadth of new designs that it enables, we developed 5 prototypes: 3 wearables, a piece of furniture and a soft toy. Studying Soft Speakers with maker-users expanded the design space, empowering users and supporting inclusive design. Our study includes insights on the user experience of real-world interactive applications for remote communication, e-learning, entertainment, navigation and gaming, enabled by Soft Speakers’ customizable and scalable form factor.
Recommender algorithms play an active role in many everyday activities. However, personalized recommendations often produce negative experiences due to a lack of awareness, control, or transparency. Allowing users to materialize their algorithmic imaginaries exposes how they experience, perceive, and imagine recommender algorithms. Moreover, it can unearth novel and previously unattended design opportunities for tangible interactions with algorithms. Therefore, we explored how 15 users of a widely used movie recommender system materialized tangible designs to reflect on and discuss their algorithmic imaginaries during co-design workshops and interviews. Using thematic analysis, we identified two forms of algorithmic imaginaries that can inspire tangible interactions with recommender algorithms: metaphoric and datafied representations. Complementary themes exposed the influence of contextual factors and diverse negative attitudes towards personalized movie recommendations. Based on these findings, we outline design opportunities and suggestions for improving the algorithmic experience of movie recommendations and similar systems through tangible user interfaces.
Research has been conducted on digital information displays that use electrolysis bubbles. Although that research mainly focused on information displays in daily life, the ephemerality of bubbles is also promising for dynamic art installations. In this paper, we present two novel artworks using this electrolysis bubble display mechanism. First, we present “UTAKATA,” a ticker-like bubble display using a running-water channel. Seven electrodes are placed linearly on the channel bed, and they generate text messages using bubble dots that drift toward the lower end of the channel. Second, we present the “Bubble Mirror,” a water pan with a camera that captures a visitor’s face and displays it using electrolysis bubbles as pixels. Facial images with six levels of grayscale are displayed on the water surface using 32 × 32 electrodes. We evaluate the output properties of these configurations and discuss the results obtained.
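The Bubble Mirror's image pipeline implies one concrete step: reducing a grayscale capture to a 32 × 32 grid of six intensity levels, one per electrode. The sketch below shows only that quantization step under assumptions; the function name is invented, and the electrode-driving hardware side is omitted entirely.

```python
# Hedged illustration of six-level quantization as the Bubble Mirror abstract
# describes it. Input is a 2D list of 0-255 grayscale pixel values; output is
# the same grid with each value mapped to a discrete level 0..levels-1.

def quantize_to_levels(pixels, levels=6):
    """Map 0-255 grayscale values onto `levels` discrete steps."""
    step = 256 / levels
    return [[min(levels - 1, int(p / step)) for p in row] for row in pixels]

# A 2x2 toy "image" instead of a real 32x32 camera capture:
img = [[0, 128], [200, 255]]
print(quantize_to_levels(img))  # [[0, 3], [4, 5]]
```

In the real artwork each of the 32 × 32 quantized values would presumably set the drive duty or duration of one electrode; that mapping is hardware-specific and not shown.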
As technology is integrated into all aspects of our lives, researchers are exploring emerging technologies in the field of fashion design. The substantial growth in functional apparel design, such as the use of smart textiles, encourages researchers and fashion designers to incorporate technology within their designs. In much previous work, e-textiles have required interdisciplinary knowledge, such as electrical engineering and computer science, to be successful. To help address this, we created ready-to-use shape-changing fabric samples for fashion designers. We explored this research gap through a preliminary user case study with seven experienced designers. Our results suggest design approaches for shape-changing fabric samples that would assist non-technically skilled designers in incorporating technology in their designs.
Softness is one of the most important factors in human tactile perception. With recent advances in 3D printing, there has been significant progress in fabricating compliant objects. However, existing methods typically remain inaccessible to end-users, mainly due to the separation between designing shapes and setting printing parameters to achieve a desired softness, which excludes softness customization from early design processes. In this work, we contribute OmniSoft, an end-to-end design tool that takes a design-by-example approach: given a 3D model, a user can specify a region of interest and a level of softness by selecting everyday objects as references. The tool then generates both the geometry and the 3D printing parameters needed to reproduce the desired softness, which can be fabricated using low-cost FDM 3D printers and materials. We also provide a data-driven pipeline that enables other compliance modeling methods to be generalized within our design tool. In two user studies, we demonstrated that users could easily transfer the softness of existing reference objects to a 3D printed object. In a design session, end-users successfully used OmniSoft to design augmented functions.
We present two proof-of-concept narrative VR experiences with a focus on sound-based physical interactions. Responding to a call to expand upon current design conceptualizations, we draw on tangible sound-based design in order to develop considerations for the body and physical environments within VR narratives. We propose that a focus on the actions the player is asked to perform (e.g., touch, stand, kneel, grasp, walk, listen, reach, dance) can contribute to an understanding of VR as a sensory, embodied medium that offers ways to playfully engage with physical reality rather than simulate it entirely.
Can Physical Tools that Adapt their Shape based on a Learner’s Performance Help in Motor Skill Training?
Adaptive tools that can change their shape to support users with motor tasks have been used in a variety of applications, such as to improve ergonomics and support muscle memory. In this paper, we investigate whether shape-adapting tools can also help in motor skill training. In contrast to static training tools that maintain task difficulty at a fixed level during training, shape-adapting tools can vary task difficulty and thus keep learners’ training at the optimal challenge point, where the task is neither too easy, nor too difficult.
To investigate whether shape adaptation helps in motor skill training, we built a study prototype in the form of an adaptive basketball stand that works in three conditions: (1) static, (2) manually adaptive, and (3) auto-adaptive. For the auto-adaptive condition, the tool adapts to train learners at the optimal challenge point, where the task is neither too easy nor too difficult. Results from our two user studies show that training in the auto-adaptive condition leads to statistically significant learning gains when compared to the static (F(1, 11) = 1.856, p < 0.05) and manually adaptive conditions (F(1, 11) = 2.386, p < 0.05).
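One common way to keep a learner near an optimal challenge point is a staircase rule: raise difficulty after success, lower it after failure. This is a hedged sketch of that general technique, not necessarily the controller the paper's basketball stand uses; the function name, step size, and bounds are all assumptions.

```python
# Generic staircase difficulty adaptation (an assumption, for illustration):
# difficulty rises after a made shot and falls after a miss, clamped to a range,
# so the learner hovers near the level where success and failure balance.

def staircase(difficulty, made_shot, step=1, lo=0, hi=10):
    """Return the next difficulty level after one trial outcome."""
    d = difficulty + step if made_shot else difficulty - step
    return max(lo, min(hi, d))

d = 5
for outcome in [True, True, False, True]:  # made, made, missed, made
    d = staircase(d, outcome)
print(d)  # 7
```

In a physical tool like the basketball stand, the difficulty value would drive an actuator (e.g., hoop height or angle) rather than a number on screen.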
Understanding the First Person Experience of Walking Mindfulness Meditation Facilitated by EEG Modulated Interactive Soundscape
Walking meditation is a form of mindfulness training, where the act of walking provides a rhythmic meter for attentional focus. Whilst digital technologies to support sitting meditation and walking practices exist, less explored is the first person in-the-moment experience of technology-mediated walking meditation. We present a study of group walking meditation, with and without an interactive rhythmic soundscape modulated by one practitioner’s brainwave data. Six workshops were conducted with novice and advanced practitioners, involving a guided walking meditation with body scan, writing and drawing exercises and a group interview. The analysis yielded themes of shifting state, attention, self-regulation strategy, and immersion and reflection, and insights into how practitioners use sound to synchronize both walking and breathing. We contribute a method for eliciting, and a novel description of, the first person experience of walking meditation, as resources for the design of interactive technologies to support mindfulness practices of walking meditation.
Machine Learning (ML) is often used invisibly in everyday applications with little opportunity for consumers to investigate how it works. In this paper, we expand recent efforts to unfold what students should know about ML and how to design tools and activities allowing them to engage with ML. To do so, we explore how to make processes and aspects of ML tangible through the design of the Machine Learning Machine (MLM); a tangible user interface which enables students to create their own data-sets using pen and paper and to iteratively build and test ML models using this data. Based on insights from the design process and a preliminary pilot study with the MLM, we discuss how a tangible approach to engaging with ML can spur curiosity in students and how the iterative process of improving ML models can encourage students to reflect on the relation between data, model and predictions.
We follow up on a prominent line of work in which principles of embodied cognition are employed to account not only for skilled coping but also for more intellectual activities such as remembering and imagination. Imagination, then, is not a reflective activity an individual does by herself, but a shared and embodied activity scaffolded by tangible design. We present a case study in which we designed a toolkit to facilitate imagining the Netherlands in 2050. We wrote speculative stories of people living in 2050 and designed an assortment of objects. We held several workshops in which our client, Rijkswaterstaat, used the toolkit for shared imagination. We analyze how, in the context of the workshops, the stories and objects provided affordances for shared imagination. We thereby hope to have demonstrated that it is possible to design for more intellectual activities in a tangible and embodied way.
Punch-Sketching E-textiles: Exploring Punch Needle as a Technique for Sustainable, Accessible, and Iterative Physical Prototyping with E-textiles
Tangible toolkits enable individuals to explore concepts by combining components and taking them apart. Both the strength and the limitation of many e-textile toolkits is that threads hold them in place: once put together, they require destructive methods to take apart. In this paper, we propose Punch-Sketching e-textiles, a drawing technique that uses a punch needle to iteratively prototype soft circuits. The benefits of this approach are sustainability and reusability: users can easily pull out circuits without damaging the materials or creating waste, while also testing concepts using the actual threads that will be used in the final prototype. To validate our technique, we ran three studies comparing sewing and punching e-textiles: 1) understanding the process with two fiber artists; 2) exploring the potential with four beginner users; and 3) utilizing our methods further with 10 occupational therapists. Insights from these three studies include when and how to use each method, toolkit recommendations, and considerations for iterative physical prototyping, sustainability, and accessibility.
Through experience, the techniques used by professional vocalists become highly ingrained, and much of the fine muscular control needed for healthy singing is executed using well-refined mental imagery. In this paper, we provide a method for observing intention and embodied practice using surface electromyography (sEMG) to detect muscular activation, in particular of the laryngeal muscles. By sensing the electrical neural impulses that cause muscular contraction, sEMG provides a unique measurement of user intention, whereas other sensors reflect only the results of movement. In this way, we are able to measure movement in preparation, in vocalised singing, and in the use of imagery during mental rehearsal where no sound is produced. We present a circuit developed for use with the low-voltage activations of the laryngeal muscles; by sonifying these activations, we further provide feedback for vocalists to investigate and experiment with their own intuitive movements and intentions in creative vocal practice.
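Sonification of muscle activation needs some mapping from signal amplitude to sound. As a hedged illustration only (the paper's actual mapping is not described in the abstract), one minimal choice is a linear map from a normalized sEMG envelope value to pitch; every name and range below is an assumption.

```python
# Hypothetical sonification mapping (NOT the paper's method): a normalized
# sEMG RMS envelope value in [rms_min, rms_max] is mapped linearly onto a
# frequency range, so stronger laryngeal activation produces a higher pitch.

def envelope_to_pitch(rms, rms_min=0.0, rms_max=1.0, f_lo=200.0, f_hi=800.0):
    """Return a frequency in Hz for one envelope sample, clamped to range."""
    t = max(0.0, min(1.0, (rms - rms_min) / (rms_max - rms_min)))
    return f_lo + t * (f_hi - f_lo)

print(envelope_to_pitch(0.5))  # 500.0
```

A real system would first rectify and low-pass the raw sEMG to obtain the envelope, and might use a logarithmic rather than linear pitch mapping; both are design choices beyond this sketch.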
E-textiles, which embed circuitry into textile fabrics, blend art and creative expression with engineering, making them a popular choice for STEAM classrooms [6, 12]. Currently, e-textile development relies on tools intended for traditional embedded systems, which utilize printed circuit boards and insulated wires. These tools do not translate well to e-textiles, which utilize fabric and uninsulated conductive thread. This mismatch of tools and materials can lead to an overly complicated development process for novices. In particular, rapid prototyping tools for traditional embedded systems are poorly matched for e-textile prototyping. This paper presents the ThreadBoard, a tool that supports rapid prototyping of e-textile circuits. With rapid prototyping, students can test circuit designs and identify circuitry errors before sewing their project. We present the design process used to iteratively create the ThreadBoard’s layout, with the goal of improving its usability for e-textile creators.
Prototyping electronic circuits is often facilitated by web-based tutorials and breadboards. Several virtual and hybrid platforms exist, each carrying its own limitations. Some of these platforms fall prey to split-attention effects, wherein users must divide their attention to integrate multiple sources of spatially separated information; this hinders the learning and prototyping processes. Other platforms provide a single source of information but lack tangible interaction with electronic components, or suffer from an absence of active feedback, which can also hinder these processes. There is hence a need for prototyping platforms that mitigate split-attention effects while continuing to provide other desirable aspects such as tangible interaction. To address this, we present three editions of PPCards, a card-based platform for prototyping electronic circuits that aims to overcome the limitations of existing paradigms. A comparative study showed that the first edition of PPCards outperformed the conventional breadboard-with-web-tutorial paradigm in aspects of split attention, usability, and user experience. The second and third editions build upon successful characteristics of the first, additionally providing support for multimedia content and real-time feedback during the prototyping process. Based on quantitative data and qualitative feedback, we discuss design considerations for future tangible card-based tools.
Empathy tools are experiences designed to evoke empathetic responses by placing the user in another’s lived and felt experience. The problem is that designers do not have a common vocabulary to describe empathy tool experiences; consequently, it is difficult to compare/contrast empathy tool designs or to think about their efficacy. To address this problem, we analyzed 26 publications on empathy tools to develop a descriptive framework for designers of empathy tools. Based on our analysis, we found that empathy tools can be described along three dimensions: (i) the amount of agency the tool allows, (ii) the user’s perspective while using the tool, and (iii) the type of sensations that are experienced. We show that this framework can be used to describe a wide variety of empathy tools and provide recommendations for empathy tool designers, as well as techniques for measuring the efficacy of an empathy tool experience.
We present Seedmarkers, shape-independent topological markers that can be embedded in physical objects manufactured with common rapid-prototyping techniques. Many markers are optimized for technical performance, while visual appearance and the feasibility of permanently merging marker and physical object are not considered. We give an overview of the aesthetic properties of a wide range of existing markers and conducted a short online survey to assess the perception of popular marker designs. Based on our findings, we introduce a generation algorithm that uses weighted Voronoi diagrams for topological optimization. With our generator, Seedmarkers can be created from technical drawings during the design process to fill arbitrary shapes on any surface. Given dimensions and manufacturing constraints, different configurations for 3- or 6-degrees-of-freedom tracking are possible. We propose a set of application examples for shape-independent markers, including 3D printed tangibles, laser-cut plates, and functional markers on printed circuit boards.
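The abstract's weighted Voronoi diagrams can be illustrated with the simplest weighted variant: in an additively weighted diagram, each point belongs to the seed minimizing distance minus the seed's weight. This is a hedged sketch of that general construction; the seed positions, weights, and function name are made up, and the paper's actual optimization is not reproduced.

```python
import math

# Additively weighted Voronoi assignment (illustrative, not Seedmarkers' own
# algorithm): a point is owned by the seed i minimizing
#   dist(point, seed_i) - weight_i,
# so heavier seeds claim larger cells.

def nearest_seed(point, seeds):
    """seeds: list of ((x, y), weight); return index of the owning seed."""
    px, py = point
    return min(
        range(len(seeds)),
        key=lambda i: math.hypot(px - seeds[i][0][0], py - seeds[i][0][1]) - seeds[i][1],
    )

# Seed 0 has extra weight 0.5, so it owns points past the unweighted midpoint:
seeds = [((0.0, 0.0), 0.5), ((1.0, 0.0), 0.0)]
print(nearest_seed((0.6, 0.0), seeds))  # 0
```

Evaluating this assignment over a raster of an arbitrary outline yields the cell layout; generating an actual marker further requires the topological and manufacturing constraints the paper describes.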
“Can you help me move this over there?”: training children with ASD to joint action through tangible interaction and virtual agent
New technologies for autism focus on the training of either social skills or motor skills, but not both. Such a dichotomy omits a wide range of joint-action tasks that require the coordination of two persons (e.g., moving heavy furniture). Training for these physical tasks performed in dyads has great potential to foster inclusiveness while having an impact on both social and motor skills. In this paper, we present the design of a tangible and virtual interactive system for training children with Autism Spectrum Disorder (ASD) to perform joint actions. The proposed system is composed of a virtual character projected onto a surface on which a tangible object is magnetized: both the user and the virtual character hold the object, thus simulating a joint action. We report and discuss preliminary results of a field training study, which show the potential of the interactive system.
Using Robotics and A.I. to Physically Explore a Space of Aesthetic Possibilities: Defining a Physical Aesthetic Experience by the Targeted EEG Feedback of the Perceiver
Aesthetic perception and cognition are highly individual, dynamic processes due to their dependency on emotional affective state, perceptual analysis, memory, context, and cognitive mastering. An aesthetic experience will therefore always be perceived differently by every person and at every moment in time. With our approach, we research the potential to define an adaptive physical aesthetic experience through targeted EEG feedback from the perceiver.
In a series of three distinct projects, we use generative robotic control (KUKA|prc), a Deep Convolutional Generative Adversarial Network, and electroencephalography (EEG) to create an aesthetic adaptation strategy within a physical parametric output space.
Though certain physical limitations apply, the resulting artefact interaction offers the potential to make the aesthetic definition to a certain extent relational to individual perception and cognition processes and, therefore, to some extent adaptive to emotional, contextual, and cultural change over time.
We introduce the term ‘Art Digital Jewellery’ as a label for craft-oriented, bespoke approaches to embedding electronics in jewellery. These unconventional digital-physical jewellery practices struggle for attention compared with higher-profile, often more mass-production-oriented wearables, partly because discourses articulating and critiquing these experimental practices are scarce and obscure to HCI researchers. To address this, we describe how these artistic practices arose from earlier fashion movements, and we engaged six leading creative practitioners in a structured and iterative dialogue. Analysis of our adapted Delphi survey suggests that core to Art Digital Jewellery are highly individualised design processes and the creation of artefacts that are deeply personal in their form, materials, narratives, and interactivity. An appreciation of these unique practices may enrich perspectives on designing wearables, marrying craft with technology, and personalising experiences.
Aerial robots such as quadrotors enjoy ever-increasing popularity and are emerging in everyday applications that require user interaction. At immediate proximity, physical control of the quadrotor by touch may be desired or even necessary. In this paper we present a tactile 3D touch interaction scenario with a quadrotor by introducing virtual buttons whose operation is detected in the accelerometer data of the quadrotor’s built-in Inertial Measurement Unit (IMU). By dispensing with additional sensors, we keep the size of the quadrotor to a minimum and thus address the problem of users being discouraged from interacting with quadrotors at immediate proximity. As an example of the proposed interaction scenario, we introduce MetroDrone, a quadrotor that responds to repeated user taps on virtually defined buttons by flying trajectories according to the beat and the operated button. This introduces a minimalist interaction technique that requires no intermediary devices and strengthens human-robot connections through a shared musical experience.
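Detecting taps in IMU accelerometer data, as the abstract describes, is often done by thresholding the acceleration magnitude with a refractory period so one physical tap is not counted twice. The sketch below illustrates that generic approach under assumptions; the thresholds, sample format, and function name are invented, and the paper's actual detector (and its virtual-button localization) may differ.

```python
import math

# Illustrative tap detector (details are assumptions, not MetroDrone's
# algorithm): flag samples whose acceleration magnitude exceeds a threshold,
# ignoring further detections for `refractory` samples afterwards.

def detect_taps(samples, threshold=1.5, refractory=3):
    """samples: list of (ax, ay, az) in g; return indices of detected taps."""
    taps, last = [], -refractory
    for i, (ax, ay, az) in enumerate(samples):
        if math.sqrt(ax * ax + ay * ay + az * az) > threshold and i - last >= refractory:
            taps.append(i)
            last = i
    return taps

# Hovering flight (~1 g on z) with two tap-like spikes:
data = [(0, 0, 1.0)] * 5 + [(0, 0, 2.5)] + [(0, 0, 1.0)] * 5 + [(0.1, 0, 2.2)]
print(detect_taps(data))  # [5, 11]
```

Mapping a detected tap to a specific virtual button would additionally require looking at the direction of the acceleration spike relative to the airframe, which is beyond this sketch.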
Flexi Card Game: A Design Toolkit for Unconventional Communication Systems for Long-Distance Relationships
Mainstream communication technology today enables us to connect with loved ones across long distances. However, even with the proliferation of easy-to-use technology, there are reports of increasing loneliness and isolation amongst people. The technologies that enable these interactions are not necessarily designed to support meaningful emotional communication for users in long-distance relationships (LDRs). Furthermore, there is a lack of participatory tools designed with the focus of supporting end-user involvement in the design of LDR communication systems. We developed the Flexi Card Game (FCG), a card-based generative design toolkit to support designers and non-designers within participatory structures that can help develop unconventional communication systems to support LDRs. The FCG was developed using an iterative design process, involving end-users, designers, and researchers across five workshops with 56 participants. The paper makes three main contributions: (1) it presents FCG—the novel card-based design toolkit itself; (2) it describes the process of developing an LDR framework into a participatory toolkit; and (3) it offers lessons and insights that can help researchers developing participatory tools in similar contexts.
In the few decades since the first mainframe computers, computing technologies have grown smaller and more pervasive, moving onto and even inside human bodies. Even as those bodies have received increased attention from scholars, designers, and technologists, the bodily expectations and understandings articulated by these technological artefacts have not been a focus of inquiry in the field. I conducted a feminist content analysis of select papers in the proceedings of the ACM International Conference on Tangible, Embedded and Embodied Interaction (TEI) since its inception in 2007. My analysis illustrates how artefacts are implicitly oriented towards unmarked bodily norms, while technologies designed for non-normative bodies treat those bodies as deviant and in need of correction. Subsequently, I derive a range of provocations focused on material bodies in embodied interaction, which offer a point of reflection and identify potential for future work in the field.
Many existing explorations of wearables for HCI consider functionality first and wearability second. Typically, as the technologies, designs, and experiential understandings develop, attention can shift towards questions of deployment and wearability. To support this shift of focus we present a case study of the iterative design of electrode sleeves. We consider the design motivations and background that led to the existing prototype electrical muscle stimulation (EMS) sleeves, and the resultant challenges around their wearability. Through our own design research practice, we seek to reveal design criteria for the wearability of such a sleeve, and provide designs that optimise for those criteria. We contribute (1) new electrode sleeve designs, which begin to make it practicable to take EMS beyond the lab, (2) new fabrication processes that support rapid production and personalisation, and (3) reflections on criteria for wearability across new eTextile garments.
Boiling Mind: Amplifying the Audience-Performer Connection through Sonification and Visualization of Heart and Electrodermal Activities
In stage performances, an invisible wall in front of the stage often weakens the connections between the audience and performers. To amplify this performative connection, we present the concept “Boiling Mind”. Our design concept is based on streaming sensor data related to heart and electrodermal activities from audience members and integrating this data into staging elements, such as visual projections, music, and lighting. Thus, the internal states of the audience directly influence the staging. Artists can have a more direct perception of the inner reactions of audience members and can create physical expressions in response to them. In this paper, we present the wearable sensing system as well as design considerations for mapping heart and electrodermal activity to changes in the staging elements. We evaluated our design and setup over three live performances.
We present Flowcuits, a DIY fabrication method for prototyping tangible, interactive and functional electrical components by manipulating liquid metals. The prototypes afford both physical and visual interactions to demonstrate the inner working mechanics of fundamental electronic elements, which enables tangible and playful learning. The fabrication process follows simple imprinting and sealing of fluidic circuits with a 3D-printed stamp on accessible mouldable substrates such as ‘Blu Tack’. Utilizing conductive gallium indium liquid metal, we demonstrate interactive and re-configurable electronic components such as switches, variable resistors, variable capacitors, logic gates and pressure sensors. In this paper, we present the design analogy of Flowcuits, our DIY fabrication approach, including a parametric 3D stamp design toolkit, and results from a technical evaluation. The stamps are printed with a low-cost 3D printer and all the materials are inexpensive and reusable, enabling Flowcuits to be easily used without any advanced lab facilities.
There is an important gap in Assistive Technology concerning the multiple capabilities of exploratory creativity (such as multi-aesthetic and inter-mediatic surfaces), especially in the design and development of child prosthetics. This in-progress research explores creative interactions and combinations of art and digital fabrication to design playful prosthetics for children’s upper limbs. To do so, we are using feline animal life as inspiration and as a ludic premise for engagement through playfulness. With this artistic exploration within the scope of tangible and embodied interfaces, we seek to improve engagement and incorporate fantasy through playful interfaces, ultimately transforming a perceived difference into empowerment of a child’s self-esteem, awareness and imagination.
Tangibles are small, graspable objects that act as input devices or physical representations of digital data. Oftentimes, it is desirable to track the position of tangibles on a surface and their relation to each other. However, outside-in tracking techniques – such as capacitive touchscreens or cameras – require setting up elaborate infrastructure and are prone to occlusion or interference. We propose Dothraki, an inside-out tracking technique for tangibles on flat surfaces. An optical mouse sensor embedded in the tangible captures a small (36×36 pixel / 1×1 mm), unique section of a black-and-white De Bruijn dot pattern printed on the surface. Our system efficiently searches the pattern space in order to determine the precise location of the tangible with sub-millimeter accuracy. Our proof-of-concept implementation offers a recognition rate of up to 95%, robust error detection, an update rate of 14 Hz, and a low-latency relative tracking mode.
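The lookup idea behind such patterns can be illustrated in one dimension: in a De Bruijn sequence, every fixed-length window occurs exactly once, so a single sensor snapshot of a window uniquely identifies its absolute position. A minimal Python sketch (function names and parameters are illustrative, not taken from the paper, which uses a two-dimensional dot pattern):

```python
def de_bruijn(k, n):
    """Generate the De Bruijn sequence B(k, n) with the standard FKM algorithm."""
    a = [0] * k * n
    sequence = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                sequence.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return sequence

# Binary sequence in which every 8-symbol window occurs exactly once.
n = 8
seq = de_bruijn(2, n)          # length 2**8 = 256
ext = seq + seq[:n - 1]        # wrap around so windows near the end are covered

# Precompute window -> absolute position, as an O(1) lookup table.
index = {tuple(ext[i:i + n]): i for i in range(len(seq))}

def locate(window):
    """Return the absolute position of an observed n-symbol window, or None."""
    return index.get(tuple(window))
```

A 2D analogue of this table (every unique 36×36-pixel patch mapped to its coordinates) is conceptually what an efficient pattern-space search makes unnecessary to store in full.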
Arpilleras Parlantes: Designing Educational Material for the Creation of Interactive Textile Art Based on a Traditional Chilean Craft.
Electronic Textiles (eTextiles) and open-source materials have been entangled with “DIY” movements and hands-on work to create new interactive interfaces. Studies have shown the potential these have to bring computational knowledge closer to new audiences. We address how eTextiles can approach a traditional textile art, specifically “Arpilleras” from Chile, to understand how each field contributes from a symbolic, material and technical perspective. Following participatory design methods and STEAM guidelines to create an educational program, we observed how eTextiles can influence the creation of soft interactive interfaces that enhance the communicative character and cultural heritage of a craft, and the potential of their use in a pedagogical context through a specially designed kit. Our study suggests that connecting technology with a craft of strong cultural identity can help to reach new audiences, revitalize the traditional technique, and create new tools for expression and creativity.
When digital applications aim to blend virtual and real worlds, understanding the actual physical actions of users becomes an important task; the precise timing of these tangible interaction events is needed, along with the identity, and possibly location and history, of all involved actors/objects. With multiple actors or objects, it is difficult to identify who touches which object and when. Instrumenting objects for Body Channel Communication (BCC) allows message exchange around the human body between instrumented objects and the user themselves. In this paper we show how BCC can be utilized under real-time conditions to directly detect touch events (and the identity of actors). TangibleID is a framework that unifies tangible interaction capture for objects and users based on wearable BCC. TangibleID provides identification and communication with tagged objects/users in less than 120 ms and supports a variety of tangible interactions, without the need to restrict user (hand) movements or to maintain a line-of-sight connection to cameras. When an AR application is combined with TangibleID, a new tangible mixed reality experience is achieved, as demonstrated in the “Haunted Castle” showcase. The paper presents an end-to-end technical evaluation including trade-offs regarding robustness and speed of touch recognition, outlines the breadth of interaction modalities, and reports on an initial user assessment.
Supporting practice-based vocational skills development in the classroom can be logistically challenging; however, tangible interfaces present natural affordances for supporting such skill development. In the context of teaching stage lighting specifically, we present an open-source toolkit based around tangible hardware proxies to support the teaching of practice-based skills. Our tangible proxies embody key configuration, interaction and optical properties of real stage equipment. Drawing on notions of representation and bi-directional digital-to-physical transformation, we design a toolkit that specifically supports a gradual transition between virtual, simulated and real equipment during the learning journey, and opens the door to embedding stage craft in schools. Through reporting on deployments in two schools, we discuss the affordances of such proxies and their potential for supporting the teaching and learning of practice-based skills in the classroom.
SESSION: Work in Progress
We investigate whether presenting data in a VR visualization or as a physicalization impacts understanding and recollection. Two equivalent representations of the same data set, one in physical form and one in VR, were created. Participants answered understanding questions while they had access to the model, and were subsequently asked about the data after the model was removed. We recorded time needed to answer understanding questions and correctness rates for recollection questions. The results favour the conclusion that the virtual representation and the technical VR setup significantly inhibit participants’ ability to work with the data set. Reflecting on our study setup and participants’ comments, we discuss recommendations for future studies aiming at a systematic and comprehensive comparison of the differences in interacting with purely virtual and with physical data representations.
We propose inDepth, a novel system that enables force-based interaction with objects beyond a physical barrier by using scalable force sensor modules. inDepth transforms a physical barrier (e.g., a glass showcase or 3D display) into a tangible input interface that enables users to interact with objects out of reach by applying finger pressure on the barrier’s surface. To achieve this interaction, our system tracks the applied force as a directional vector using three force sensors installed underneath the barrier. Meanwhile, our force-to-depth conversion algorithm translates force intensity into a spatial position along its direction beyond the barrier. Finally, the system executes various operations on objects in that position based on the type of application. In this paper, we introduce the inDepth concept and its design space. We also demonstrate example applications, including selecting items in showcases and manipulating 3D rendered models.
NOW YOU SEE ME, NOW YOU DON’T: revealing personality and narratives from playful interactions with machines being watched
The complexity of tasks and computations undertaken by machines has grown exponentially. In order to communicate these complexities to humans working with machines, machines must be programmed to express themselves in forms that humans can understand: not merely with procedures and numbers, but with intentions, expressive gestures that communicate emotion, and interactions that form stories, all of which can be more intuitively grasped by humans. I explored machine gestures for communicating personality and narratives by building a “shy” lamp that looks away when humans gaze closely, and follows human faces when they’re far away. I then built a group of machines that all direct their gaze at the human unless she looks away, at which time they continue performing a skit. Artistic interventions with audiences show playful interactions that depend on the placement of the camera, demonstrating a way of communicating machine personality using playful face-detection-based interactions.
Detecting tangibles on capacitive touchscreens has seen vast attention over the past decade. The current state of the art allows a capacitive touchscreen to detect and identify a number of tangibles based on a unique footprint of interconnected conductive elements on the base of the tangible. In most cases, this conductive material is either metal or some carbon-based conductor, possibly integrated into a 3D-printing process. The choice of conductive material is, however, not limited to these technical elements. In this paper, we showcase how the concept of tangible detection on capacitive touchscreens can be transferred to pastry, creating an ephemeral, edible user interface. The detection and identification of specific pieces of pastry opens up applications in the area of entertainment, but also in food safety.
This work-in-progress paper presents three tangible user interface prototypes that use sensor technology in combination with mechanical actuators to provide an interactive physical display of sound and music data for people with visual impairments. The prototypes can be used either separately or combined as input and output devices utilizing pin-based, string-based and wheel-based interaction elements. They were developed as part of the research project Tangible Signals. In the paper, each prototype will be presented separately, including discussion of concept, interaction modalities, hardware design and possible use cases.
An Interactive Garment for Orchestra Conducting: IoT-enabled Textile & Machine Learning to Direct Musical Performance
We present an overview and initial results from a project bringing together orchestra conducting, e-textile material studies, costume tailoring, low-power computing and machine learning (ML). We describe a wearable interactive system comprising textile sensors embedded into a suit, low-power transmission, and gesture recognition using creative computing tools. We introduce first observations made during the semi-participatory approach, which placed the conductor’s movements and personal performative expressiveness at the centre of technical and conceptual development. The project is a two-month, still-ongoing collaboration between the Verworner-Krause Kammerorchester (VKKO) and technical and design researchers. Preliminary analyses of the data recorded while the conductor is wearing the prototype demonstrate that the developed system can robustly decode a large number of conducting and performative movements. In particular, the user interface of the ML system is designed such that the training of the algorithms can be intuitively controlled by the conductor, in sync with the MIDI clock.
Exploring Axisymmetric Shape-Change’s Purposes and Allure for Ambient Display: 16 Potential Use Cases and a Two-Month Preliminary Study on Daily Notifications
In the last decade, HCI research has proposed promising technologies for shape-changing interfaces. The usefulness and the user experience of shape-change are, however, still to be explored and understood. This paper extends the understanding of the potential utility and usability of axisymmetric shape-change. First, we present 16 potential use cases for a cylindrical shape-changing display. Second, we present a two-month comparative field study in the workplace. Six participants had to shift their energy consumption by using energy storage; to do so, they were notified about local energy forecasts. Early results show that cylindrical shape-change animations retain their attractiveness over time better than flat-screen animations.
LayerPump: Rapid Prototyping of Functional 3D Objects with Built-in Electrohydrodynamics Pumps Based on Layered Plates
The development and widespread use of digital fabrication techniques have accelerated the fabrication of three-dimensional (3D) objects, enabling designers not only to define their shapes but also to customize their internal structures and materials to realize various functions. In this research, we propose a novel design method to embed pumps and liquid paths inside 3D objects. We utilize a pump driven by the electrohydrodynamics (EHD) of a fluid migrating in response to an applied electric field. To arrange the EHD pumps and fluid paths in appropriate positions inside 3D objects, we stack layered acrylic plates cut with digital fabrication tools such as laser cutters and cutting plotters. Using these embedded pumps, we can control the flow of fluid to change the color, shape, and motion of the objects. In this paper, we detail our fabrication technique, named LayerPump, and demonstrate its design space and application scenarios.
Museum exhibitions of traditional Chinese paintings are gaining popularity for their educational and cultural value. Chinese paintings are characterized by a long history and implicit emotional expression, and they are challenging for non-professional and non-Chinese visitors to understand. To enhance museum visitors’ interest in and comprehension of Chinese artworks, we designed an EEG-based interactive installation. The installation simulates the process of creating a work of art, in this case a painting. Visitors can control the generation of lines, colors, and movements of characters by wearing a commercial EEG headset. Our interactive design contributes a novel experience of ‘painting with your mind’ and at the same time transforms the exhibition into an enjoyable game experience.
Kinesthetic Empathy in Remote Interactive Performance: Research into Platforms and Strategies for Performing Online
In interactive performance, kinesthetic empathy refers to the ability of performers to read, decode, and react to one another’s physical input. To develop kinesthetic empathy, performers rely on sonic, visual, and other sensory cues. Remote interactive performance oftentimes limits these cues due to deficiencies of the virtual environment. In this case, performers must develop connections between alternate sensory modes to achieve kinesthetic empathy. This paper explores alternative systems for remote performance and investigates the ways in which human players creatively exploit these platforms, as well as defining the prerequisite sensory connections needed to achieve kinesthetic empathy between remote participants. We present three examples of technologies and performance techniques used to achieve this connection. We then present a new system modified for remote performance and propose a strategy for demonstrating it to peers in order to discern its effectiveness in facilitating kinesthetic empathy between multiple players, as well as between players and audience. We use current research data in cognitive psychology as a baseline for our own inquiry and hope our experiences will inspire future research in that field.
We introduce a DIY method of creating inflatables and prototyping interactive materials from waste thermoplastic bags easily found at home. We used an inexpensive FFF 3D printer, without any customization of the printer, to heat-seal and pattern different types of mono- and multilayered thermoplastic bags. We characterized 8 different types of commonly used product-packaging plastic films, mostly made of polypropylene and polyethylene, and provide 3D printer settings for re-purposing each material. In addition to heat-sealing, we explored a new design space of using a 3D printer to create embossing, origami creases, and textures on thermoplastic bags, and demonstrate examples of applying this technique to create various materials for rapid design and prototyping. To validate the durability of the inflatables, we evaluated the heat-sealed bonding strength of 9 different thermoplastic air pouches. Lastly, we show use-case scenarios of prototyping products and interfaces, and creating playful experiences at home.
This paper introduces a preliminary taxonomy to bring the condition of air into the foreground of human perception. To create this taxonomy, we drew on the foundations of atmospheric research and studies in the field of human-computer interaction to provide an overview of different inputs and outputs that enable an interaction with the air. In addition, we present a potential use case that could benefit from a taxonomy to allow the development of atmospheric interfaces and empower the transfer of knowledge. We discuss our findings and conclude with challenges that can be addressed in future research.
Impressions at First Touch: Insights on how visually impaired persons form their first impressions of technology
Human judgements are substantially influenced by first impressions. Previous studies of people’s first impressions of technical artifacts focused mainly on visual attributes. However, their findings do not apply to visually impaired people, who cannot visually explore technology. Hence, we assume that visually impaired people rely instead on their haptic perception to form a first impression. To examine how visually impaired people form their first impressions of technological products, we conducted an explorative study with three visually impaired participants. We asked them to evaluate haptic features of mobile phones and speakers using the repertory grid (RepGrid) method. This method can be applied in research fields at an early stage when no findings are available yet. To empower the participants to autonomously rate items, we used a haptic scale. We complemented the qualitative results of the RepGrid technique with observations of how long participants explored the technology, as well as a follow-up interview on first impressions. We found eight constructs which can serve as a basis for a quantitative evaluation of how devices make a haptic first impression.
In augmented reality (AR), gesture- and hand-based interactions are becoming more commonplace than tangible user interfaces (TUIs) and physical controllers. Major concerns with TUIs and physical controllers in an AR environment are portability and battery life. For a TUI or physical controller to be usable “on the go” and in the field, it needs to be compact enough to be easily deployable. It is also desirable for it to be low-energy or powerless, so that the device remains functional over long periods. In this paper, we present our initial design for a powerless, passive controller that leverages the optical camera of an AR head-mounted display (such as the HoloLens or Magic Leap) for tracking. We discuss related work, design motivations, high- and low-fidelity designs, opportunistic/adaptable input modalities, and use cases. We conclude with propositions for future work.
We present TempoWatch, a wearable smartwatch-based interface designed to allow dance instructors to control music playback and tempo directly on their wrist via touch gestures using a circular watch display. Dance instructors have unique requirements with respect to music playback in their classes, in particular the ability to stay in position while controlling the playback, and to change speed via time-stretching. However, common stereo decks and mobile music player apps do not support these requirements well. We present the design and architecture of our system, and a qualitative evaluation performed with 9 semi-professional instructors in their own dance classes. Dance instructors were involved in this project from the very beginning to match the system and interface design to its prospective use cases. Results show that instructors are able to use TempoWatch productively after only a short learning phase.
Participatory design with wearable users entails engaging people in the design process from the early ideation phases. However, user-generated wearable concepts are often limited by the narrow design space of commercially available wearables. This paper presents an ideation scaffolding method we developed for eliciting wearable concepts, called Wearable Crazy Eights, where participants used an ideation deck and sketched up to 8 concepts in 8 minutes. Herein, we discuss the artifacts produced from our ideation method in a study with 46 participants comparing 3 groups. By comparing the 3 groups we were able to parse the effects of each activity on the resulting ideas. Our contribution is a replicable and customizable ideation method for encouraging outside-the-box thinking in wearable studies.
Raising People’s Awareness of Hearing Health with Pleasant and Interactive Hearing Test Designs Installed in Public Spaces
Hearing loss is an emerging health challenge, but very few people check their hearing periodically. Unidentified hearing loss can lead to depression and social isolation. To improve public awareness of hearing health, we designed an interactive testing system for use in public spaces. It aims to create unexpected encounters with users and engage them in examining their hearing health regularly. We conducted two evaluations and report the results in this paper. First, we compared the hearing test results of our design with conventional audiometry testing; the correlation is 0.78 on average. Second, we evaluated the user experience with 12 visitors in an art museum. The nature sounds and interactive design gave users a feeling of calm and made the testing an engaging experience. Meanwhile, the visualization triggered users’ discussions and reflections on their test results. Overall, our design shows potential for raising people’s awareness of hearing health.
The Jam Station is a collaborative musical experience (CME), with embedded sensors to detect, evaluate, and reinforce collaboration among players of all skill levels. The system is also gamified, consisting of four instruments that face a centrally located, vertically oriented display that visualizes various game states. Playing any of the instruments affects the game, as musical collaboration is assessed and visualized on the display. The display also includes a “progress bar”, which fills up the more players collaborate musically; to “win” the game, the progress bar must be full, initiating a multi-color light show for the players. By creating an environment focused primarily on collaboration, we aim to support collaborative music-making for musicians of all skill levels.
We present “Plantimus”, an external sensory enhancement device that enables children to ‘hear’ plants. By embedding a digital layer on top of the physical world, we encourage children to step out and reflect upon our inherent connection to nature, as well as our responsibility towards it. Aimed at developing abstract thinking and self-exploration, the Plantimus is a two-piece device resembling a stethoscope that generates different sounds when directed towards plants. The design process, as well as the technical implementation, are described in detail. Insights from initial user testing are presented, showing that children react positively to the experience of ‘hearing’ plants, and that the experience is also intriguing to adults. Future work involves advancing the plant recognition method and the specificity of the plant-sound relationship, and testing the device in new scenarios of use.
Asynchronous Co-Dining: Enhancing the Intimacy in Remote Co-Dining Experience Through Audio Recordings
Co-dining fosters closeness and connectedness among family members. However, it is hard for those in different locations to enjoy a meal together due to geographical barriers and inconsistent schedules. Widely used video technology introduces excessive visual interference, while the effect of sound during a meal on intimacy has not been studied. In this paper, we explored how recorded audio can create an asynchronous co-dining experience and support intimacy. We conducted a field study with 24 participants, and the results show that the system can potentially enhance intimacy in the remote co-dining experience. In a follow-up workshop with 6 participants, we further discussed the future landscape of designing a sound interface for remote meals.
Animals in managed care ideally are provided with environmental stimuli for their psychological and physiological well-being. Most commonly, food-based enrichment methods are used to mimic wild conditions, allowing animals to search and forage. Other sensory stimulation devices may be employed to reduce stereotypical behaviors. However, many of these devices fall short in providing choice and control to animals, an important factor in cognitive engagement. In this paper, we present a case study conducted at the San Diego Zoo that explores such design for two elephants. The resultant system, called Soundyssey, is an active auditory enrichment device for elephants that encourages play behaviors through embodied interaction. Our preliminary results indicate that elephants understood the system and interacted with the interface significantly longer than with passive objects. During the interactions, elephants showed more positive behaviors such as focused exploration, reducing the possibility of negative stereotypical behaviors. We suggest that such technologically-enhanced objects and embodied design can enhance standards of managed animal care.
This work-in-progress paper describes an investigation into the somaesthetic qualities of exploring posthuman strategies and relations with technology. This was done through bodystorming and physical prototyping, casting soft robots directly onto the body. Through a Research-through-Design process, a soft robotic prototype, Nautica, was developed. This prototype is based on embodied design ideation, autoethnographic bodystorming and prototyping in soft robotics. Nautica explores the concept of an embodied navigation system that plays with re-embodying the disembodied virtual body found in digital navigation systems. The Nautica prototype was tested directly on the designer-researcher’s body, which led to discerning preliminary emergent properties. Over a short-term adoptive study, it was discovered that several “material” commonalities between the wearer and robot emerged due to the materiality and behavior of soft robots. These insights may suggest new ways of experiencing hybrid somatic relations with technology. This work-in-progress ends by suggesting further work, such as connecting the soft robot to a real, functioning navigation system and expanding the testing of the Nautica prototype.
Reducing the environmental impact of current food production and consumption practices is a significant challenge for a more sustainable future. Even though previous HCI studies illustrated that design interventions could support more sustainable food practices in domestic contexts, little attention has been given to hospitality contexts (e.g., restaurants). Addressing this gap, we first investigated food preparation practices in restaurants (i.e., how recipes, menus and meals are prepared) through interviews with 10 chefs and instructors of culinary arts. Then, we designed KNOBIE, a design intervention aimed at supporting chefs’ sustainable food preparation practices through better recipe and menu planning. In this paper, we present the results of these interviews, KNOBIE as a concept to support sustainable recipe planning and how interviews guided its design.
Tactons are vibrotactile patterns used to convey information. Conventionally, a small set of tactons is used on mobile phones or wearables to notify users of messages or emails received. However, tactons have multiple characteristics that one can vary to design a multitude of vibrotactile patterns. In addition, one can place vibrotactile actuators on different body areas to leverage the spatial dimension and the varying skin sensitivity across the body. Prior work proposed methods to design tactons based on musical or engineering knowledge, but hands-on methods remain scarce. For this studio, we adopt a hands-on approach for composing tactons. We leverage a simple instrument-like device that consists of vibrotactile actuators connected to dedicated buttons for designing tactons. While pressing a button, the corresponding actuator vibrates. Users can vary several characteristics of the tactons (e.g., duration and amplitude), and experience them in real time during the design process by placing the actuators on the body. These tactons can then be shared with other participants of the studio. Our goal with this studio is to observe users composing tactons collaboratively using a hands-on device, and to better understand how they lay out the vibrotactile actuators on their bodies and what differences these layouts make in the tactile experience.
There is a growing interest amongst the HCI community to access and articulate the core of experiences for design use. The premise is that, by accessing detailed accounts of everyday experiences, we can obtain refined material for the design of interactive systems more connected with our bodies and emotions. This TEI studio aims to introduce participants to the basis of phenomenologically grounded techniques in combination with the use of tangible materials as a way to articulate experience from the inner self, applied to the evaluation of existing technologies. This studio offers an alternative to assessment tools that rely on a predefined repertoire of feelings, to instead focus on emergent, complex and unclear aspects of our emotions. These strange collections of emotions, or felt senses, will be further explored through self-reporting tools and group exercises.
Material Meets the City: Exploring Novel and Advanced Materials for the Smart Urban Built Environment
The urban realm is currently undergoing a transformation in which cities are being laced with sensors and networks of ever-connected devices. At the same time, more and more novel and advanced materials are finding their way into interaction design and Human-Computer Interaction (HCI) research, offering us new interaction possibilities. In this half-day online studio, attendees will have the opportunity to exchange and reflect on urban interaction designs for user engagement based on a set of novel, unconventional, or omnipresent materials in urban environments. We will further collect historical, public, and community locations of different social and societal meaning. Participants will be asked to develop hands-on concepts and use cases for the selected locations through a material-centered approach and think about new levels of user-environment engagement to extend and explore the current design space. The formulated concepts and use cases will be recorded in an online collaboration tool, with the prospect of publishing them as enhancements for the new generation of media architecture.
Designing for interactive play brings new dimensions for play experience blended with digital and physical characteristics. Starting from the insights of two interactive soft toy projects, the authors explore interactive soft materials in designing for playfulness through a material-driven design approach. This can bring a new dimension in designing interaction for ludic experiences and reveal specific characteristics of soft materials. By combining somaesthetic and embodied design methods, the studio aims at exploring what makes soft materials playful. Moreover, it addresses how technology can enhance the playfulness of these materials and what kind of playful interaction scenarios can be imagined with interactive soft materials by involving the whole-body in the design process. The learnings from the studio will help the authors build a framework which shows the potential and describes the characteristics of interactive soft materials to create new tangible, embedded and embodied interactions for play.
As immersive technologies are increasingly being adopted by artists, dancers and developers in their creative work, there is a demand for tools and methods to design compelling ways of embodied interaction within virtual environments. Interactive Machine Learning allows creators to quickly and easily implement movement interaction in their applications by performing examples of movement to train a machine learning model. A key aspect of this training is providing appropriate movement data features for a machine learning model to accurately characterise the movement and then recognise it from incoming data. We explore methodologies that aim to support creators’ understanding of movement feature data in relation to machine learning models and ask how these models hold the potential to inform creators’ understanding of their own movement. We propose a 5-day hackathon, bringing together artists, dancers and designers, to explore designing movement interaction and create prototypes using the new interactive machine learning tool InteractML.
Since the advent of Artificial Intelligence (AI) and Machine Learning (ML), researchers have asked how intelligent computing systems could interact with and relate to their users and their surroundings, leading to debates around issues of biased AI systems, ML black boxes, user trust, users’ perception of control over the system, and system transparency, to name a few. All of these issues are related to how humans interact with AI or ML systems, through an interface which uses different interaction modalities. Prior studies address these issues from a variety of perspectives, spanning from understanding and framing the problems through ethics and Science and Technology Studies (STS) perspectives to finding effective technical solutions to the problems. But what is shared among almost all those efforts is an assumption that if systems can explain the how and why of their predictions, people will have a better perception of control and therefore will trust such systems more, and even can correct their shortcomings. This research field has been called Explainable AI (XAI). In this studio, we take stock of prior efforts in this area; however, we focus on using Tangible and Embodied Interaction (TEI) as an interaction modality for understanding ML. We note that the affordances of physical forms and their behaviors can potentially contribute not only to the explainability of ML systems, but also to an open environment for criticism. This studio seeks both to critique explainable ML terminology and to map the opportunities that TEI can offer HCI for designing more sustainable, graspable and just intelligent systems.
SESSION: Graduate Student Consortium
Vibrotactile feedback has found its way into human-machine interaction in recent years. However, there are no easily accessible and recognized rules for designing this feedback. For example, designers of graphical user interfaces can make use of visual design laws to improve the design of human-machine interfaces. These laws identify a meta-level that allows designers to work with simple design principles rather than having to deal with the physiology or the psychological and cognitive stimulus processing of the visual sense.
The goal of my future research is to explore and identify a meta-level for the design of tactile feedback in order to provide tactile feedback designers with a tool that they can work with easily and effectively. A pilot study has already been carried out to this end, which has yielded illuminating results. Based on these results, I would like to sharpen the focus of my research work in the coming months and then work on my doctoral thesis.
This paper introduces a Ph.D. research project on the affective relationships between users and wearable sleep-trackers. Sleep monitoring is a recent addition to wearable technology, computing in the intimate sphere of the body. Wearable devices interact with the human body, often bypassing conscious thought and manipulating behaviours. The acceptance of sleep-trackers transforms the discourse of sleep from a passive, non-reflexive experience into an active, measurable performance. The sleeping body becomes part of a network of sensing devices where human behaviour becomes operationalised. This exploratory research project aims to untangle the effects of sleep-tracking on individual and social levels. Through analysis of language and discourse a first study aims to identify the human and non-human subjectivities such technology produces and the affective relationships between them. The findings of this project will provide a starting point for a posthuman approach to designing wearable technology.
This paper resituates multisensory augmented reality (MSAR) as an artistic medium for the creation of interactive and expressive works by computational artists. If an AR system can be thought of as one that combines real and virtual processes, is interactive in real-time, and is registered in three dimensions, why do we witness the majority of AR applications utilising primarily visual displays of information? In this paper, I propose a practice-led compositional approach for developing ‘MSAR Experiences’, arguing that, as a medium that combines real and virtual multisensory processes, it must be explored with a multisensory approach. The paper further outlines the study methods that I will use to evaluate the developed experiences. The outcome of this project is the practice-led method as well as the MSAR hardware, software and experiences that are developed and evaluated.
Touching the Past: Developing and Evaluating a Heritage Kit for Visualizing and Analyzing Historical Artefacts Using Tangible Augmented Reality
My research explores the use of tangible user interfaces and augmented reality to interact with virtual representations of historical artefacts within the cultural heritage domain. As practice-led research, this project aims to design, develop and evaluate tangible user interfaces combined with augmented reality to provide cultural heritage professionals and archaeology experts with an accessible and intuitive tangible augmented reality interface for the visualization, analysis and interpretation of historical artefacts. This research is interdisciplinary in nature, combining the fields of design, HCI and heritage; and the newly explored interaction methods will suggest new possibilities for the future of tangible user interface and augmented reality research in general and cultural heritage in particular. In the first year of my PhD, I developed a prototype that showcased a tangible interface to interact with augmented historical artefacts, and preliminary testing with experts yielded very positive results. In the second year, the research will investigate how different tangible interaction methods in augmented reality can promote realism and immersion and enhance the user experience.
Designing Interactive Technological Interventions for Menopausal Women: Designing and developing Interactive Technology tools to help aging women navigate information about stages of menopause to increase self-awareness of biopsychosocial changes and manage lifestyle for an improved Quality of Life
The experience of aging differs vastly for men and women. For women, the often-slow transition that ends their reproductive capacity – menopause – closely relates to their experiences of aging. Menopause has a significant impact on the biological, psychological and social aspects of a woman’s life. While research suggests that education, guidance, and self-management techniques are beneficial for improving menopausal women’s health, longevity and quality of life, most women lack access to credible resources. Self-tracking technology offers women promising alternatives, but little has been designed to support menopausal women’s health. This doctoral research focuses on developing interactive self-tracking tools to help women access information about their menopausal journey, increase self-awareness of biological and psychological changes, and provide techniques to help adapt and manage their lifestyle. This paper outlines the process and plan for developing an ethnographic web application and an interactive wearable tracking device to help women better navigate their menopausal journey.
This paper introduces a doctoral project on personal wearable light spaces. The topic combines textile material design with somaesthetic interaction design, investigating possible socio-cultural implications when humans become light sources. The project applies material speculation to explore future scenarios through physical prototypes. Objectives and the research question are stated, and the methodological framework is described. The project timeline and expected research contributions are listed.
SESSION: Arts and Performance
AuxeticBreath is an interactive new-media installation that visualizes the rhythmic respiratory rate, as well as tidal volume – the amount of air displaced or exchanged in a single breath – of collective human breaths using soft robotics covered with auxetic structures (i.e. structures with a negative Poisson’s ratio, exhibiting the property of becoming thicker when stretched and narrower when compressed). The goals of this artwork are 1) to encourage audience interaction with collective breaths and user contemplation of the changing perception of respiration during the COVID-19 pandemic; and 2) to explore a new artistic approach using a combination of auxetic structures and soft robotics. The metaphors and artistic expressions of continuous inflation and deflation of elastomers, and the emission of light from the expansion of auxetic structures, invite an individual’s presence to become part of the larger collective installation, and to take a moment to consider underlying changing perceptions of breath during the pandemic. By employing an emerging technology, we want to encourage other artists to explore and modify techniques and methods generally only used among engineers, and to embrace them as new artistic approaches for realizing their own ideas.
Orrery Arcana is a system for real-time improvisational performance designed to facilitate a process analogous to automatic writing. The system includes custom software for real-time signal processing written in Max/MSP and Python and a self-made hardware controller that is integrated with a planetary gear train, which gives the performer control over timing and sequenced events through manual gear rotations. Each gear is equipped with a sensor plate with embedded light, magnetic, and capacitive-touch sensors. The sensors are primarily manipulated via tactile modular control objects in the form of concentric rings of colored acrylic and inset magnets that correspond to Tarot cards.
Roads in You is an interactive biometric-data artwork and physical 3D data visualization that matches veins to roads. Veins are beautifully complicated forms hidden under the skin, with many intersections that resemble the roads and paths surrounding us. In this artwork, participants scan their veins, capture the aesthetically formed vein lines, and observe existing roads on Earth that match the veins inside their bodies. The surprising results discovered through this process are visualized both in an interactive 3D map visualization and in 3D-printed sculptures offered as personal souvenirs as part of the artistic experience. Unique aspects of this artwork, such as its artistic expression and situated metaphors, invite audiences to experience a distinctive matching process and discover newly meaningful roads for themselves. Enhanced 3D data visualization and physical data sculptures are new additions to this updated version. This data-driven artwork aims to find interesting correlations between biometric data and map-based data.
“Footsie” is an interactive kinetic art installation. It aspires to connect people through counter-intuitive physical interaction mediated by flirtatious machines. It consists of a table and four robotic chairs; the chair legs carry out the motion of “footsie” and the arms caress the back of the participant. While it tackles the controversial culture and social norms around touching others’ bodies or being touched, an experimental bodily experience explores a new aesthetic interaction between a lonely human and a mischievous digital entity. The interaction is provocative in a slightly uncomfortable way, and the paradoxical feelings and stimulating sensations open simultaneous interpretations of the meaning of Human-Machine Interaction. Through nature-inspired design, such as the human spine, the flexible muscles of mollusca and a caterpillar’s undulating wave motion, these pliable structures were fabricated with 3D-printed meta-materials to create sensual gestures.
The performative art installation “is a rose” uses plants as natural interfaces in order to establish a direct relation between natural and technological systems. The installation visualizes digital-physical interaction that is not necessarily explicit – triggered by touch or air movement, by direct and non-direct manipulation – depicting the sum of all interactions over the course of the day. The technical realization consists of detecting the movement of plants caused by movements in their immediate vicinity and subsequently deforming a computer-generated sphere. The paper explains the several layers of meaning that motivated the artist when developing the artwork.
Topographie Digitale is an interactive installation that illustrates a hybridization between science and traditional textile craftsmanship. It uses electrically functionalized and pleated textiles as touch sensitive surfaces for interacting with a video-projected visualization. The pleated fabric, augmented by our custom chemical process, and the electronic sensing system give birth to a material with a mixed heritage that is both technological and traditional, and prefigure an emerging craft.
The combination of craft and technology, which yields a creole technique, offers an alternative, more resilient way of thinking about the place of digitalization in our society.
SESSION: Student Design Challenge
Extending the life of our clothes is the most effective intervention of all current sustainable textile practices. Taking this into consideration, this paper explores how we can upcycle and repair our clothes with e-textiles, and how to share these techniques with other makers and crafters. To do so, the author interviewed four visible mending educators about their teaching practices and personal experiences with mending. These interviews were then used to inform the design of the E-Darning Sampler for the TEI 2021 swatchbook, which includes examples of different e-textile darning patterns and functionalities made with darning looms. This paper contributes insights on how to design educational samples for encouraging sustainable making practices.
Slider Swatch: an inconspicuous switch design solution for electronic textiles: Development of a slider switch from a simple on/off mechanism to different outputs
In this paper we explain our approach to the development of this swatch, from the idea to the possible applications of the final product. We explain how we came up with the basic idea, which questions we asked ourselves and what kinds of materials we used to achieve the desired result. Furthermore, we explain what problems we had to deal with during the development process and how we found ways to solve them. Moreover, we look at how our swatch can be integrated into everyday life and what possibilities its use could offer. It is important to us to convey our way of thinking and working during the process, so we have also added a short step-by-step guide that explains how the swatch is made.
Using the power of pixels in the physical world: Bridging digital visual tools to control the world of physical computing
Mapping values is part of the craftsmanship of tangible interaction design. It is easy to imagine but hard to put into words, and even harder to code. This swatch presents ways that might allow us to move faster from idea to experience. Through direct manipulation, the experience becomes a sort of ‘clay’ that we can experience and manipulate simultaneously. This is done by using pixels to control the physical world, allowing software in the graphical domain to be used in the physical computing domain.
Fiber-sense: the exploration of the craft and material of fiberglass as a medium for tangible user interfaces: Towards the development of embedded circuits in fiberglass-based composites and designs
In this investigation we present a material-driven inquiry into fibreglass and the hand lay-up technique of composite lamination, as well as its affordances for the development of a tangible user interface. This was achieved by taking an in-depth look at woven fibreglass and its propensities as a craft material through its varying states of pliable fabric, wetted mesh and hardened composite. Through this, we propose the potential design of a pressure-activated fibreglass skin for rigid and semi-rigid fibreglass structures, primarily by means of a tri-layer laminated fibreglass structure with a polypyrrole-treated fibreglass core sandwiched between two embroidered conductive fibreglass layers.
The Undyeing Swatch utilizes a combination of visible Light Emitting Diodes (LEDs) and photocatalytic nanoparticles to diminish the color of organically dyed textiles. As such, this swatch explores ’undyeing’ as a design process that utilizes light and dye as materials for controlled interaction. The swatch itself consists of a knitted cotton I-cord coated with silver-doped titanium dioxide nanoparticles (TiO2/Ag) and dyed with hibiscus. The I-cord is used to encase a strand of LEDs and was then continuously woven into a swatch. When the LEDs are turned on, the light activates the nanoparticles, which in turn break down organic matter (the dye). This swatch provides a proof of concept for the undyeing process, which I believe could be an interesting area of future exploration for HCI researchers and artists alike.
Traditional handcraft and modern cyborg culture share a common goal: democratizing creation through demonstrations and education. Cyborg Crafts blends techniques from the fiber arts with cyborg-inspired technologies (e.g., open-source biosensing EEG headsets and RFID implants). Second SKIN (Soft Keen INteraction), intended to support this practice, is a handmade collection of four modular soft wearable sensors with a temperature-dependent dynamic display. Each sensor has a unique, sensor-specific outer shell texture based on non-woven textile techniques, and each supports a different sense: momentary switch, pressure sensor, pinch sensor, and a gesture-detecting, capacitive touch sensor. The interactions of pressing, pinching, and touching are encouraged by sensor-specific extruded designs that guide finger placement. The outer shell textures are made from a mixture of flaxseed mucilage and silicone rubber. Thermochromic pigment additives endow these passive devices with display functionality through the application of heat in excess of 86°F (30°C).
We present polymerized sports tape as a general purpose prototyping and sensor-design resource. We use it in a somewhat similar manner to how conductive copper tape is often used for fast production of lo-fi electrical prototypes. However, copper tape has a number of drawbacks if one is prototyping with soft materials, textiles, or even directly on the human body. Because it is not elastic, it does not adhere well to such materials. Sports tape, however, has the desired elastic qualities, and additionally is designed to adhere not only to arbitrary objects, but also to human skin. We polymerize sports tape to make it conductive. The resulting conductive tape has a series of electrical properties which make it an exciting sensor-material. It is piezo-resistive. This means its resistance changes with applied forces. It can also be used to create a voltage gradient, which enables simple and precise position sensing. This unique combination of mechanical and electrical properties makes it an exciting and unique prototyping resource.
Plant-Human Embodied Biofeedback (pheB): A Soft Robotic Surface for Emotion Regulation in Confined Physical Space
“pheB” is a robotic “plant-human, embodied, biofeedback” system to support the wellbeing of human inhabitants in confined physical spaces. This surface aims to support users’ emotion regulation and foster connections with nature by visualizing the internal states of plants through tactile, expressive movement. Unlike the 2D biofeedback visualization models currently in use, our research explores mindfulness practices through immersive, tangible interactions to increase therapeutic effectiveness. This pictorial traces the development of our design (to date) and presents results from an early user study conducted to (a) assess the prototype’s ability to lead breathing exercises, (b) evaluate preferences for different design features, and (c) refine the design of a questionnaire for future user testing. Findings suggest pheB was perceived positively and as an embodied extension of self during guided breathing exercises. This work contributes knowledge toward developing novel biofeedback modalities and offers a design exemplar for interactive artifacts that nurture meaningful relationships with nature.
This work presents Magnetform, a shape-change display toolkit designed to enable exploration of movement in soft materials. The toolkit allows designers with no technical knowledge to leverage their material expertise to experiment with shape-change. We present the toolkit design, and case studies with two design studios that used the toolkit for 15 days each. Through the presentation of their process, we reflect on two main themes: empowering designers to participate in shape-change exploration; and the developing practice of designing objects which integrate motion. We situate this work as part of the growing efforts in the TEI community to involve designers in the evolution of shape-changing interfaces, and demonstrate how material-oriented designers could contribute to this field, in which materiality plays a major role.
In this pictorial, we depict our design process on gaming wearables, starting from participatory design (PD) workshops through to concept creation. Wearables possess strong qualities for gaming such as performativity, sociality and interactivity. However, it is an emergent field, and there is a dearth of design knowledge, especially when it comes to designing wearables for mainstream gaming platforms such as game consoles. Our aim is to explore this field in depth with a research-through-design approach and to clearly exemplify how our design process progressed through different phases. Our results, apart from helping wearables designers to understand critical features for mainstream gaming, will also demonstrate the techniques and methods for extracting knowledge from PD workshops and incorporating it in a conceptual design phase.
What’s Going Ön?: An Experiential Approach to Perspective Taking in Urban Planning through Virtual Reality
Citizen participation in urban planning has become more important in the past decades. In recent years, the practice of co-creation has become a popular tool to structure this participatory process. Co-creation is a process that actively involves stakeholders in cooperatively sharing and developing opinions and perspectives towards an outcome that contributes to the overall development. Particularly in large and slow processes such as urban planning, co-creation risks becoming abstract and difficult for stakeholders to engage with. This study investigates an experiential approach to participation in urban planning. By building on embodiment to engage participants physically and expressively, this study aims to foster understanding between different points of view. With the design of the “What’s going Ön?” experience, we explore how Virtual Reality (VR) can play a role in the participation of citizens in the urban planning process of the Ön island in Umeå, Sweden. We present the designed experience and the qualities that play a role in using VR as a tool for participation in urban planning.
Textile Game Controllers: Exploring Affordances of E-Textile Techniques as Applied to Alternative Game Controllers
Invested in increasing access to computational literacy, this paper explores the development of a series of free public workshops in partnership with an equity-seeking group. These workshops cover e-textile techniques that lend themselves to making alternative game controllers, leading up to a concept-led game jam. We use research-creation approaches to prioritize creative exploration within a community group for marginalized makers. The goal of the research is to explore and elucidate the overlap between e-textiles and experimental game making. We discuss our playful use of workshops as a research method to iterate on the embodied experience of making on behalf of our participants. Our contribution maps the connections from workshop design and development, through the learning materials generated, to their application within an online game jam setting.
Digital tangible interfaces have been used to present or embed interactive stories for decades. While many researchers explored different ways of creating engaging interactive experiences, they focused more on the novelty of the interface and the new experience it brought for users, rather than the exploration of tangible interfaces as a language, or a medium, for storytelling. Through a visual journey of our creative explorations, this pictorial reflects upon the design and making of Letters to José, a tangible narrative built during a three-year practice-led research project. Located in the intersection of tangible interaction design and interactive storytelling, our work characterizes the relationships between the artifact, other artifacts, the body, and the space with respect to supporting the narrative experience. The outcome is an annotated visual typology of artifacts for storytelling in the context of tangible narrative, as a design category and toolbox for researchers and creators of tangible stories.
Living dislocated from family, friends or partners can result in negative emotions and a lack of physical, bodily closeness. We focus on materialising the negative consequences associated with living far apart in a textile form, and manifest those in two pairs of wearable artefacts: first, WARMTH, which simulates the bodily distance between remote people through a decrease in felt temperature, and second, BREATH, which embodies the bodily absence of a remote other through a decline in mechanical movement of textiles. The wearables are ‘discussion artefacts’ that enable conversation about, reflection on and exchange of personal, negative, vulnerable and challenging emotions connected to living far apart. The stance taken in this pictorial emphasises the necessity to focus not only on overcoming and bridging the distance between remote people through interactive artefacts; but also, to consider and manifest melancholic and possibly negative personal experiences.
In this pictorial, we offer an overview of the design of, and step-by-step guidance on crafting, a leather self-tracking device for pollen allergies. We designed the device to support patients’ daily lives, allowing them to track their symptoms, symptom severity, and medicine intake while on the go, multiple times per day. The self-tracker sends the data to an accompanying smartphone application to support patients in understanding their illness, to inform doctors’ consultations, and to collect data for allergy research. We chose natural leather as the material for this domestic medical device due to its associations with everyday artifacts, making it easier to integrate into allergy patients’ everyday lives.
Interactive installations face several challenges when used in public or semi-public contexts, such as users not noticing interactivity, or refraining from interacting for fear of social embarrassment. Several interaction models proposing solutions to these issues have been established over the years. However, most of these models focus on single-surface installations, which have to be approached for interaction. When the users’ own devices (ODs) are included in the installation, though, the interaction focus shifts. To determine the influence of ODs on the applicability of existing interaction models, we developed a multi-device quiz game, the “Weisskunig Quiz”, according to these models. The installation was deployed in a real museum exhibition and tested in two qualitative evaluations with random museum visitors and a school class. We describe our design approach, reflect on the differences we found, and propose an extension to single-surface interaction models that incorporates the users’ own devices.
Home decor defines how people experience and share spaces, with the decorative elements forming the ‘interface’ to the home. Despite the opportunities of embedding technology within these elements, research to date has not explored this fully. This paper brings home decor to interaction design, utilizing decorative elements as a vehicle to incorporate tangible interaction in domestic spaces. In an IKEA-like format, we designed a product catalogue of our own prototypes that illustrate the possibilities of the near future. These design illustrations should offer inspiration to those who wish to work with interactive materials (e.g. appearance-changing and soft-sensing), particularly in the context of interactive spaces. Through making, situating, and speculating, we show how designing interactive decor can be a promising area in Research-through-Design.
Insulin pumps are effective tools for the precise control of glucose levels for type 1 diabetes (T1D) patients. Unfortunately, many design and usability challenges still exist with these technologies. We investigated current shortcomings and limitations through survey (N=105), interview (N=7), and participatory workshop (N=3) data collection methods. Our findings revealed issues with current technology including wearability and accessibility in public, operation while performing demanding tasks, interruptions during social activities, continuity of maintenance, and interface operations. Using the data from our investigative work, we produced design criteria to develop a novel wrist-worn interface and separate pump design for a closed-loop system. We then evaluated the design through remote usability testing sessions (N=7) with insulin pump users. Our study aspires to inform the future design of novel insulin pumps that enable people with T1D to maintain better control of their glucose through consistent interactions with these tools during their everyday activities.
‘Crafting Stories’ investigates the potential of smart and electronic textile craftsmanship in the context of interactive books. A handmade interactive eTextile book, modelled on a handmade interactive textile book my late grandmother made, is presented as a prototypical artefact. The presentation of this practice-based research at the intersection of textile crafts, electronic and computational technology, and storytelling focuses explicitly on the interactive qualities, the materials used, and the story experience. It introduces and contextualizes ‘The Book My Grandmother Might Have Made’, annotates design decisions, and pictures the making of the book. I report on early explorations of users interacting with the crafted stories and close by discussing the possibilities and perspectives of crafting tangible, embedded, and embodied interactions within (eTextile) books.
Physical computing concerns the design of systems that can sense and respond to the world around them, which is why it is often used in interaction design projects in educational settings. However, students who encounter physical computing for the first time are typically not aware of the form factors and the potential for interaction of the various sensing and actuating possibilities. To complement existing touchpoints that these students have with physical computing, we present electronic smörgåsbords: boards that display collections of in-house physical computing components in an organised and interactive way to support the initiation of interaction design projects. The development of the boards allowed us to articulate four principles for their design, which are intended to inspire the development of future educational material.
Large-scale kinetic displays have been present for decades, both as mesmerising art installations and as information displays. In this paper we present KINEIN, a kinetic display whose large scale and indefinite deployment time distinguish it from the prototypical level of HCI research artefacts. It consists of 625 “tiles” or “pixels” mounted on a 25 x 25 matrix and operates through mechanical movement. Instead of a motor rotating each tile individually, KINEIN has a vertical column system that moves tiles sequentially. Situated at the entrance of a learning environment, KINEIN is not only a playful artefact for children to physically interact with, but also the first encounter for the school’s visitors. We make use of the pictorial format to illustrate and contribute our Research-through-Design (RtD) process as well as its implications regarding large physical scale and indefinite deployment.
This Pictorial discusses the outcomes of a distributed embodied ideation workshop. In this workshop, students of a Bachelor programme in Management by Design explained their theses by visualizing and externalizing the problem space using everyday objects. The use of objects to represent and understand complex design problems is well documented in the fields of Design and Human-Computer Interaction, but is often researched in co-located settings. What happens when the design tools used to externalize and represent complex design problems are everyday household objects, and the discussion happens in a distributed, asynchronous manner? How is shared meaning negotiated and established? In this Pictorial, we discuss six pictures taken of the visual representations that the students constructed, and reflect upon the distributed, asynchronous process that was employed to create, develop, and agree upon meaning.