TEI '23: Proceedings of the Seventeenth International Conference on Tangible, Embedded, and Embodied Interaction
Modular Tangible User Interfaces (TUIs), i.e., UIs made of small-scale physical modules, offer novel opportunities for tangible interaction thanks to their highly customizable form factor. Such modular TUIs have been proposed with different module shapes and different bonding strengths between modules. The problem we address in this paper is the lack of knowledge about how bonding strength and module shape impact usability. We present the first study exploring the impact of bonding strength and module shape on subjective user ratings when interacting with a magnetic modular prototype. We assessed three levels of bonding strength (low, mid, high) and two shapes (cubes and rounded cubes) in a controlled user study. Participants performed eight common manipulations found in the literature for (non-)modular TUIs. Experimental results showed that (1) cubic modules are overall easier and more satisfying to manipulate, except in precision and bending tasks, and (2) low strength impairs UI solidity, whereas high strength impairs precision tasks with cubic modules.
Abstract mathematics can be difficult to grasp, in part because it relies on symbols and formalisms that are powerful yet meaningless to novices unless grounded in concreteness. Although a wide corpus of research focuses on concreteness in mathematics education, the notion of concreteness can be apprehended in various ways, and it is not yet clear which specific aspects of concreteness help learners. In this paper, we explore embodiment as a form of concreteness for grounding abstract mathematics. First, we designed and evaluated an embodied learning activity on graph theory. Through a user study with 89 participants, we then compared three approaches: abstraction, manipulated concreteness, and embodied concreteness. Our results show that, compared to abstraction, both forms of concreteness increase learners’ perceived attention, confidence, and satisfaction. However, only embodied concreteness increases perceived relevance and grounding. Moreover, unlike manipulated concreteness, embodied concreteness impairs neither learning outcomes nor transfer abilities.
Adaptive Soft Switches: Co-Designing Fabric Adaptive Switches with Occupational Therapists for Children and Adolescents with Acquired Brain Injury
Acquired brain injuries are complex, largely affecting motor and cognitive functioning. Occupational therapists often attach switches to electronic devices so that people with disabilities can activate and interact with toys and electronics. However, current switches on the market are expensive, break easily, and are difficult to customize. We ran two co-design workshops and follow-up interviews with 14 occupational therapists specializing in students with acquired brain injuries. In phase one, the occupational therapists built three soft switches and brainstormed iterations. In phase two, we gained valuable insights from the occupational therapists about those iterations. This paper contributes to Human-Computer Interaction a case study, design guidelines to support co-design with occupational therapists, and a discussion of the potential of adaptive soft switches. This work contributes to the growing literature on supporting occupational therapists as makers and on how researchers can support them during the co-design process.
Although much work has focused on designing touch interfaces for primates, little is known about what physical computer buttons for monkeys should look like. Here, we employ the rapid prototyping method commonly used in human–computer interaction to design tangible buttons that allow monkeys to interact with computer enrichment. Our findings reflect on the process of adapting rapid prototyping from humans to animals and on how computer buttons for monkeys might look. On this basis, we make suggestions for monkey buttons, highlighting colour and pull/swing over push/touch interactions. We also reflect on lessons learned from transferring prototyping across species, such as the need to iterate on a few variables and for initial prototypes to be varied, and speculate on how to balance the needs of the designer (human) and the user (animal). More broadly, this paper builds upon HCI prototyping techniques for unconventional users, creating a method for rapid iterative prototyping with animals.
Researchers support race-, gender-, and age-diverse groups of people in creating with maker electronics. These groups include older adults, who are often dismissed as uninterested in or incapable of learning new technologies due to ageist stereotypes. One approach, often involving e-textiles, leverages crafting as a bridge to broaden participation in making. We investigated ways to broaden participation in maker electronics for older adults by remotely co-designing e-textile projects with 6 older adult crafters over the course of 5 workshop sessions, for a total of 45 hours. We developed a deeper understanding of their practices, identifying a Planner-Improviser Spectrum describing how they approached their craft, and created medium-fidelity prototypes. Our design implications draw on our participants’ crafting experience and their experience in the workshop to highlight what e-textile toolkit designers can learn from skilled older adult crafters, such as selecting familiar materials, supporting aesthetic goals, and making electronics more attainable.
Scissor mechanisms are extension mechanisms commonly used in lifts, robotic grippers, and mechanical shape-changing toys. The scissor mechanism has several unique features when applied to shape-changing interfaces, namely (1) a simple mechanism with 1-DoF transformation, (2) expandable transformation capability for interaction design, and (3) a modular and customizable linkage design. In this paper, we present Xs, a novel type of shape-changing interface based on scissor mechanisms. The architecture of Xs is introduced to construct a range of configurations based on the concept of global and local segment modules. Our implementation introduces modular prototypes that allow rich geometric configuration and I/O customization for users and designers to construct different transforming, interactive systems. Based on the prototypes, we present a variety of applications, such as a shape-changing gaming controller, scalable and adaptable sensing, and mobile attachments.
Multi-sensory experiences underpin embodiment, whether with the body itself or technological extensions of it. Vocalists experience intensely personal embodiment, as vocalisation has few outwardly visible effects and kinaesthetic sensations occur largely within the body, rather than through external touch. We explored this embodiment using a probe which sonified laryngeal muscular movements and provided novel auditory feedback to two vocalists over a month-long period. Somatic and micro-phenomenological approaches revealed that the vocalists understand their physiology through its sound, rather than awareness of the muscular actions themselves. The feedback shaped the vocalists’ perceptions of their practice and revealed a desire for reassurance about exploration of one’s body when the body-as-sound understanding was disrupted. Vocalists experienced uncertainty and doubt without affirmation of perceived correctness. This research also suggests that technology is viewed as infallible and highlights expectations that exist about its ability to dictate success, even when we desire or intend to explore.
Machine learning (ML) provides designers with a wide range of opportunities to innovate products and services. However, the design discipline struggles to integrate ML knowledge in education and prepare designers to ideate with ML. We propose the Mix & Match Machine Learning toolkit, which provides relevant ML knowledge in the form of tangible tokens and a web interface to support designers’ ideation processes. The tokens represent data types and ML capabilities. By using the toolkit, designers can explore, understand, combine, and operationalize the capabilities of ML and understand its limitations, without depending on programming or computer science knowledge. We evaluated the toolkit in two workshops with design students, and we found that it supports both learning and ideation goals. We discuss the design implications and potential impact of a hybrid toolkit for ML on design education and practice.
The spectrum of applications for social drones is broadening as they become an increasingly accessible technology. To expand on the immensely rich but under-researched field of Human-Drone Interaction (HDI), we present a minimal, explorative, and anti-solutionist design. We describe the first steps of a Research through Design (RtD) project focused on the concept-driven exploration of an unlikely pairing: drones and breathing. We present Wisp, a micro-drone probe controlled by a user’s breath. Informed by experts on breathing and drawing inspiration from soma design, Wisp serves as a platform for developing defamiliarising views of intimate somatic interactions between humans and drones. In this paper we describe the initial studies in an RtD development process, including expert interviews, prototyping, and informal evaluations. We contribute to the field of HDI a composite design framework combining soma design and slow technology for exploratory, somatic, slow interactions between humans and drones.
Investigating a Force-Based Selection Method for Smartwatches in a 1D Fitts’ Law Study and Two New Character-Level Keyboards
Selecting small targets is difficult on tiny displays due to the “fat-finger problem”. In this paper, we explore the possibility of using a force-based approach to target selection on smartwatches. First, we identify the most comfortable range of force on smartwatches. We then conduct a 1D Fitts’ law study to compare the performance of tap and force-tap. Results revealed that force-tap is significantly better for selecting smaller targets, while tap outperforms force-tap for bigger targets. We then developed two new force-based keyboards to demonstrate the feasibility of force input in practical scenarios. These single-row alphabetical keyboards enable character-level text entry through slides and varying contact force. In a user study, these keyboards yielded about 4 wpm with an error rate of about 2%, demonstrating the viability of force input on smaller screens.
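For readers unfamiliar with Fitts’ law studies such as the one referenced in the abstract above, the following is a minimal sketch of the Shannon formulation commonly used to quantify target difficulty and selection performance. The abstract does not state the paper’s exact computation, so the formulas and the sample numbers here are illustrative assumptions, not the authors’ method.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits.

    distance: movement amplitude to the target (same unit as width)
    width: target width along the movement axis
    """
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits/s: index of difficulty over mean movement time (s)."""
    return index_of_difficulty(distance, width) / movement_time

# Illustrative numbers: a 6 mm target, 30 mm away, selected in 0.8 s on average.
id_bits = index_of_difficulty(30, 6)   # log2(6) ≈ 2.58 bits
tp = throughput(30, 6, 0.8)            # ≈ 3.23 bits/s
```

In such studies, smaller targets yield a higher index of difficulty, which is how a comparison like tap versus force-tap can be broken down by target size.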
ShiftTouch: Extending Touchscreens with Passive Interfaces using Small Occluded Area for Discrete Touch Input
We present ShiftTouch, an attachment-type passive interface that provides multiple inputs for capacitive touchscreens with minimal screen occlusion. ShiftTouch utilizes multiple linear electrodes to control the fine displacement of the touch position. A touch input is activated under the electrodes when several adjacent electrodes are grounded simultaneously. Each input area shares several electrodes with neighboring input areas, and the touchscreen identifies each one by detecting the fine displacement of the touch position. ShiftTouch effectively reduces the occluded area while inheriting the advantages of existing touch-extension interfaces: being battery-free, freely detachable, and easy to construct. Depending on the number of inputs to implement, ShiftTouch can alleviate screen occlusion by up to 80.5% compared to existing approaches using finger-sized electrodes.
Embodying Wind through Flare: How Natural Phenomena Can Contribute to Enriching the Design of Interactive Systems
Since the divide between nature and society arose, humans have objectified nature. This continuing separation has corrupted our relationship with our natural habitat. While various design research projects bring elements of nature into our cultivated environment, most incorporate nature as a separate entity. Although integrative collaborations with nature can be found in interactive projects designed from non-anthropocentric perspectives, they are not specifically aimed at connecting humans with nature. We argue that the same forces that shape our natural habitat allow us to be closer to nature, and we propose the participation of natural phenomena as a strong concept for the design of interactive systems. To illustrate this concept, we present Flare, a silk dress embroidered with dandelions made up of LEDs that react to wind movements. With Flare as an evocative object, we draw parallels with existing theories and explore opportunities for designers to create interactions that help us reconnect with nature.
Pandemic lockdowns created new barriers for HCI researchers, but also provided new opportunities for deeper engagement and reflection in our home environments. Five participants were introduced to a design brief on self-isolation and engaged 12 of their friends and family in the design process of in-the-isolated-wild deployments. By analysing the design process, we found that, while ‘making from home’, our participants noticed the subtlety of interactions and materials, the processes of remembrance embedded in craft, the use of imperfection and metaphor in homeware, and how ambient presence can provide emotional support. We then conducted a follow-up study on the benefits and limitations of a crafting approach while ‘making from home’ and discuss the tensions that novices experience while designing TUIs in such an environment. Our results expand the literature by highlighting the benefits, limitations, and trade-offs of user-led design, DIY user empowerment, and harnessing the power of craft.
Geppetteau: Enabling haptic perceptions of virtual fluids in various vessel profiles using a string-driven haptic interface
Handling liquids in vessels produces unmistakably fluid tactile sensations, which stimulate essential perceptions in home, laboratory, and industrial contexts. Feeling virtual fluids would similarly enrich experiences in virtual reality. We introduce Geppetteau, a novel string-driven weight-shifting mechanism capable of providing perceivable tactile sensations of handling virtual liquids within a variety of vessel shapes. This mechanism widens the range of augmentable shapes beyond the state of the art of existing mechanical systems. In this work, Geppetteau is integrated into conical, spherical, cylindrical, and cuboid vessels; variations of these shapes are often used for fluid containers in our day-to-day lives. We studied the effectiveness of Geppetteau in simulating fine- and coarse-grained tactile sensations of virtual liquids across three user studies. Participants found Geppetteau successful in providing congruent physical sensations of handling virtual liquids across a variety of physical vessel shapes and virtual liquid volumes and viscosities.
As an emerging field in HCI, digital craft is often the subject of debate over its concept, leading to distinctive practices. In this paper, the authors argue that hybridizing digital power (computing and fabrication) with the human skills of ideation and hand-making sheds light on an important direction for future inquiry into digital craft. In particular, the reported project focuses on bamboo craft making to explore the potential of digital craft through constructive design research methods. Through collaborative making with craftspeople, methods of hybridizing digital power and human skills are explored by decoding and recoding the bamboo making process with inventive digital intervention, which enriches the forms of bamboo artifacts and integrates digital fabrication such as 3D printing. Meanwhile, digital toolkits for bamboo weaving and a digital platform that makes computational design and craft more compatible are created and reported. The authors conclude from the reported hybrid craft cases that this hybridization has the potential to stimulate a new generation of artisans.
Travel in time is possible. We do it every day by recollecting past experiences, called episodic memories. Episodic memories, key to defining self-identity, are very sensitive, especially to the effects of aging and dementia. What are the ways to relive our experiences, to make them brighter? While much HCI work has been dedicated to developing memory aids, it has not explored physicalizing episodic memory by means of vibration. We present a method to map past experiences to specific vibrotactile patterns. The method includes in-depth interviews, materialization, and co-design sessions. As a result, eight vibrotactile patterns were co-designed to represent four positive and four negative memory episodes. We conclude by reflecting on the process, commonalities in the mappings, and future work. We hope our research insights will inspire novel methods and technologies that give tangible forms to memorable experiences.
Sentinel Species: Towards a Co-Evolutionary Relationship for Raising Awareness About the State of the Air
Interactive technologies are increasingly being used as discursive objects for raising awareness about the environment in the cultural sector, but little is known about the user’s lived experience during an interaction. In this study, we present the development and evaluation of an interface designed to raise awareness about the environment within a speculative art installation. For this purpose, we drew on the concept of sentinel species, specifically the miner’s canary, to enable a multisensory experience with the state of the air. We then evaluated the interface with 14 participants while interacting in a prototypical arrangement in the laboratory. Overall, the findings indicate promising directions towards a sentinel-species-mimicking interface that communicates the state of the air through its physiological behavior and thus also engages with the user’s empathy on a cognitive and emotional level. Based on the findings, we highlight the implications of this study and point to further directions for human–atmosphere interactions.
To Touch or not to Touch? Differences in Affordance Resonating with Materialities. Hard and Soft Sensors embedded in an Artistic Research Setting
In this paper, we present theoretical basics, creative development processes, and partial evaluation results of human behavior and attitude in a dramaturgically staged interactive environment. The installation is set up in a media lab and the separate rooms are carefully designed to provoke emotional and cognitive human reactions. The corresponding installed sensor-actuator system includes different types of embedded sensors and tangible interfaces to create a fully embodied experience for participants walking through the artistic research facility. We evaluate and compare two opposite design approaches to investigate the impact of dramaturgy, design strategies, and furnishing on affective human-machine correlations and appropriation processes.
Effective privacy protection in dynamic UbiComp environments requires users to be able to manage their privacy seamlessly across diverse contexts. To support this, designers need to go beyond GUI-based interactions and utilise tangible and embodied interactions. To help designers in such endeavours, we present the TTP toolkit: a card-based ideation kit for generating designs for tangible privacy management tools. The toolkit translates the Privacy Care framework for tangible-supported privacy management into a game intended to support designers in developing TUI privacy management tools. We demonstrate the use of our toolkit through 10 online participatory workshops with 22 interaction designers. Our results show that the toolkit was effective in supporting participants to creatively and collaboratively generate meaningful conceptual designs of tangible tools for privacy management.
Body maps are a popular tool in body-centric design, facilitating a sensitization and expression of felt sensations and emotions. Yet, they also bring forth assumptions about the body and our somatic experience. Based on an open and exploratory design ideation inquiry, we have started to explore how body maps could be advanced so as to cater to a plurality of bodies and aspects that shape somatic experiences. We present an annotated portfolio featuring six design themes (temporality, sociality, representativeness, granularity, context, focus). These themes help us examine implicit assumptions of current body maps, and offer possible alternatives for what future body maps could become. We contribute our themes, inspirational design ideas and practical design techniques to help craft novel body maps. Our contributions can serve as inspiration to others, towards advancing body maps as a research tool for body-centric interaction design.
The Creation of a Holistic Interactive Dining Experience with Shape-Changing Food Materials at Restaurant ALCHEMIST
Various studies have demonstrated how to enrich edible materials with interactivity and shape change using digital technologies such as 3D printing. We illustrate a collaborative and iterative process of developing a dish for the two-Michelin-starred restaurant ALCHEMIST in Copenhagen, involving designers, material scientists, and chefs. They combined expertise in creative technologies, material behavior, and culinary arts to create novel dining experiences. Through a playful experience workshop, participants generated ideas for incorporating shape-changing food materials into their dishes. Digital prototyping technologies combined with material experiments resulted in an edible flower whose petals transform it into a tartelette on contact with low- or high-pH solutions. We contribute a vision of how interactive dining experiences can be enriched with material-driven explorations, digital technologies, and experience design methods to create novel opportunities for human-food interaction. We reflect on the role of interaction designers when collaborating with chefs and material scientists towards achieving this vision.
The conversational nature of sketches is a widespread topic of research. Understanding drawing as a cognitive activity is commonly accepted, and many of the most extensively used methods within Human-Computer Interaction recruit sketching as a technique for ideation, explanation, documentation, and conversation. To further develop the use of this illustration process as a tool of knowledge production, we suggest a novel sketching method. We present Conversational Composites: a flexible method grounded in the material and tangible qualities of sketching in different forms and media, creating physical and digital layers of conversation between participants. We present and reflect on the proposed method through an applied case of a conversation between a PhD student and her supervisor, and offer suggestions on how it may be adapted and appropriated by other researchers in the HCI community.
The fast-paced lifestyle and the conveniences of urban food storage contribute to an increase in domestic food waste, wherein we end up not consuming everything that we buy. This issue has been tackled within HCI through different awareness tools; however, the design of domestic food storage in itself has received limited attention from designers. We present FoodChestra, a smart open pantry that displays perishable food items in shared households. FoodChestra supports multimodal interactions and offers timely feedback to help users understand and reflect on their shared shopping and eating practices. In this pictorial, we present the key design decisions that were undertaken to develop the five main components of FoodChestra. Through this work, we aim to inspire new design thinking for reimagining the food storage systems of urban households that can encourage people to reflect on their food consumption practices.
Exploring the Embodied Experience of Walking Meetings through Bodystorming – Implications for Design
Walking meetings are a promising way to reduce unhealthy sedentary behavior at the office. Some aspects of walking meetings are however hard to assess using traditional research approaches that do not account well for the embodied experience of walking meetings. We conducted a series of 16 bodystorming sessions, featuring unusual walking meeting situations to engage participants (N=45) in a reflective experience. After each bodystorming, participants completed three tasks: a body map, an empathy map, and a rating of workload using the NASA-TLX scale. These embodied explorations provide insights on key themes related to walking meetings: material and tools, physical and mental demand, connection with the environment, social dynamics, and privacy. We discuss the role of technology and opportunities for technology-mediated walking meetings. We draw implications for the design of walking meeting technologies or services to account for embodied experiences, and the individual, social, and environmental factors at play.
Interactive Stained-Glass: Exploring a New Design Space of Traditional Hybrid Crafts for Novel Fabrication Methods
Stained glass is a craft with a wealth of opportunities that blends seamlessly into our everyday environments. Despite sharing similar tools and materials with other types of hybrid crafting, authentic stained glass is underexplored in HCI. We introduce stained glass to TUI researchers, explain the fabrication process thoroughly (covering the traditional methods of both copper foil and lead came), and explore its potential as a conductive substrate for interactivity. We contribute fabrication techniques supporting various circuit connection traces, light diffusion methods, interactivities, and aesthetic qualities. We also introduce three potential applications as proof of the method’s validity in different contexts. We follow this with a discussion of experiential outcomes and of the importance of creative practices in the development of interactive artefacts.
My Body, My Baby, and Everything Else: An Autoethnographic Illustrated Portfolio of Intra-Actions in Pregnancy and Childbirth
Although I have been interested for many years in technology and its impact on everyday moments, I had not yet had the chance to critically and systematically focus on a coherent and self-contained experience. In March 2021, I bought a digital pregnancy test for the first time, and an autoethnographic journey began. It was the first year of my PhD and I was expecting my third child. In this pictorial, I offer an illustrated and annotated portfolio of my pregnancy, from test to birth, with an emphasis on the technology entangled in the stories. Framed by Agential Realism, I identify the agential cuts in the illustrations. I conclude with an appeal for annotated portfolios of intra-actions, and for other HCI researchers to share their own socio-technical assemblages around fertility, pregnancy, and childbirth. I use my account of this process as a step towards making the intra-actions in pregnancy and childbirth a matter of care for the TEI community.
The development of smart textiles with interactive capabilities has introduced new ways to embed functional fibers, electrically active materials, and electrical components within textiles. Nevertheless, this ubiquitous integration brings challenges regarding material resources and their disposal. To seek bio-based alternatives to current unsustainable components, the present work explores the integration of carboxymethyl cellulose (CMC) optical waveguides into yarn structures to create textile-based optical sensors.
This pictorial visually showcases the development process of initial proof-of-concept samples, framed by the double-diamond methodology for design innovation. Following an interdisciplinary approach, the research combined methods from practice-based design research and empirical material science. As a result, we present novel bio-based woven smart textiles, demonstrate their optical touch-sensing capabilities, and discuss their potential for sustainable smart textile development.
We propose Theory Instruments, a novel kind of tangible tool, to bring perspective shifts and new vocabulary into design projects and organisations. They aim to bridge disciplinary gaps among practitioners. In this pictorial, we present six Theory Instruments with instructions in cutouts, so practitioners can put them into action. To test the Theory Instruments, we conducted workshops with ten companies from various sectors. With the tangibles, practitioners investigated the interconnectedness of their products, the use contexts, and user experiences. They also unfolded internal interaction design practices. We contribute to Design Anthropology by mapping anthropological theories onto material structures that increase sensitivity to human practices.
A rising number of HCI scholars have begun to use materiality as a starting point for exploring design’s potential and constraints. Despite this theoretical flourishing, practical design processes and instruction for beginner practitioners remain scarce. We leverage the pictorial format to illustrate our process of crafting Puffy, a bio-inspired artifact featuring a cilia-mimetic surface that expresses anthropomorphic qualities through shape changes. Our approach consists of three key activities (analysis, synthesis, and detailing) interlaced recursively throughout the journey. Using this approach, we analyzed different input sources, synthesized peers’ critiques and self-reflection, and detailed the designed experience with iterative prototypes. Building on a reflective analysis of our approach, we conclude with a set of practical implications and design recommendations to help other practitioners initiate their own investigations into interactive materiality.
While nature can benefit people both mentally and physically, contemporary society has become increasingly disconnected from it. To rebuild a stronger connection with nature in our everyday lives, we introduce FloraWear, a do-it-yourself wearable living interface that enables people to connect easily and closely with plants. This pictorial introduces how knowledge is built and shared with others using hybrid craft and fabrication, illustrates the material experiments and design development behind FloraWear, and discusses how it affects wearers. We then summarize how FloraWear can help catalyze a shift in people's perspectives towards nature. By developing emotional ties to their wearable plants, FloraWear wearers begin to understand that both they and their plants are part of an ecosystem.
This pictorial explores the processes and methods of designing for non-human animal users and human visitors through the example of Nature Scenes, a set of urban outdoor interspecies shelters and feeding stations designed by us, the Interaction Research Studio, commissioned by Jane Withers Studio for Brompton Design District 2019 as part of the London Design Festival (LDF). In this pictorial we document the various steps of designing, prototyping, and deploying the ‘scenes’, and examine the animal-human relationship in the context of product and interaction design, with the intention of advancing the current discourse on interspecies and post-human practice-based design research. We unpick the complexities and contradictions that come with designing for animals as well as for humans, and suggest how Post-Human Computer Interaction (PHCI) can be approached going forward.
Though a taboo topic, women's masturbation is the most effective technique for producing orgasms among all sexual behaviors. This project explores how somaesthetic interaction design can contribute to designing for women's sexual pleasure, challenging androcentric discourses on women's sexuality as well as the desexualization of women with dis/abilities. In the study, the first author, who identifies as a woman with an invisible disability, experiments with other women's masturbatory techniques using her own body as a design resource. She then articulates that intersubjective engagement, using her own body as an artistic medium, in the form of Embodied Embroidery, a practice inspired by women's artmaking that seeks to foreground the aesthetic dimensions of experiential knowledge to support theory-making in design. Guided by three key features of somaesthetic interaction—first-person perspective, intersubjectivity, and articulation—this pictorial contributes to pleasure activism in the domain of HCI and interaction design.
“Periodic Fable Discovery” Using Tangible Interactions and Augmented Reality to Promote STEM Subjects
This pictorial presents the design and graphic interface of Periodic Fable (PF), an educational game that promotes a constructivist approach to engaging young children with Science, Technology, Engineering, and Math (STEM) subjects. The game presents children with scientific content supported by an exploratory activity using physical cubes manipulable through Tangible Interaction and Augmented Reality. The game's objective is to entertain children while engaging them with the basics of chemistry and the Periodic Table. We reflect upon the combination of these immersive technologies, game-play mechanics, and aesthetics geared towards conveying accurate scientific information through a ludic and entertaining approach. The quantitative and qualitative results of a study with 20 children showed significant positive results in the participants' learning outcomes and engagement, encouraging us to continue evaluating our design system as a tool that can promote STEM Education.
We present an interactive breastplate grown and fabricated from SCOBY (Symbiotic Culture Of Bacteria and Yeast) biofilm over the course of 13 weeks. Challenging the fail-fast and rapid prototyping trends that inhabit HCI research, we instead explore what it means to design at the pace of another living organism. To create our wearable, we combined DIY-Bio knowledge with digital fabrication methods and traditional crafting techniques in order to tune aspects of the SCOBY such as strength, flexibility, shape, color, and electrical conductivity. We then embedded sensors and LEDs within the SCOBY to create a wearable that visually signals based on touch interactions. We demonstrate the interactivity of the breastplate in an everyday context, where differing light responses result from the wearer being hugged, tapped, or brushed. Lastly, we analyze the biodegradability of the SCOBY breastplate and observe the limitations and opportunities of SCOBY as a grown microbial interface.
A participatory data physicalization, or PDP, is a physical visualization that allows people to physically participate in the creation of the visualization, by directly encoding their data. PDPs offer a way to engage a community with data of personal relevance that otherwise would be intangible. However, their design space has only begun to be explored, and most prior work shows relatively simple encoding rules. Therefore, we present a design exploration of different ways of faceting contributed data: by week, day, or person. Specifically, we designed Edo, a PDP that allows a small community to contribute their data to a physical visualization showing the climate impact of their dietary choices. In this pictorial, we elaborate on the design process of Edo and reflect on the findings of a three-week deployment of different layouts of Edo. Fabrication instructions and data are available under CC-BY 4.0 at osf.io/q5fr6.
Fashion is driven by a narrative, i.e. a story or idea that the designer wants to convey to the audience. Fashion-tech now adds another dimension to this narrative through dynamically changing aspects of the garments. Many factors of presentation in a runway show affect how fashion-tech garments communicate a story to the audience. In this pictorial, we review a set of twenty-eight storytelling fashion-tech garments. We identify, catalogue, and categorize the factors designers used to convey stories to the audience from the runway. The design space consists of three levels: (1) the artifact-level, (2) the viewer-level, and (3) the context-level. The design space addresses how designers show their story through fashion-tech garments and how audience members see and know those messages. Our work contributes a list of considerations that fashion-tech designers must address early in their design process to effectively convey their story during a runway show presentation.
SESSION: Work in Progress
Embedded devices are now commonplace, and hardware prototyping toolkits have become a popular approach for hobbyists and professionals to create embedded hardware prototypes. However, moving from prototype to small-scale manufacture introduces complexity and cost, restricting embedded device development ‘beyond the prototype’. Challenges include the need to design a custom PCB for manufacture, and the design and fabrication of a device enclosure to ensure it is robust enough for deployment. In response, we present MakeDevice: a web-based tool that leverages an existing modular hardware prototyping platform, Jacdac, to provide a low-complexity route to generating a custom ‘carrier’ PCB upon which modules can be mounted and electrically connected. MakeDevice also automatically generates CAD files for custom enclosures with apertures to suit. We show how such enclosures can be generated using 3D printing and 2D stencils. In this way, MakeDevice lowers the barriers to moving from prototype to viable low-volume deployment of embedded hardware.
This paper aims to break the boundary between VR and IoT devices by creating interactions between 2D screens and the 3D virtual environment. Since headsets are expensive and not accessible to everyone, we aim to lower the barrier for people to take part in the VR world using their smartphones. While existing research mostly focuses on creating new haptic devices for VR, we instead leverage existing IoT devices that people already own, such as smartphones, to make VR technologies more accessible to multiple IoT users. Our project comprises two parts: azimuth detection and communication between the VR environment and IoT devices.
Sensory extensions enhance our awareness by transforming variations in stimuli normally undetectable by human senses into perceivable outputs. Similarly, interactive simulations for learning promote an understanding of abstract phenomena. Combining sensory extension devices with interactive simulations gives users the novel opportunity to connect their sensory experiences in the physical world to computer-simulated concepts. We explore this opportunity by designing a suite of wearable sensory extension devices that interface with a uniquely inclusive PhET Simulation, Ratio and Proportion. In this simulation, two hands can be moved on-screen to various values, representing different mathematical ratios. Users explore changing hand heights to find and maintain ratios through visual and auditory feedback. Our sensory extension devices translate force, distance, sound frequency, and magnetic field strength to quantitative values in order to control individual hands in the computer simulation. This paper describes the design of the devices and our analysis of feedback from 23 high-school aged youth who used our designs to interact with the Ratio and Proportion simulation.
Children with cerebral palsy (CP) can experience complex gait deviations and need to go through intensive lower extremity rehabilitation exercises to develop and enhance their motor control in daily living. However, most of them cannot persist in the regular repetitive exercise sessions using hospital-based equipment. To provide a playful and attractive rehabilitation environment, an interactive carpet with interchangeable covers and varied step lengths is introduced to motivate children for lower extremity training. The vibrant colours, engaging games, visual and audio feedback are designed to increase the carpet-human interaction. This carpet can support gait exercise with five types of step lengths, which improves its accessibility and usability for children with CP.
Design-Based Learning (DBL) is a promising learning approach for nurturing 21st century skills. It requires students to leverage social regulation of learning. However, students in elementary education still need to develop these skills. Tangible User Interfaces might help students move up in the developmental trajectory by providing scaffolds and supporting positive interdependence. In this paper, we present the considerations and design of TangiTeam, a tool that aims to support social regulation during elementary school DBL activities. We hope to inspire teachers and designers to create scaffolds for social regulation.
Smartphones have limited input vocabularies compared with PCs. In this paper, we propose a method for estimating the stylus angle of a passive brush from capacitive images. The technique allows us to expand the input vocabulary without complicating the smartphone or the stylus. We conducted experiments that showed the estimation error (MAE) was 6.63° for pitch and 5.97° for roll. Two applications were implemented to show the feasibility of the technique.
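The abstract above does not spell out how pitch and roll are recovered from the capacitive image. As a rough, hypothetical illustration of the general idea, the in-plane orientation of an elongated contact blob can be estimated from weighted second-order image moments of the capacitive frame; the function and variable names below (`estimate_roll`, `cap_img`) are illustrative and not taken from the paper:

```python
import numpy as np

def estimate_roll(cap_img):
    """Estimate the in-plane orientation (roll) of an elongated contact
    blob in a capacitive image from weighted second-order image moments.

    Hypothetical sketch only: the paper's actual estimator is not
    described in the abstract. Assumes cap_img contains at least one
    non-zero (touched) cell.
    """
    ys, xs = np.nonzero(cap_img > 0)
    w = cap_img[ys, xs].astype(float)              # capacitance values as weights
    cx = np.average(xs, weights=w)                 # weighted centroid
    cy = np.average(ys, weights=w)
    mu20 = np.average((xs - cx) ** 2, weights=w)   # central moments
    mu02 = np.average((ys - cy) ** 2, weights=w)
    mu11 = np.average((xs - cx) * (ys - cy), weights=w)
    # Major-axis orientation of the blob, in degrees, in (-90, 90]
    return np.degrees(0.5 * np.arctan2(2.0 * mu11, mu20 - mu02))
```

A blob elongated along the sensor's x-axis yields roughly 0°, one along the y-axis roughly 90°; pitch estimation would additionally need a cue such as blob size or intensity falloff, which the abstract does not detail.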
Bringing Movement to Digital Tasks at the Office: Designing an Acceptably Active Interface Interaction for Sending Emails
While working on a computer typically involves sitting for prolonged periods of time, sedentary work routines are associated with numerous health issues. To address this societal concern, existing solutions trigger physical activity as a break from work, rather than a part of it. In this research, we explore a vision for physically active ways of working, by transforming mundane digital tasks into physically active ones. As a research artifact and design exemplar, we present A2-I2, an innovative tangible system for sending emails. After loading their email onto a physical “letter” token, office workers must walk to a physical mailbox located in the office space. Understanding what design qualities influence the experience and acceptability of such systems is a necessary step toward the design of acceptably active interface interactions. We report on a preliminary user test with 8 participants. With this project, we aim to inspire future tangible and embodied systems addressing the timely issue of sedentary behavior at work.
As human beings, we are designed to be empathic, to connect emotionally with each other, and to share affection. This is part of our nature and makes us feel like part of a group. New forms of technology enable us to share a variety of data in real time. Some data, such as heart rate, can relate to our affective state. This research asks whether technology-mediated interactions might help us feel more emotionally engaged and intimately connected, especially when sensorial feedback about our inner state is provided. In this context, we present an interactive device that provides two users, through pulsing lights, with feedback on their heartbeats and the related level of synchrony (fig. 1). The prototype could be used to assess the effectiveness of this technology and to improve the feeling of connectedness and intimacy between two users.
Hand rehabilitation aims to improve patients’ hand and arm skills, improve adherence to training and increase their participation in activities of daily living (ADLs). A novel way of achieving this is to employ ADL-based interactive rehabilitation tools and show patients how their improved skills can be transferable to daily tasks. Hence, in this paper, we report the results of a set of studies carried out with six healthy individuals and two physiotherapists to discover the potential of integrating ADLs into interactive hand rehabilitation tools. Consequently, we designed two interactive drinking-based concepts and tested those with three stroke patients. We found that ADL-based training couples particularly well with functional training. Still, selecting appropriate functional exercises that match the ADL is an essential task to transfer training outcomes to a functional setting. Based on our findings, this paper highlights that ADL-based interactive hand rehabilitation training must minimally deviate from the original ADLs.
Facebook Data Shield: Increasing Awareness and Control over Data used by Newsfeed-Generating Algorithms
Social media platforms' newsfeeds are generated by AI algorithms, which select and order posts based on user data. However, users are often unaware of what data is collected and employed for this aim, nor can they control it. To open up discussions on what data users are willing to feed the newsfeed algorithm, we created the Facebook Data Shield, a human-size interactive installation where users can see and control what type of data is collected. By pressing buttons, data categories and/or data variables can be (de)activated. An outer rim with lights gives users feedback about the level of personalization of the resulting newsfeed. We performed a preliminary study to gain insights into what data users are willing to share, their preferred level of control, and the effect of such an installation on users' awareness. Based on our findings, we discuss implications for design and future work.
Free-hand mid-air Qwerty enables entering text in virtual reality without the use of controllers. However, it is much slower and more error-prone than its physical counterpart, primarily due to the absence of haptic feedback and reduced spatial awareness. In this paper, we design three different ultrasonic haptic feedback schemes for mid-air Qwerty: feedback only on keypress, feedback on both touch and keypress, and gradual feedback that increases in intensity as users push down a key. In a pilot study, the touch & press feedback performed significantly better both quantitatively and qualitatively. We then compared mid-air Qwerty with and without touch & press feedback in a user study. Results revealed that haptic feedback improves entry speed by 16% and reduces error rate by 26%. Moreover, most participants felt that it improves presence and spatial awareness in the virtual world by maintaining a higher consistency with the real world, and significantly reduces mental demand, effort, and frustration.
Acroyoga practitioners either follow predefined choreographies or play freely to create compositions of two bodies' poses and transitions. To support their free play, we created the training technology probe Acrosuit. The Acrosuit gives visual or vibrotactile cues to propose a novel point of contact, without instructions on how to get there. We invited acroyogis to exploratively play with the suit and gained qualitative insights into whether and how they used the cues. The suit influenced their practice and, in some cases, also the way the partners communicated. Experiencing the shifts in communication caused by the Acrosuit made acroyogis more conscious of their interaction without the technology.
This paper introduces a computational composite material comprising layers for actuation, computation, and energy storage. Key to its design is the use of inexpensive materials assembled with traditionally available fabrication machines to support the rapid exploration of applications for computational composites. The actuation layer is a soft magnetic sheet that is programmed to either bond with, repel, or remain agnostic to other areas of the sheet. The computation layer is a flexible PCB made from copper-clad kapton engraved by a fiber laser, powered by a third, energy-storage layer composed of 0.4 mm-thin lithium polymer batteries. We present the material layup and an accompanying digital fabrication process enabling users to rapidly prototype their own untethered, interactive, and tangible prototypes. The material is low-profile, inexpensive, and fully untethered, capable of being used for a variety of applications in HCI and robotics, including structural origami and proprioception.
While neck-worn garments are an integral part of fashion around the world, few interactive systems utilize them as form factors. As the neck is located beyond the natural field of view, it offers wide opportunities for subtle haptic interaction. While the tactile acuity of the neck is well studied in clinical research, these results are difficult to apply to interactive systems. In this paper, we present HaptiCollar, a research prototype for studying the tactile acuity of the neck. We conducted two experimental studies comparing different numbers of actuators on the neck, in two contextual scenarios simulating stationary and mobile use. We found that pattern actuations are recognised more effectively than single-point vibrations, independently of the use scenario. Our analysis shows that introducing a primary cognitive task and motion does not impair recognition accuracy, while substantially increasing the cognitive load. Our work contributes empirical insights for designing haptic interaction for future neck-worn interfaces.
A Method for Controlling the Continuous Transparency of Three-dimensional Objects Utilizing Mechanical Emulsification
Physical interfaces that change appearance by controlling transparency have attracted attention for their wide applicability. However, many existing methods are limited to planar objects; it is difficult to control transparency in three-dimensional objects and to control transparency continuously. In this study, we propose a novel method to control the transparency of three-dimensional objects by utilizing the mechanical emulsification of liquids. The approach fills the object with two transparent, liquid-liquid phase-separated liquids and agitates the liquids inside the object to make them cloudy, thereby controlling transparency. By using electrohydrodynamic (EHD) pumps, which can be freely positioned inside the object, to agitate the liquid, it is possible to control the transparency of complex three-dimensional objects. Continuous control is also achieved by changing the ratio of the two liquids. In this paper, we describe the details of the proposed method for controlling transparency and show application scenarios.
In this paper, we introduce a method to rapidly create 3D geometries by folding 2D sheets via pull-up nets. Given a 3D structure, we unfold its mesh into a planar 2D sheet using heuristic algorithms and populate it with cutlines and throughholes. We develop a web-based simulation tool that translates users' 3D meshes into manufacturable 2D sheets. After laser-cutting the sheet and feeding thread through the throughholes to form a pull-up net, pulling the thread folds the sheet into the 3D structure using a single degree of freedom. We introduce the fabrication process and build a variety of prototypes demonstrating the method's ability to rapidly create a breadth of geometries suitable for low-fidelity prototyping that are both load-bearing and aesthetic across a range of scales. Future work will expand the breadth of geometries available and evaluate the ability of our prototypes to sustain structural loads.
“Hey, can we talk?”: Exploring How Revealing Implicit Emotional Responses Tangibly Could Foster Empathy During Mobile Texting
In face-to-face communication, people can often achieve empathy by understanding the explicit and implicit emotions shared through body language, word usage, and tone. But in digital communication it becomes difficult to share or identify these (non)verbal cues, especially after ambiguous conversation starters such as “hey, can we talk?” are sent. In this paper, we explore how a tangible artefact could enhance empathy in text-based communication by physically sharing implicit emotional feedback. We conducted two workshops to understand the emotions involved in ambiguous conversation starters and how they can be expressed through an artefact. These workshops resulted in a prototype of a phone case that conveys the receiver's initial implicit emotional reaction to the sender of a text message using haptic patterns, informing the sender how best to proceed. This paper concludes by discussing the nuances of designing for emotions in text-based communication as well as suggestions for future work.
SESSION: Art and Performance
Robots are usually considered tools, devoid of agency and thus creativity. This robotic art performance – Contingent Dreams – explores the generative role of the human artist, algorithm, machine, medium, and environment in art making. In this performance a robot, programmed and assisted by the artist, draws an algorithmic composition in ink. The drawings – derived from recordings of everyday sounds – represent the noise of cities in Louisiana, rendering an ephemeral aspect of the city in unruly mechanical brushstrokes. The drawings come to life as the precise paths of the algorithm are translated into bold, enigmatic brush strokes and anomalous drops of ink. The drawings emerge not just from the imagination of the artist, but also from the sounds of the city, the code of the algorithm, the mechanical motion of the robot, the physics of the brush bristles and ink, and the texture and absorbency of the paper. Each aspect of the performance adds meaningful contingency to the process, resulting in drawings that evoke, as accidents accumulate into meaning, the multiplicity of urban experience.
“A Grain of” is an art installation made by the authors: a small cube (25 x 25 x 25 cm) of custom-made matter, sitting on a plinth, emitting sound from within. Its underlying idea is to take a “grain” from a dense, contemporary urban space and translate it into an art installation that makes the city experienceable for the audience. This small cube holds some of the material and sonic ecology of the city from which it stems embedded in it. In this paper, the authors describe the technical processes that they have used to sample and reinterpret the city’s material and sonic ecologies, including Atomic Force Microscopy (AFM), 3D printing, field recording, and sound synthesis. They offer some theoretical notes towards the grain of the city in order to investigate how this process of technical translation produces an experience with affective, poetic, and speculative potential. They argue that this experience lets its audience critically rethink the old enduring binaries between natural and artificial or between the city and nature.
Rejected By My Own Robot: Studying the Potential for Artists to Subvert Technological Expectations Using Critical Design
This paper provides a framework for how artists can use critical design to subvert deeply ingrained expectations around technology. I begin by defining the technological understanding of being as introduced by Martin Heidegger and break down expectations of technology into efficiency, control, and pleasure. I then ask if the burgeoning HCI practice of critical design could be an effective tool for artists in subverting these expectations. My methodology for studying this question is to install a provocative object, in my case a disobedient kissing robot, in a public space and analyze participants’ reactions to it using both video and survey analysis. Results show over 70% of users experienced some degree of surprise, friction, or disappointment with the device due to broken expectations about how they assume technology should work. The study concludes with a discussion of what value there is in challenging the status quo, how artists can take this work further, and what role humor plays in provocative critical design.
IIOO IS NOT HERE - Building and exhibiting media art installation with novel modular tangible programming interface
The installation IIOO IS NOT HERE deals with human-machine relations and interfaces. It takes a critical standpoint on the subject-object relation and questions of control, and elaborates on N. Katherine Hayles’ concept of cognitive assemblage in the context of media art installation. The installation is built with a newly developed tangible programming interface called Pino. During the exhibition, the artist will reconfigure the installation and change its behavior. The performative act of the artist reconfiguring the installation live highlights the use and possibilities of a modular tangible user interface in the context of interactive art installation.
As an experiment designed to question the boundary, relationship, and identity between human bodies and AI robots, Iris is endowed with independent perception and temperament by implementing facial expression recognition, bio-signal measuring, and emotion synthesis. This temperament is stimulated by the PAD emotional model and expressed through algorithmic motions generated in real time. Iris's emotions are affected by the wearer's feelings through sensors that measure the wearer's biological signals. This ability to perceive and emote reflects certain connections and differences with the wearer, as if showing a split personality. Moreover, Iris noticeably affects interpersonal interactions and relationships by showing different responses to people captured by the camera. These effects were heightened during COVID-19, when our faces were covered by masks: based on perception and synthesis, Iris performs like a highly specialized organ that augments and replaces humans' expressions.
What would happen if you could catch a sound in your hands? Push or pull it, stretch it out, mix it up, or toss it at the ceiling like a spaghetti noodle? This work explores how children engage with sound in their environment, and as part of their social connection with others, supported by tangible sound toys. By looking at children’s playful processes of listening to and engaging with the isolation of sounds, we developed three sound-based toys that leverage tangible embodied metaphors. This paper presents our sound toy designs, and contributes to a deeper awareness of the affect of sound as a focused component in tangible design, to playfully challenge children to consider how sound impacts their environment, and how they can impact sound.
I Feel You: Exploring possibilities to create touch-responsive woven textiles imitating living beings
I Feel You is a speculative textile design project looking into the possibilities of creating multi-sensory electronic textiles that imitate living beings. The theme of textile surfaces imitating living beings emerged from a yearning for touch and closeness: during the global pandemic and the ongoing wave of extinction, I imagined a world where we are increasingly physically separated from each other and other animals. What if, in the future, we are accompanied by robot pets and people? Against this backdrop, this work speculates on how textiles could create an illusion of being close to another living being and being touched by a living creature. The series of textile pieces consists of Jacquard-woven multilayered textiles that, when combined with electronics, react to human touch. The materials include linen, cotton, responsible mohair, wool, and silver-based electrically conductive threads. Together and separately, the materials and woven structures strive to create multisensory, touchable worlds. The aim of the study was to discover what kinds of textile surfaces humans can identify through the sense of touch, what kind of touch is experienced as soothing, and how to bring reactivity that imitates living beings into woven textiles. This knowledge was used as a basis to create a series of speculative woven electronic textile pieces that react to touch. Traditional materials and techniques interweave with new technologies, creating possibilities to design new types of interactions with textiles. When designing active haptic textile surfaces, traditional properties of textiles, such as materials, patterns, and woven structures, cannot be separated from the design process, where all the aspects entangle and affect each other.
InTangible is a mixed media artwork and connected object that captures images of a handwoven fabric and sends them through the network. A digital camera with built-in RGB LEDs, configured with a combination of digital parameters, glides over the fabric and allows remote users to enjoy a new, intimate dimension of the textile. The once static and monochrome texture is transformed into colourful, abstract digital dreamscapes, while, at the same time, becomes a material object meant to be exhibited in a physical space.
With InTangible, we challenge classical art ownership, which only allows few individuals to own a select number of artworks. We believe that art co-ownership would allow a more diverse group of people to own and collect art, which in turn, would change the dynamics of both art creation and engagement with art. Nobody has full ownership of the digital version of the canvas. Instead, it is co-owned by users who acquire fragments of it from an NFT (Non-Fungible Token) market.
Dépaysement means a change of scenery, a unique word in French that captures emotions shared by travelers. The pandemic has changed the entire globe and caused such feelings in many of us. With this project, we created a series of monotypes based on the visual music piece When Leaving Becomes Arriving that we produced previously. The process includes 1) toner transfer of digital images, 2) additive or subtractive inking with found objects, and 3) layering and iterations. This workflow supports spontaneous creative impulse and inspires experimentation with ideas and compositions. Transforming screen-based work into tangible media is a refreshing experience. The mixing of ink, the marks from rollers or brayers, the placement of the found objects, and the arrangement of stencils all contribute to the final output. We hope this project will encourage other computer graphics artists to explore fine art printmaking as an alternative to digital printing.
Strange Loops is a Brain-Computer Interface (BCI) that turns emotional states into colourful lights. Inspired by electric fields from the brain amplified by EEG, the mask reveals and augments the feelings of the wearer. By externalising the inner world, it creates a language of the imperceptible. In this respect, it investigates how wearable technology can create new forms of nonverbal communication, and make the emotional self tangible, as a dynamic Gestalt.
Psychedelic Forms is a series of case studies that explores deep learning (DL) possibilities for creating tangible forms guided by a text prompt and a 3D object. The selected generated digital 3D objects were manually altered and prepared for 3D printing in ceramics (Figure 1). Glazing was done by hand, often inspired by the AI-generated objects’ vertex colors, and the pouring technique used for painting serves as a metaphor for the thermodynamics of meaning in latent spaces.
’Psychedelic’ refers to unexpected or unexplored imagination that the human eye has not seen before. Although the original input was well-known ancient sculptures, like Venus and Nymph, the AI model was capable of stylizing the mesh with the inputted text prompt in such a way that the new form was hardly recognizable.
The artwork demonstrates embodied experience and the transformation of a DL model through artistic practice. More precisely, by introducing clay as a physical material into the process, destruction and alteration of form occurred. The interplay of digital, physical, and chemical processes created new meanings and experiences that are discussed in this article. It is proposed as a meaningful and tangible way of navigating latent space.
As a result, the neuro-avant-garde mixed with artisan techniques and processes offered irregular transformations that contribute to augmenting creativity and imagination. In other words, irregular mutations can lead to new creations that would not happen otherwise.
The goal of this studio is to facilitate a space in which HCI researchers and designers can explore SCOBY (Symbiotic Culture of Bacteria and Yeast), a sustainable biofilm, grown in kombucha tea, that acts similarly to traditional leathers when harvested and dried. While SCOBY is a popular biomaterial in biodesign and DIYBio practices, we aim to introduce SCOBY as a biomaterial for HCI and ground it in sustainable HCI and slow design theory. Participants will then gain hands-on experience with SCOBY through a material exploration phase (e.g., learning how to embed colors, patterns, and electronics) followed by a structured SCOBY application creation phase. Ultimately, the goal of this studio is to give HCI practitioners who are interested in biodesign a space and time to collaborate, create, and discuss the opportunities and challenges of kombucha SCOBY as a biomaterial for HCI.
Force feedback and shape change provide unique interaction qualities that can be favourable for the design of more intelligent physical user interfaces, as communication revolves to a large extent around body language and gestures. Given the challenges associated with the haptic modality and the expertise required to develop such interfaces, there is currently a dearth of design tools that could accelerate exploration of the design space of haptic and shape-changing user interfaces. Design tools have the potential to improve the accessibility of these modalities as design material within HCI and facilitate exploration of the design space. In this studio, we will use the design tool Feelix to explore possibilities and opportunities for force feedback and shape change in actuated interfaces.
Tangible devices and interaction in Extended Reality (XR) increase immersion and enable users to perform tasks more intuitively, accurately, and joyfully across the reality-virtuality continuum. Upon reviewing the literature, we noticed no clear trend toward a publication venue, as well as no standard for evaluating the effects of tangible XR. To position the topic of tangible XR in the TEI community, we propose a hands-on studio in which participants bring their own ideas for tangible XR from their application fields and develop prototypes with cutting-edge technology and a selection of provided virtual assets. Additionally, we will collectively reflect upon evaluation methods for tangible XR and aim to reach consensus on a core evaluation suite. With this, we aim to foster a practical understanding and spark new developments in tangible XR and its use cases within the TEI community.
A one-size-fits-all design mentality, rooted in objective efficiency, is ubiquitous in our mass-production society. This can negate peoples’ experiences, bodies, and narratives. Ongoing HCI research proposes design for meaningful relations; but for many researchers, the practical implementation of these philosophies remains somewhat intangible. In this Studio, we playfully tackle this space by engaging with the nuances of soft, flexible, and organic materials, collectively designing probes to embrace plurality, embody meaning, and encourage soma-reflection. Focusing on materiality and practices from e-textiles, soft robotics, and biomaterials research, we address technology’s role as a mediator of our experiences and determiner of our realities. The processes and probes developed in this Studio will serve as an experiential manifesto, providing practitioners with tools to deepen their own practices for designing soma-reflective tangible and embodied interaction. The Studio will form the first steps for ongoing collaboration, focusing on bespoke design and curation of meaningful, personal relationships.
The goal of this studio is to explore the qualities of unhabitual body movements to inform the design of close-to-the-body touch technologies. After engaging with unhabitual kinesthetic activities, we will use visual and textual elicitation tools to communicate emerging felt sensations. We propose the use of photography as an open-ended visual medium and a repertoire of textural metaphors as a textual tool - a vocabulary list of felt qualities that will be extended through the participants’ contribution. We will then collectively explore how these expressions of felt sensations can be translated into concrete design elements via tangible design ideation and making.
Parallel improvements in multi-material 3D printers and in the quality of conductive filament open new possibilities for the fabrication of tangible and functional objects. In this studio, we discuss best practices for 3D printed electronics, talk about problems we have encountered, and derive design recommendations. We will guide participants through a fabrication process by practically designing and printing objects. We then examine individual functional fabricated components, including small printed circuits and multi-material prints. We aim to spark a discussion about the challenges participants individually experienced during their design and fabrication process, including problem-solving strategies whose insights can benefit other participants. Finally, we show the potential of printed electronics and discuss encouraging new opportunities in this field.
SESSION: Student Design Competition
Improving Collaboration Experiences and Skills: An Open-ended, User-Driven Self-Tracking Approach for Education
Collaboration is considered an essential skill for work and life in the 21st century. Students therefore need support in developing this skill, whose importance is amplified by hybrid work and globalization. In this vision, we bring insights and practices from the personal informatics field to the education domain in order to trigger self-awareness and collective sensemaking. We propose CoSensUs, a physical self-tracking kit for teams of students to track and reflect on their collaboration practices and experiences. We argue for a user-driven, open-ended, playful, and privacy-centered solution that tracks and visualizes data at the group level. This original and underexplored focus on group-level tracking also aims to account for the special needs of students, who are subject to social pressure and potential control in institutional settings. Through this vision, we contribute to the development of essential social and collaboration skills in an original, playful, and inclusive way.
This article explains the project “Unreality”, a system developed to make both remote and in-person learning more tangible, social, and emotionally effective. Using Augmented Reality (AR) and Extended Reality (XR), interactive holograms will be introduced, enabling remotely and physically present students to interact as they normally would. The Emotion Tree and Emotion Bracelet are designed to encourage more discussion of students’ emotional health. Lastly, the Note-Making system takes into account the different learning styles of students who may not thrive under traditional learning techniques. The use of the Unreality application to input the required data increases ease of access. The article also explains each element of the system and how it would be used during learning.
26 February – 1 March 2023, Warsaw, Poland, at the University of Warsaw Library and the Copernicus Science Centre