Program

The conference will be held on 9-12 February 2020. On these days, we have scheduled the activities outlined below. More details will be added as we get closer to the conference date.

All TEI 2020 proceedings are now available via the ACM Digital Library. Articles listed below also link directly to the relevant publication within the proceedings.

Best Paper Award

ForceStamps: Fiducial Markers for Pressure-sensitive Touch Surfaces to Support Rapid Prototyping of Physical Control Interfaces

Awarded by Program Chairs

  • Changyo Han (University of Tokyo, Japan)
  • Katsufumi Matsui (University of Tokyo, Japan)
  • Takeshi Naemura (University of Tokyo, Japan)

Honorable Mention Paper Award

Designing With Ritual Interaction: A Novel Approach to Compassion Cultivation Through a Buddhist-Inspired Interactive Artwork

Awarded by Program Chairs

  • Kristina Mah (University of Sydney, Australia)
  • Lian Loke (University of Sydney, Australia)
  • Luke Hespanhol (University of Sydney, Australia)

Honorable Mention Paper Award

Tangible Music Programming Blocks for Visually Impaired Children

Awarded by Program Chairs

  • Alpay Sabuncuoğlu (Koç University, Turkey & Twin Science and Robotics Lab, Turkey)

Best Pictorial Award

Snap-Snap T-Shirt: Posture Awareness Through Playful and Somaesthetic Experience

Awarded by Pictorial Chairs

  • Svetlana Mironcika (Eindhoven University of Technology, Netherlands)
  • Annika Hupfeld (Eindhoven University of Technology, Netherlands)
  • Joep Frens (Eindhoven University of Technology, Netherlands)
  • Jessica Asjes (Nike Inc.)
  • Stephan Wensveen (Eindhoven University of Technology, Netherlands)

People’s Choice Award Student Design Challenge

Audio Cells: A Spatial Audio Prototyping Environment for Human-Robot Interaction

Awarded by delegate vote

  • Frederic Anthony Robinson (University of New South Wales, Australia)

Best Implementation Award Student Design Challenge

Audio Cells: A Spatial Audio Prototyping Environment for Human-Robot Interaction

Awarded by SDC Chairs

  • Frederic Anthony Robinson (University of New South Wales, Australia)

Best Concept Award Student Design Challenge

Tiglo: Inhabiting Toys Through Expression Mirroring

Awarded by SDC Chairs

  • Cletus V. Joseph (National Institute of Design Gandhinagar Gujarat, India)

People’s Choice Best Talk Award

Tangible Interfaces and Interactions in Sci-Fi Movies: A Glimpse at the Possible Future of TUIs through Fictional Tangible Systems

Awarded by delegate vote

  • Victor Cheung (Simon Fraser University, Canada)
  • Alissa N. Antle (Simon Fraser University, Canada)

Most Somatically Engaged Student Volunteers

Awarded by SV/General Chairs

  • Michelle Pickrell (University of Technology Sydney)
  • Omid Ettehadi (Ontario College of Art and Design University)

Sunday 9 February 2020

All activities on Sunday take place at UTS Building 11.

The Graduate Student Consortium invites graduate students to a day of discussion.

Embodied Atmospheres: A Symbiosis of Body and Environmental Information in the Form of Wearable Artifacts

Graduate Consortium paper

  • Jessica Broscheit (Hamburg University of Applied Sciences, Germany & University of the West of Scotland, UK)

This paper introduces a Ph.D. research project about wearable computer artifacts that intertwine digital information from both the human body and the environment. These artifacts use content-related sensors to make air constituents perceptible. In addition, the artifacts’ system design is based on metaphorical representations to provide a conceptual system of thought and action for the user. Through the intuitive use of these wearable computer artifacts, participants are able to explore information about themselves and their surroundings with their enhanced body. To provide insights into these human-computer interactions, this Ph.D. research project conducts a user experience study as a ‘real-life’ ethnographic enactment.

The Design and Creation of Tactile Knitted E-textiles for Interactive Applications

Graduate Consortium paper

  • Amy Chen (Hong Kong Polytechnic University, Hong Kong)

E-textiles have the potential to be utilised for their tactile qualities, particularly in sensory stimulation applications. However, tactility and aesthetics are seldom a focus of e-textiles research, which often concentrates on functional aspects. While there is research in textiles and e-textiles for sensory stimulation, it rarely takes advantage of textile production technologies, frequently using handcraft to produce the sensory tools. While these techniques are accessible, they lack scalability, making e-textiles for interactive applications less practical to create. My work focuses on the design of tactile e-textiles, leveraging the benefits of knit technologies in the production of e-textiles. The work aims to produce a range of knitted e-textiles which balance design aesthetics, functionality and ease of production. This paper outlines the research that has already been conducted, as well as the planned future work as part of this PhD research.

Deformation Gesture-based Input Method for Non-visual Primitive Geometric Shape Drawing

Graduate Consortium paper

  • Pranjal Protim Borah (Indian Institute of Technology (IIT) Guwahati, India)

Reading and creating graphical information is a difficult task for users with visual impairment and blindness. It becomes even more challenging on touchscreen devices due to the lack of tactile buttons. However, advances in flexible displays and electronics offer the potential to use physical deformation as an additional input modality. These deformation-based gestures provide innate tactile and kinesthetic feedback, which are essential for non-visual interaction. In this paper, I describe my Ph.D. research on non-visual drawing using a deformation gesture-based input method. This work is currently in its initial phase. The research aims to understand the preference for and performance of deformation-based gestures on a smartphone-sized flexible handheld device, and to evaluate the effect of deformation and touch input modalities on non-visual primitive geometric shape drawing. The expected outcome can be useful for user interface designers, developers and researchers in developing accessible applications for future flexible devices.

Augmenting Embodied Sensemaking using VR-Enabled New and Unusual Perspectives

Graduate Consortium paper

  • Dorothé Smit (University of Salzburg, Austria)

Since the launch of the first Oculus Developer Kit in 2013, consumer and commercial adoption of VR and AR technology has moved beyond the early-adopter stage. This widespread availability of VR and AR headsets raises challenging and exciting questions for researchers in the field of embodied interaction: how do we design embodied interactions in VR? Can we improve (social) sensemaking beyond the natural body? What new opportunities for embodied interaction have presented themselves thanks to this new technology, and how can we best use them? To address these questions, my research focuses on designing new interactions with VR systems that go beyond the (digital) gaming context, especially including tangible interactions from new and unusual perspectives made possible by new developments in VR technology. In my thesis, I aspire to present a framework of embodied sensemaking informed by new and unusual perspectives, enabled by virtual reality technology and developed in a Research through Design process.

Metaphors and Technologies for New Tangible User Interfaces

Graduate Consortium paper

  • Beat Rossmy (LMU Munich, Germany)

Tangible user interfaces create the possibility to provide users with exploratory, expressive and flexible access to musical interaction. However, new technologies and metaphors can additionally open up further opportunities and mental models for the design of new interaction concepts. In the scope of my thesis I explore new metaphors by designing and building new interfaces that embody these ideas and concepts. In addition to the conceptual work, the development of new technologies necessary for the implementation of these concepts is a central part of the work. In this paper I present the prototypes COMB and StringTouch and discuss potential application areas for the implementation of the core ideas behind both concepts.

Level 3 Room 206

Drinks and snacks

Level 3 Foyer

Meal and drinks

Level 3 Foyer

Drinks and snacks

Level 3 Foyer

Our social chairs invite you to join them on a walk around some of the highlights of Sydney, including the Opera House and Harbour Bridge.

For more information and the opportunity to sign up, take a look here: https://tinyurl.com/wlhxg2m

Starting at Martin Place in the CBD. Followed by drinks at Hotel Palisade in The Rocks from 5.30pm onwards.


Monday 10 February 2020

All activities on Monday take place at UTS Building 11.

Soma Design – Intertwining Aesthetics, Ethics and Movement

I will discuss soma design: a process that allows designers to examine and improve on connections between sensation, feeling, emotion, subjective understanding and values. Soma design builds on pragmatism, and in particular on Shusterman's somaesthetics. It combines soma, as in our first-person sensual experience of the world, with aesthetics, as in deepening our knowledge of our sensory experiences to live a better life. In my talk, I will discuss how aesthetics and ethics are enacted in a soma design process. Our cultural practices and digitally-enabled objects enforce a form of sedimented, agreed-upon movements, enabling variation, but with certain prescribed ways to act, feel and think. This leaves designers with a great responsibility, as these become the movements that we invite our end-users to engage with, in turn shaping them, their movements, their bodies, their feelings and thoughts. I will argue that by engaging in a soma design process we can better probe which movements lead to deepened somatic awareness; social awareness of others in the environment and how they are affected by the human-technology assemblage; enactments of bodily freedoms rather than limitations; making norms explicit; engagement with a pluralist feminist position on who we are designing for; and aesthetic experience and expression.

Kristina Höök is a full Professor in Interaction Design at the Royal Institute of Technology (KTH), Stockholm, Sweden. Höök is a frequent keynote speaker, known for her work on social navigation, seamfulness, mobile services, affective interaction and, lately, designing for bodily engagement in interaction through somaesthetics. Her competence lies mainly in interaction design and user studies helping to form design. Höök has obtained numerous national and international grants, awards, and fellowships, including the Cor Baayen Fellowship from ERCIM (European Research Consortium for Informatics and Mathematics) for her thesis work in 1997 and the INGVAR award from the Strategic Research Foundation (SSF) in 2004. She has been an ACM Distinguished Scientist since 2014 and is an ACM Distinguished Speaker.

Level 0 Room 405

Drinks and snacks

Level 0 Foyer

Session chair: Sara Nabil (Carleton University, Canada)

Awareables: Beyond Wearable Technology

Full paper, 10 min talk

  • James I. Novak (Deakin University, Australia)

Wearable technology is a broad discourse that has evolved over decades, growing into an industry that today represents billions of dollars in product revenue, but stagnating as products become driven by technological developments rather than human needs. However, numerous isolated developments point to a new class of wearables that are both aware of their environment and function, and able to physically adapt to user needs. Examined collectively in this paper, this new class of wearables is described as awareables, representing a shift in technology towards more life-like products. The context of this shift is broadly analyzed alongside similar shifts within the fields of architecture (responsive architecture), additive manufacturing (4D printing) and robotics (evolutionary robotics). The intent of this paper is to encourage new discourse and practical work, calling for researchers and product designers to think beyond gizmos and instead consider more natural interactions between people and products, inspired by nature.

ambienBeat: Wrist-worn Mobile Tactile Biofeedback for Heart Rate Rhythmic Regulation

Full paper, 10 min talk; also presented as a Demo

  • Kyung Yun Choi (MIT, USA)
  • Hiroshi Ishii (MIT, USA)

We present a wrist-worn mobile heart rate regulator, ambienBeat, which provides closed-loop biofeedback via tactile stimuli based on the user's heart rate (HR). We applied the principle of physiological synchronization via touch to achieve our goal of effortless regulation of HR, which is tightly coupled with mental stress levels. ambienBeat provides various patterns of tactile stimuli, which mimic the feeling of a heartbeat pulse, to guide the user's HR to resonate with its rhythmic, tactile patterns. The strength and rhythm of the tactile stimulation are controlled to a level below the cognitive threshold of an individual's tactile sensitivity on their wrist so as to minimize task disturbance. We present an acoustically noiseless soft voice-coil actuator to render the ambient tactile stimulus, along with the system and its implementation process. We evaluated our system by comparing it to ambient auditory and visual guidance. Results from the user study show that the tactile stimulation was effective in guiding the user's HR to resonate with ambienBeat, either calming or boosting the heart rate with minimal cognitive load.
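The closed-loop idea in this abstract can be pictured with a short sketch. This is purely an illustrative approximation under assumed parameters, not the authors' implementation; read_heart_rate and set_tactile_pulse_bpm are hypothetical stand-ins for the wearable's sensor and voice-coil actuator.

```python
# Illustrative sketch only (not the ambienBeat implementation): a minimal
# closed-loop rule that paces a tactile pulse slightly below the measured
# heart rate so the user's HR can entrain toward a calmer target.
import time

TARGET_BPM = 60.0   # assumed calming target
STEP_BPM = 2.0      # how far below the current HR the tactile rhythm is set

def read_heart_rate() -> float:
    """Hypothetical sensor read; replace with a real PPG/ECG source."""
    return 78.0

def set_tactile_pulse_bpm(bpm: float) -> None:
    """Hypothetical actuator command for a wrist-worn voice-coil."""
    print(f"tactile pulse set to {bpm:.1f} bpm")

def biofeedback_step() -> None:
    hr = read_heart_rate()
    # Pace the stimulus just below the current HR, but never below the target.
    guide = max(TARGET_BPM, hr - STEP_BPM)
    set_tactile_pulse_bpm(guide)

if __name__ == "__main__":
    for _ in range(3):
        biofeedback_step()
        time.sleep(1)
```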

How are you feeling? Using tangibles to log the emotions of older adults

Full paper, 20 min talk

  • Daniel Gooch (Open University, UK)
  • Vikram Mehta (Open University, UK)
  • Blaine A. Price (Open University, UK)
  • Ciaran McCormick (Open University, UK)
  • Arosha K Bandara (Open University, UK)
  • Amel Bennaceur (Open University, UK)
  • Mohamed Bennasar (Open University, UK)
  • Avelie Stuart (University of Exeter, UK)
  • Linda Clare (University of Exeter, UK)
  • Mark Levine (University of Exeter, UK)
  • Jessica Cohen (Age UK, UK)
  • Bashar Nuseibeh (The Open University, UK & University of Limerick, Ireland)

The global population is ageing, leading to shifts in healthcare needs. Home healthcare monitoring systems currently focus on physical health, but there is increasing recognition that psychological wellbeing also needs support. This raises the question of how to design devices that older adults can interact with to log their feelings. We designed three tangible prototypes based on existing paper-based scales of affect. We report findings from a lab study in which participants used the prototypes to log the emotions described in standardised emotional vignettes. We found that the prototypes allowed participants to accurately record the identified emotions in a reasonable time. Our participants expressed a perceived need to record emotions, either to share with family/carers or for self-reflection. We conclude that our work demonstrates the potential of in-home tangible devices for recording the emotions of older adults to support wellbeing.

Classification of Spontaneous and Posed Smiles by Photo-reflective Sensors Embedded with Smart Eyewear

Full paper, 10 min talk

  • Chisa Saito (Keio University, Japan)
  • Katsutoshi Masai (Keio University, Japan)
  • Maki Sugimoto (Keio University, Japan)

Smiling is one of the most representative emotional expressions, observed frequently in daily life and essential for various forms of non-verbal communication. People make both spontaneous and intentional smiles, and properly distinguishing whether a smile is spontaneous or posed is important for understanding its meaning. In this study, we propose a smile classification system based on smart eyewear equipped with photo-reflective sensors and examine whether we can distinguish two types of smiles: spontaneous smiles caused by funny videos and posed smiles evoked by instructions. We extract geometric features (the reflection intensity distribution across sensors) and temporal features along the time axis. Applying a Support Vector Machine, we observed a mean accuracy of 94.6% among 12 participants when we used both geometric and temporal features with user-dependent training. The result suggests that we can distinguish between spontaneous and posed smiles using the sensors embedded in the smart eyewear.
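As a rough illustration of the classification pipeline this abstract outlines, the sketch below combines simple geometric and temporal features and evaluates a user-dependent SVM. The sensor array shape, feature definitions, and data are hypothetical placeholders, not the authors' feature set or dataset.

```python
# Illustrative sketch only (not the authors' code): a user-dependent SVM over
# photo-reflective sensor readings, assuming hypothetical windows of shape
# (n_samples, n_sensors, n_frames) labelled 0 = spontaneous, 1 = posed.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def extract_features(windows):
    """Combine simple geometric and temporal features per smile window."""
    geometric = windows.mean(axis=2)                 # mean reflection intensity per sensor
    temporal = np.diff(windows, axis=2).std(axis=2)  # variability of change over time
    return np.hstack([geometric, temporal])

# Hypothetical data: 40 smile windows, 16 sensors, 60 frames each.
rng = np.random.default_rng(0)
windows = rng.normal(size=(40, 16, 60))
labels = rng.integers(0, 2, size=40)

X = extract_features(windows)
clf = SVC(kernel="rbf", C=1.0)
print(cross_val_score(clf, X, labels, cv=5).mean())  # user-dependent accuracy estimate
```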

Designing for Workplace Safety: Exploring Interactive Textiles as Personal Alert Systems

Full paper, 20 min talk

  • Miriam Gaissmaier (KTH Royal Institute of Technology, Sweden)
  • Anna Madelene Karlsson (Boris Design Studio, Sweden)
  • Sofie Aschan Eriksson (Boris Design Studio, Sweden)
  • Elsa Kosmack (SICS Västerås, Sweden)
  • Ksenija Komazec (RISE Research Institutes of Sweden, Sweden)
  • Ylva Ferneaus (KTH Royal Institute of Technology, Sweden)

Despite various safety regulations and procedures, work accidents remain a significant problem in the global process industry and the Swedish steel industry. To address personal safety and safety culture, wearable alert systems were prototyped and tested with steelworkers in iterative workshops. A resulting design concept, in the form of an interactive textile patch worn on protective gear, suggests a simple way of transmitting personal alerts using light. A crucial design factor identified is to enable communication between workers and peers as well as with control room staff. The visual design can positively influence the acceptance of the patch, but its impact on the safety culture cannot yet be assessed. The present study contributes by approaching workplace safety and culture with a new design concept of IoT and e-textile technologies based on the interaction modalities of light, sound, and vibration.

Everyday Life Reflection: Exploring Media Interaction with Balance, Cogito & Dott

Full paper, 10 min talk; also presented as a Poster

  • Ine Mols (Eindhoven University of Technology, Netherlands & University of Technology Sydney, Australia)
  • Elise van den Hoven (University of Technology Sydney, Australia & Eindhoven University of Technology, Netherlands)
  • Berry Eggen (Eindhoven University of Technology, Netherlands)

Reflection is of increasing interest in HCI as it has many potential benefits in design, education and everyday life. In this paper, we explore media-supported reflection through the design and deployment of three concepts. In contrast to prevalent reflective approaches that are based on system-collected data, we explore how user-created media can support personal reflection. Three interactive prototypes were developed, focusing on different modalities: Balance uses audio, Cogito uses text, and Dott uses visual media. We evaluate these concepts in an in-the-wild study that is both explorative and comparative. We found that the open-ended systems primarily supported reflection during the creation of media and that their use depended on opportunity and triggers. We conclude the paper with a discussion of the method and the implications of our findings for the broader area of design for reflection.

Embodying Meaningful Digital Media: A Strategy to Design for Product Attachment in the Digital Age

Full paper, 10 min talk

  • Daniel Orth (University of Technology Sydney, Australia)
  • Clementine Thurgood (Swinburne University of Technology, Australia)
  • Elise van den Hoven (University of Technology Sydney, Australia & Eindhoven University of Technology, Netherlands)

Technological products have become central to the ways in which many people communicate with others, conduct business and spend their leisure time. Despite their prevalence and significance in people’s lives, these devices are often perceived to be highly replaceable. From a sustainability perspective, there is value in creating technological products with meaning directly associated with their materiality to reduce the rate of product consumption. We set out to explore the potential for design to promote the formation of product attachment by creating technological devices with meaningful materiality, closely integrating the physical form with the significance of its digital contents. We used the life stories and ongoing input of our intended user as inspiration for the creation of Melo, a bespoke music player. The evaluation and critical reflection of our design process and resulting artefact are used to propose a design strategy for promoting product attachment within the growing sector of technological devices.

Level 0 Room 405

Meal and drinks

Level 0 Foyer

This event aims to build and strengthen the diverse networks of our attendees and support mentorships across academia, research, and industry for those with traditionally marginalised backgrounds.

Level 3 Room 301

Session chair: Kai Kunze (Keio University, Japan)

Towards a Material Landscape of TUIs, Through the Lens of the TEI Proceedings 2008-2019

Full paper, 20 min talk

  • Sarah Hayes (Cork Institute of Technology, Ireland)
  • Trevor Hogan (Cork Institute of Technology, Ireland)

Materials play an influential role in determining the way people interact with and experience objects. This impact is particularly important to TUI designers, as the artefacts they design often afford grasping and physical manipulation. As the ACM Conference on Tangible, Embedded and Embodied Interaction moves through its second decade, we sought to survey past proceedings to present a picture of the material choices made for TUI design. In this paper we present an exhaustive survey of these proceedings and discuss insights revealed on TUI material trends. Our findings highlight the most popular material choices, as well as the high percentage of materials that have only been used once throughout the years. Furthermore, we make recommendations on the future use of, and reporting on, material choices for TUI design and point toward the future work that is needed to fully map the material landscape of TUIs.

Foxels: Build Your Own Smart Furniture

Full paper, 10 min talk

  • Florian Perteneder (Tokyo Institute of Technology, Japan & University of Applied Sciences Upper Austria, Austria)
  • Kathrin Probst (University of Applied Sciences Upper Austria, Austria)
  • Joanne Leong (MIT, USA)
  • Sebastian Gassler (University of Applied Sciences Upper Austria, Austria)
  • Christian Rendl (University of Applied Sciences Upper Austria, Austria)
  • Patrick Parzer (University of Applied Sciences Upper Austria, Austria)
  • Katharina Fluch (University of Applied Sciences Upper Austria, Austria)
  • Sophie Gahleitner (University of Applied Sciences Upper Austria, Austria)
  • Sean Follmer (Stanford University, USA)
  • Hideki Koike (Tokyo Institute of Technology, Japan)
  • Michael Haller (University of Applied Sciences Upper Austria, Austria)

Introducing interactive components into furniture has proven difficult due to the different lifespans of furniture and digital devices. We present Foxels, a modular, smart furniture concept that allows users to create their own interactive furniture on demand by simply snapping together individual building blocks. The modular design makes the system flexible to accommodate a variety of interactive furniture setups, making it particularly well-suited for re-configurable spaces. Considering the trade-off between ease-of-use and high versatility, we explored a number of interaction methods that can be applied to modular interactive furniture, thereby extending the well-known tangible programming paradigm. After explaining our implementation, we demonstrate the validity of the proposed concepts by presenting how Foxels can be used in an ideation workshop along with many additional real-world examples.

Orthorigami: Implementing Shape-Memory Polymers for Customizing Orthotic Applications

Full paper, 10 min talk

  • Jessica Berry Reese (Texas A&M University, USA)
  • Jinsil Hwaryoung Seo (Texas A&M University, USA)

Previous work in 3D printing of hand orthotic devices has shown that patients prefer a 3D-printed design over traditional orthotic splints. Despite the possibility to incorporate self-expression when designing for 3D printing, further customization post-printing is not possible. This creates a design iteration process that may require multiple 3D-printed customized orthoses to obtain the final product. Shape memory polymers (SMPs) are unique materials for design due to their "origami" nature; that is, their ability to bend when heated. This paper looks at how to harness this ability to create what we propose as Orthorigami: orthotics made using folding techniques. Specifically, the paper explores how to design an aesthetically pleasing, lightweight, simple, easily adjustable, personally customizable Orthorigami device. This paper presents three design cases using Orthorigami. These cases are used as a means to explore the design process of Orthorigami and see if the process provides an improvement on design iteration over its 3D-printed counterparts. The outcome of the case studies is used to propose a process that the end-user may use to create a tailored Orthorigami device in a DIY setting.

TRANS-DOCK: Expanding the Interactivity of Pin-based Shape Displays by Docking Mechanical Transducers

Full paper, 20 min talk

  • Ken Nakagaki (MIT Media Lab, USA)
  • Yingda (Roger) Liu (MIT Media Lab, USA)
  • Chloe Nelson-Arzuaga (MIT, USA)
  • Hiroshi Ishii (MIT, USA)

This paper introduces TRANS-DOCK, a docking system for pin-based shape displays that enhances their interaction capabilities for both output and input. By simply interchanging the transducer modules, composed of passive mechanical structures, to be docked on a shape display, users can selectively switch between different configurations, including display sizes, resolutions, and even motion modalities, allowing pins moving in a linear motion to rotate, bend and inflate. We introduce a design space consisting of several mechanical elements and the interaction capabilities they enable. We then explain the implementation of the docking system and the transducer design components, providing the limitations and characteristics of each motion transmission method as design guidelines. A number of transducer examples demonstrate the range of interactivity and application space achieved with the TRANS-DOCK approach. Potential use cases that take advantage of the interchangeability of our approach are discussed. Through this paper we intend to expand the expressibility, adaptability and customizability of a single shape display for dynamic physical interaction. By converting arrays of linear motion to several types of dynamic motion in an adaptable and flexible manner, we advance shape displays to enable versatile embodied interactions.

LiftTiles: Constructive Building Blocks for Prototyping Room-scale Shape-changing Interfaces

Full paper, 10 min talk; also presented as a Demo

  • Ryo Suzuki (University of Colorado Boulder, USA)
  • Ryosuke Nakayama (Keio University, Japan)
  • Dan Liu (University of Colorado Boulder, USA)
  • Yasuaki Kakehi (University of Tokyo, Japan)
  • Mark D Gross (University of Colorado Boulder, USA)
  • Daniel Leithinger (University of Colorado Boulder, USA)

Large-scale shape-changing interfaces have great potential, but creating such systems requires substantial time, cost, space, and effort, which hinders the research community from exploring interactions beyond the scale of human hands. We introduce modular inflatable actuators as building blocks for prototyping room-scale shape-changing interfaces. Each actuator can change its height from 15cm to 150cm, actuated and controlled by air pressure. Each unit is low-cost (8 USD), lightweight (10 kg), compact (15 cm), and robust, making it well-suited for prototyping room-scale shape transformations. Moreover, our modular and reconfigurable design allows researchers and designers to quickly construct different geometries and to explore various applications. This paper contributes the design and implementation of highly extendable inflatable actuators, and demonstrates a range of scenarios that can leverage this modular building block.

ExpandFab: Fabricating Objects Expanding and Changing Shape with Heat

Full paper, 10 min talk; also presented as a Demo

  • Hiroki Kaimoto (Shonan-Fujisawa, Japan)
  • Junichi Yamaoka (University of Tokyo, Japan)
  • Satoshi Nakamaru (Keio University, Japan & the University of Tokyo, Japan)
  • Yoshihiro Kawahara (University of Tokyo, Japan)
  • Yasuaki Kakehi (University of Tokyo, Japan)

ExpandFab is a fabrication method for creating expanding objects using foam materials. The printed objects change their shape and volume, which is advantageous for reducing printing time and transportation costs. To fabricate expanding objects, we investigated the basic principle governing the expansion rate and developed materials by mixing a foam powder and an elastic adhesive. Furthermore, we developed a fabrication method using these foam materials. A user designs expanded objects in our design software and sets the expansion areas on the surface. The software simulates the expansion and exports the 3D model to a 3D printer, which prints the expandable object by curing it with ultraviolet light. Finally, the user heats the printed objects, and they expand to a maximum of approximately 2.7 times their original size. ExpandFab allows users to prototype products that expand and morph into various shapes, such as objects changing from one shape to various shapes and functional prototypes with electronic components. In this paper, we describe the basic principle of this technique, the implementation of the software and hardware, application examples, limitations and discussion, and future work.

Wearable Bits: Scaffolding Creativity with a Prototyping Toolkit for Wearable E-textiles

Full paper, 10 min talk; also presented as a Demo

  • Lee Jones (Carleton University, Canada)
  • Sara Nabil (Carleton University, Canada)
  • Amanda McLeod (Carleton University, Canada)
  • Audrey Girouard (Carleton University, Canada)

Smart garment and wearable e-textile prototypes are difficult to co-design because of the variety of expertise needed (garment design, sewing skills, hardware prototyping, and software programming). To help with this, we developed a toolkit for prototyping wearable e-textiles, named Wearable Bits, which enables co-design with non-expert users without demanding sewing, hardware or software skills. We developed a low-fidelity and medium-fidelity experience prototype of the toolkit and ran a series of workshops where non-expert users designed their own e-textile wearables using Wearable Bits. In this paper, we discuss the ideas they developed, their construction techniques, the roles individuals took on while building, and suggestions for future toolkits.

Level 0 Room 405

Drinks and snacks

Level 0 Foyer

Session chair: Kate Hartman (OCAD University, Canada)

ROOD: Unpacking the Design and the Making of a RoadKill Alert System

Pictorial, 13 min talk

  • Jung-Ying (Lois) Yi (RMIT University, Australia)
  • Rohit Ashok Khot (RMIT University, Australia)

The rapid growth of peri-urbanization has contributed to an increase in road traffic near wildlife, contributing to collisions, injuries and potential deaths of wildlife as well as commuters. This is a significant problem for both animal welfare and human roadside safety, and limited work has been done to tackle it using technology. ROOD is an innovative roadkill alert system which aims to mitigate wild animal-vehicle collisions by alerting drivers of high-risk areas well in advance. This pictorial describes the design and the making of ROOD. We illustrate the challenges encountered in our design process and how they informed our design decisions. Ultimately, our aim is to contribute to the safe co-habitation of wildlife and humans.

Affordances Based on Traces of Use in Urban Environments

Pictorial, 13 min talk

  • Linda Hirsch (LMU Munich, Germany)
  • Beat Rossmy (LMU Munich, Germany)
  • Florian Bemmann (LMU Munich, Germany)
  • Andreas Butz (LMU Munich, Germany)

Traces of use in public environments show the behaviour patterns of the masses. Taking advantage of this quality, we want to use such traces as a design tool to indicate possible interactions in, for example, newly built areas while keeping a natural and calm environment. Due to the current lack of knowledge about such traces, this work aims at understanding the perception of traces of use in public places. We therefore collected a total of 182 pictures of traces of use in urban environments. A focus group discussed and classified a preselected set of pictures. In an online picture viewing survey, 18 different pictures were reviewed for pattern identification (N = 32-52). Overlaps were visualized in heatmaps. We contribute an analysis of which public traces of use are recognized easily and with high agreement, and which are not.

SWAN: Designing a Companion Spoon for Mindful Eating

Pictorial, 13 min talk

  • Rohit Ashok Khot (RMIT University, Australia)
  • Jung-Ying (Lois) Yi (RMIT University, Australia)
  • Deepti Aggarwal (RMIT University, Australia)

In this pictorial, we unfold and reflect on the design process behind the creation of a research product, SWAN. SWAN is an augmented spoon that encourages people to pay more attention to their food and urges them to eat mindfully. With SWAN, our aim is to address the increasing tensions between the lucrative appeal of screen-based media and the ideologies of mindful eating. We present a descriptive account of how we brought SWAN into being. In attending to key design decisions across our design process, we unveil ideas and challenges in creating a domestic research product to support everyday mindful eating.

GustaCine: Towards Designing a Gustatory Cinematic Experience

Pictorial, 13 min talk

  • Rohit Ashok Khot (RMIT University, Australia)
  • Jung-Ying (Lois) Yi (RMIT University, Australia)

In this pictorial, we describe the design and making of GustaCine, an engaging crossmodal cinematic experience that allows the viewer to experience and savor cinematic moments through the gustatory pleasures of differently flavored popcorn. Our aim behind creating GustaCine is to explore possibilities in designing food experiences that ‘move’ in parallel with the moving picture. We articulate our making process, enumerating the challenges in creating a gustatory experience and the strategies undertaken to resolve them. Ultimately, we aim to inspire and guide future research on gustatory cinematic experiences.

Always with Me: Exploring Wearable Display as a Lightweight Intimate Communication Channel

Pictorial, 13 min talk

  • Pradthana Jarusriboonchai (University of Lapland, Finland)
  • Hong Li (University of Lapland, Finland)
  • Emmi Harjuniemi (University of Lapland, Finland)
  • Heiko Müller (University of Lapland, Finland)
  • Jonna Häkkilä (University of Lapland, Finland)

Simple and low-bandwidth communication on computers has been found to promote intimacy between couples. In this work, we further explore this minimal communication in the form of wearables. This pictorial presents an in-the-wild concept study of low-bandwidth ambient wearable displays as a communication channel between couples. The goal is to understand the contexts in which the technology might be used and provide benefit. Our findings show that simple communications through a wearable device could provide an additional channel for communication, and the wearable form factor also creates the feeling of being always connected. We highlight the importance and influence of form factors, contexts, and activities on user experience. We discuss the opportunities this study opens for the future design of wearable ambient displays.

Modalities of Expression: Capturing Embodied Knowledge in Cooking

Pictorial, 13 min talk

  • Sharon Baurley (Royal College of Art, UK)
  • Bruna Beatriz Petreca (Royal College of Art, UK)
  • Mark Selby (Independent, Netherlands)
  • Paris Selinas (Independent, UK)
  • Martin Flintham (University of Nottingham, UK)

When cooking, we negotiate between instructions in recipes and personal preferences to make in-the-moment creative decisions. This process represents moments of creativity that utilise and reveal our embodied knowledge. This paper focuses on the capture of expressions of embodied knowledge by digitally-networked utensils. We present a design process investigating tangible interfaces to capture and communicate embodied knowledge, as a proposition for recipe-authoring tools for open innovation in food. We reflect upon this process to discuss lessons about the individual nature of embodied knowledge and its expression, and the context of capturing it, and to make design recommendations.

Level 0 Room 405

Announcement of the Student Design Challenge winners.

PULSY — Because Every Pulse Matters: A Budget-Friendly Pulse Oximeter

Student Design Challenge submission

  • Saish Gosavi (Indian Institute of Technology (IIT) Guwahati, India)
  • Pragati Saraf (Indian Institute of Technology (IIT) Guwahati, India)
  • Arzoo Khare (Indian Institute of Technology (IIT) Guwahati, India)

Diseases come irrespective of income, but treatment and devices don’t. Developing nations like India face the problem of a large, poor population that needs medical care. ‘Pulsy’ is a low-cost and portable pulse oximeter used for measuring one’s heart rate and blood oxygen saturation. Oximeters are in great demand because conditions like pregnancy, anemia, pneumonia, pulmonary disorders, asthma, and lung cancer demand frequent monitoring of heart rate and blood oxygen saturation; maintaining oxygen saturation over 95% in all activities is the goal. Oximeters available in the market are extremely expensive, and people who are unable to purchase them are left with the sole option of making frequent visits to the hospital for a check-up, which becomes a huge drain on time and money. The objective of ‘Pulsy’ is to simplify the hardware and provide the same result and accuracy at a substantially reduced cost. Another aspect of ‘Pulsy’ is its integration with the Indian network of Accredited Social Health Activists (ASHA). The goal is to make the device available to a larger sector of the population so that they are not left unscreened, while reducing the burden of manually recording data and making patient data easily and quickly available to doctors and other healthcare workers.

Mixed Reality for Stress Relief

Student Design Challenge submission

  • Ashita Soni (Indiana University Bloomington, USA)
  • Shriyash Shete (Indiana University Bloomington, USA)

Studies have shown that a majority of workers experience stress in the workplace. We envision a product that uses Mixed Reality to help workers relieve stress by changing their immediate vicinity. The design is grounded in our research insight that people use varying activities to fight stress, which helps them switch context and do something cognitively less demanding. This is achieved through a wearable headset that would help wearers change the reality in their vicinity at the workplace whenever stress levels are high.

Tiglo: Inhabiting Toys Through Expression Mirroring

Student Design Challenge submission

  • Cletus V. Joseph (National Institute of Design Gandhinagar Gujarat, India)

One of the superpowers that children possess is the power of imagination. Imagination can be said to be a tool that aids cognitive development and intellectual thinking, and it comes to the foreground when children play with their toys. It is often seen that children talk to their toys, a sign of avid imagination manifesting in their play. Tiglo aims to let parents inhabit their child’s toys through expression mirroring. Parents can be part of their child’s play by taking over the expressions of one of the toys and giving it a personality similar to their own. Assuming characters and superimposing them with the parents’ expressions and the child’s imagination can help parents better understand the mental state of the child, and they can help out if the child shows signs of stress.

Facial Image Browsing Simulation for Selfies

Student Design Challenge submission

  • Ralph Kenke (University of Newcastle, Australia)
  • Elmar Trefz (Semnon, Australia)
  • Ardrian Hardjono (Independent, Australia)

Qualitative data such as photo libraries and image archives on smartphones or cloud storage risk becoming overwhelming and inaccessible to their users, due to their large volume and increasing growth rate. Digital tools such as searchable labels or tags can help organise such databases; for example, hashtags on the social media platform Instagram enable the organisation of images by applying personalised tags to them. Images that are shared online retain personal data, which poses, in some cases, ethical challenges. The Participatory Media Installation Selfie Flaneur explores both the volume of image data and how we can design experiences to view or visualise its content, as well as the benefits and risks that such applications pose in terms of our personal data.

Audio Cells: A Spatial Audio Prototyping Environment for Human-Robot Interaction

Student Design Challenge submission

  • Frederic Anthony Robinson (University of New South Wales, Australia)

Rich and elaborate communication will play an essential part in the success of social robotics. The role of non-verbal sound as a communication channel has received relatively little attention in human-robot interaction research so far. Audio Cells is a prototyping environment for spatial sonic interaction design. It is part of my ongoing work on the sonification of artificial bodies. Through a combination of modular loudspeakers and sensors, it allows the exploration of a range of spatial interactive audio effects. By bringing methods from the fields of sound design and spatial audio into the context of human-robot interaction, my research aims to enrich and refine the ways robotic agents and responsive structures communicate with humans.

Level 0 Room 405

Drinks and snacks

Level 0 Foyer


Tuesday 11 February 2020

All morning activities on Tuesday take place at UTS Building 11. After lunch, activities take place at the University of Sydney.

Panellists

  • Hiroshi Ishii (MIT Media Lab, USA)
  • Caroline Hummels (Eindhoven University of Technology, the Netherlands)
  • Andrew Kun (University of New Hampshire, USA)
  • Elise van den Hoven (University of Technology Sydney, Australia & Eindhoven University of Technology, the Netherlands)

Moderator

  • Jelle van Dijk (University of Twente, the Netherlands)

Before the panel, we have scheduled a talk by Hiroshi Ishii.

This panel will reflect on seminal TEI work and suggest, through conversation with the audience, paths forward – aspirations for TEI practice, design and research. The panellists will discuss the role of vision-driven, and convergence-based research as driving forces for TEI innovations, which foster healthier, more fulfilling, and sustainable lives for individuals, communities, and society.

Level 0 Room 405

Drinks and snacks

Level 0 Foyer

Session chair: Baki Kocaballi (Macquarie University, Australia)

Snap-Snap T-Shirt: Posture Awareness Through Playful and Somaesthetic Experience

Pictorial, 13 min talk; also presented as a Demo

  • Svetlana Mironcika (Eindhoven University of Technology, Netherlands)
  • Annika Hupfeld (Eindhoven University of Technology, Netherlands)
  • Joep Frens (Eindhoven University of Technology, Netherlands)
  • Jessica Asjes (Nike Inc.)
  • Stephan Wensveen (Eindhoven University of Technology, Netherlands)

In this pictorial we present a prototype of a novel personalized garment that provides rich haptic feedback for posture awareness in the context of repetitive strain injury (RSI). Unlike prior work concerned with posture correction, our aim was to design a garment that allows the user to gain awareness of their posture with the help of sensorial experiences. Collaboratively engaging the user as a co-designer in movement enactment, movement analysis and embodied co-design sessions enabled us to design a garment that offers posture awareness through playful and somaesthetic experience. We offer a reflective analysis of how our co-design approach enabled us to design for personalization, somaesthetics and playfulness in posture awareness.

Chronic Pain Scales in Tangible Materials

Pictorial, 13 min talk

  • Christina Fyhn (Department of Design and Communication, Denmark)
  • Jacob Buur (Design and Communication, Denmark)

For chronic pain patients, it is a challenge to communicate what their pain feels like, both to friends and relatives and to healthcare professionals. Traditionally, doctors employ pain scales (numbers, standardised words, images of facial expressions), but such scales can be challenging for patients, as chronic pain is experienced individually. They are also difficult to relate to for relatives and professionals who do not have personal experience of chronic pain. In this pictorial, we present a series of design explorations with tangible materials that offer an alternative way for chronic pain patients to express pain. In collaboration with six patients, we identify eight different types of pain experiences and the material metaphors that the patients may use to express them. We also develop three examples of tailored design artefacts, pain communicators, that can function as ‘tangible pain scales’ to express pain experiences.

Training Body Awareness and Control with Technology Probes: A Portfolio of Co-Creative Uses to Support Children with Motor Challenges

Pictorial, 13 min talk

  • Laia Turmo Vidal (Department of Informatics and Media, Sweden)
  • Elena Márquez Segura (Department of Informatics and Media, Sweden)
  • Luis Parrilla Bel (Department of Informatics and Media, Sweden)
  • Annika Waern (Uppsala University, Sweden)

Technology probes have been used in co-creative embodied design processes to spur creativity and generate design ideas. We present a range of Training Technology Probes (TTPs), designed to facilitate collocated physical and social training. We deployed them in the context of a physical training course for children with motor challenges, where they were tested and iterated onsite through our participants’ (instructors’ and children’s) creative uses. We report on intended and expected uses as well as improvised ones, resulting from creative appropriations that were found useful in our physical activity. We discuss core properties of the TTPs that supported productive creative appropriation. This work adds to other work on technology probes by emphasizing their generative value in goal-oriented somatic explorations and practices, like our training course.

Peace: Projecting Dual-Identities on Interactive Furniture

Pictorial, 13 min talk; also presented as a Demo

  • Sara Nabil (Newcastle University, UK & Carleton University, Canada)
  • Richard MacLeod (Fouadisms Ltd, UK)

The world is promoting inclusion and diversity more than ever before. Many people have dual-identities that they alternate between and may often blend. In our design research we explore everyday objects and the role of technology to accommodate people’s needs and personalities. Can furniture change its shape to reflect our dual-identities? Can our interior spaces reveal their hidden aesthetics when interacting with us? We designed a set of matching interactive furniture to unfold these narratives. Our Peace Table and Peace Painting change colour with proximity to reflect the dual identity of Western-Muslims. This pictorial describes our design concept and process with the aim of encouraging the HCI community to design for experiential artwork. Such interactivity can enrich and add new dimensions to the quality of living experience by merging technology into home decor in calm, ubiquitous and non-intrusive ways.

Encasing Computation: Using Digital Fabrication Approaches to Make Microcontrollers Wearable

Pictorial, 13 min talk

  • Kate Hartman (OCAD University, Canada)
  • Chris Luginbuhl (OCAD University, Canada)
  • Yiyi Shao (OCAD University, Canada)
  • Ricardo Toller Correia (Federal University of Rio Grande do Sul (UFRGS), Brazil & OCAD University, Canada)

This paper introduces initial attempts to bridge the worlds of digital fabrication and do-it-yourself wearable electronics. It introduces a selection of microcontrollers that are anticipated to work well in a wearable context and provides an overview of five prototypes: Folding Felt Photon, Photon Sleepers, Circuit Playground Aurora Hat, Feather Belt, and Feather Shoes. These prototypes use laser cutting and/or 3D printing to produce microcontroller enclosures that can be worn on the body. Rigid and flexible materials are used alone and in combination to achieve qualities such as conformability, comfort, and device protection. Digital fabrication techniques facilitate rapid and repeatable production of prototypes for testing while allowing precise modification of fit, material thickness and machine settings. The intent is to demonstrate this approach and to share initial designs for digitally fabricated encasements that allow researchers, designers, and artists to better integrate small computational systems into clothing or other wearables.

Designing a Wearable Soft-Robotic Support System: A Body-Centered Approach

Pictorial, 13 min talk

  • Rahel Flechtner (University of the Arts Berlin, Germany)
  • Katharina Lorenz (Design Research Lab at the University of the Arts, Germany)
  • Gesche Joost (University of the Arts, Germany)

Given the extension of working life and the large number of musculoskeletal disorders associated with occupational activities, wearable assistive technology could help enable workers to carry out their profession for as long as possible and necessary. In this pictorial we describe our design process towards a wearable soft-robotic orthosis and illustrate a body-centered design approach that involves the human body throughout the different stages of the project and takes advantage of its abilities to specifically address the challenges in the development of wearable technology.

Level 0 Room 405

Meal and drinks

Level 0 Foyer

Please note that the afternoon program continues at the University of Sydney.

Also, the TEI Steering Committee lunch is held at this time.

NURBSforms: A Modular Shape-Changing Interface for Prototyping Curved Surfaces

Demo based on full paper

  • Yasaman Tahouni (MIT CSAIL, USA)
  • Isabel P. S. Qamar (MIT CSAIL, USA)
  • Stefanie Mueller (MIT CSAIL, USA)

We present NURBSforms: a modular shape-changing interface for prototyping curved surfaces. Each NURBSform module represents an edge of variable curvature that, when joined together with other modules, enables designers to construct surfaces and adjust their curvature interactively. NURBSform modules vary their curvature using active and passive shape memory materials: an embedded shape memory alloy (SMA) wire increases the curvature when heated, while an elastic material recovers the flat shape when the SMA wire cools down. A hall effect sensor on each module allows users to vary the curvature by adjusting the distance of their hand. In addition, NURBSforms provides functions across multiple modules, such as ‘save’, ‘reset’, and ‘load’, to facilitate design exploration. Since each module is self-contained and individually controllable, NURBSform modules scale well and can be connected into large networks of curves representing various geometries. By giving examples of different NURBSforms assemblies, we demonstrate how the modularity of NURBSforms, together with its integrated computational support, enables designers to quickly explore different versions of a shape in a single integrated design process.

Fabricatable Machines: A Toolkit for building Digital Fabrication Machines

Demo based on full paper

  • Frikk H Fossdal (Western Norway University of Applied Sciences, Norway)
  • Jens Dyvik (Fellesverkstedet, Norway)
  • Jakob Anders Nilsson (Norlink Consulting, Norway)
  • Jon Nordby (Bitraf, Norway)
  • Torbjørn Nordvik Helgesen (Nordvik Manifestation, Norway)
  • Rogardt Heldal (Western Norway University of Applied Sciences, Norway)
  • Nadya Peek (University of Washington, USA)

Digital fabrication is changing the way we design and manufacture the objects around us. Digital fabrication machines enable mass-customisation. However, customising the machines themselves requires a high amount of expertise, which prevents even advanced users from taking part in the creation of bespoke fabrication tools. We present Fabricatable Machines, an open-source toolkit for designing custom fabrication machines. We designed a linear motion module, The Fabricatable Axis, that provides robust automated linear motion. The Fabricatable Axis can be resized, adjusted, and fabricated from different materials. Users can build machines by combining multiple axes. We optimised the design of the axis to be manufactured using a CNC mill, with few externally sourced parts. We observed users creating machines including portable milling machines, 3D printers, and pipe inspection robots using the Fabricatable Machines Toolkit.

Tangible Music Programming Blocks for Visually Impaired Children

Demo based on full paper

  • Alpay Sabuncuoğlu (Koç University, Turkey & Twin Science and Robotics Lab, Turkey)

Programming can benefit children in learning science, math, and creative thinking, and has become a part of the primary school curriculum. However, programming tools for visually impaired children are still scarce. We developed an affordable and accessible tangible music platform for visually impaired children that aims to teach the basics of programming through music creation. By ordering the tangible blocks in an algorithmic structure, the children can create a melody. The physical and conceptual design of the system was developed with the help of visually impaired developers. We conducted a user study with fourteen visually impaired middle school children to observe their interactions with the prototype. In this paper, we present our design, provide several TUI design considerations for students with low to zero sight, and discuss the results of our user study and future directions.

SPIN (Self-powered Paper Interfaces): Bridging Triboelectric Nanogenerator with Folding Paper Creases

Demo based on full paper

  • Chris Chen (Georgia Tech, USA)
  • David E Howard (Georgia Tech, USA)
  • Steven Zhang (Georgia Tech, USA)
  • Youngwook Do (Georgia Tech, USA)
  • Sienna Xin Sun (Georgia Tech, USA)
  • Tingyu Cheng (Human Computer Interaction Institute, USA)
  • Zhong Lin Wang (Georgia Tech, USA)
  • Gregory D Abowd (Georgia Tech, USA)
  • HyunJoo Oh (Georgia Tech, USA)

We present Self-powered Paper INterfaces (SPIN), combining folding paper creases with triboelectric nanogenerators (TENGs). Embedding TENGs into paper creases, we developed a design editor and a set of fabrication techniques to create paper-based interfaces that power sensors and actuators. Our SPIN design editor enables users to design their own crease pattern by changing parameters, embed power-generating modules into the design, estimate total power generation, and export the files. Following the fabrication instructions, users can then cut and crease materials and assemble them to build their own interfaces. We employ repetitive push-and-pull embodied interactions with the mechanism of paper creases and demonstrate four application examples that show new expressive possibilities across different scales of embodied interaction.

Japanese Patterns as NFC Antennas for Interactive Urushi-ware

Demo based on full paper

  • Keita Saito (University of Tsukuba, Japan)
  • Takuto Nakamura (University of Tsukuba, Japan)
  • Kazushi Kamezawa (University of Tsukuba, Japan)
  • Ryo Ikeda (University of Tsukuba, Japan)
  • Yuki Hashimoto (University of Tsukuba, Japan)
  • Buntarou Shizuki (University of Tsukuba, Japan)

Urushi (Japanese lacquer) is a natural resin paint with electrical insulating capability. We focused on the Japanese patterns on urushi-ware, products coated with urushi. To make urushi-ware interactive without losing its elegance and beauty, we transformed Japanese patterns into near-field communication (NFC) antenna patterns. We developed three types of prototype antennas and confirmed their functionality. In addition, we developed an IC key tag and an interactive lunch box as example applications of interactive urushi-ware.

Transient Relics: Temporal Tangents to an Ancient Virtual Pilgrimage

Demo based on full paper; also presented as a talk

  • Angelo Fraietta (UNSW, Australia)

This paper examines creating a temporary virtual relic through the creation of an interactive soundscape in the context of a religious pilgrimage known as the Stations of the Cross. The paper examines the history of the rite and its transformation from a physical pilgrimage to a virtual one. It examines the phenomenon of iconic relics, which in some cases have a reckoned value equivalent to that of the physical objects they represent. It also examines both the conceptual and legal implications of embodying sound into tangible objects, resulting in their treatment as protected relics. Finally, it describes the creation of an artwork whereby religious pilgrims manipulate interactive sonic balls that communicate with other networked sonic devices in an attempt to correlate metaphors of human behaviours—such as play, humiliation, and mobs—into a sonic relic of the historical narrative of Christ taunted by Roman soldiers.

ForceStamps: Fiducial Markers for Pressure-sensitive Touch Surfaces to Support Rapid Prototyping of Physical Control Interfaces

Demo based on full paper; also presented as a talk

  • Changyo Han (University of Tokyo, Japan)
  • Katsufumi Matsui (University of Tokyo, Japan)
  • Takeshi Naemura (University of Tokyo, Japan)

We present ForceStamps, fiducial markers for supporting rapid prototyping of physical control interfaces on pressure-sensitive touch surfaces. We investigate marker design options for supporting various physical controls, focusing on creating dedicated footprints and maintaining structural stability. ForceStamps can be persistently tracked on surfaces along with force information and other attributes. Designers without knowledge of electronics can rapidly prototype physical controls by attaching mechanisms to ForceStamps, while manipulating the haptic feedback with buffer materials. The created control widgets can be spatially configured on the touch surface to make an interface layout. We showcase a variety of example controls created with ForceStamps. In addition, we report on our analysis of a two-day musical instrument design workshop to explore the affordances of ForceStamps for making novel instruments with diverse interaction designs.
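
To make the idea of force-tracked markers driving control widgets concrete, here is an illustrative sketch only; the event fields and thresholds are hypothetical, not the ForceStamps API:

```python
# Mapping per-marker force readings from a pressure-sensitive surface to
# control-widget values (hypothetical data model).

from dataclasses import dataclass

@dataclass
class MarkerEvent:
    marker_id: int
    x: float        # surface position (mm)
    y: float
    force: float    # normal force (N)

def widget_value(event: MarkerEvent, kind: str) -> float:
    if kind == "button":            # fires once force exceeds a threshold
        return 1.0 if event.force > 1.5 else 0.0
    if kind == "pressure_fader":    # continuous value from force, clamped to 0..1
        return max(0.0, min(1.0, event.force / 10.0))
    raise ValueError(kind)

print(widget_value(MarkerEvent(3, 120.0, 80.0, 4.2), "pressure_fader"))  # 0.42
```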

BulkScreen: Saliency-Based Automatic Shape Representation of Digital Images with a Vertical Pin-Array Screen

Poster based on Work-in-Progress

  • Riku Arakawa (The University of Tokyo, Japan)
  • Yudai Kent Tanaka (The University of Tokyo, Japan)
  • Hiromu Kawarasaki (The University of Tokyo, Japan)
  • Kiyosu Maeda (The University of Tokyo, Japan)

Digital images appearing on displays in everyday activities (e.g., photos on a smartphone) are automatically and instantly rendered without manual intervention, such that we can seamlessly appreciate them. In contrast, shape displays require manual design of outputs upon actuation of input images to render 3D shapes. In this work, we aim to achieve automatic and on-the-spot actuation of digital images so that we can seamlessly see 3D physical images. To this end, we developed BulkScreen, an image projection system that can automatically render 3D shapes of input images on a vertical pin-array screen. Our approach is based on deep-neural-network saliency estimation coupled with our post-processing algorithm. We believe this spontaneous actuation mechanism facilitates applications with shape displays, such as real-time picture browsing and display advertisement, building on the benefit of representing physical shapes: tangibility.
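
A minimal sketch of this kind of pipeline, under assumptions (the pooling-and-scaling step below stands in for the authors' post-processing; the saliency map could come from any estimator):

```python
# Turn a saliency map into target heights for a pin-array screen by pooling
# the map down to the pin grid and scaling to the pins' travel range.

import numpy as np

def saliency_to_pin_heights(saliency, grid=(16, 16), max_height_mm=40.0):
    """saliency: 2D array with values in [0, 1]."""
    h, w = saliency.shape
    gh, gw = grid
    hh, ww = h - h % gh, w - w % gw   # crop so the map divides evenly
    pooled = saliency[:hh, :ww].reshape(gh, hh // gh, gw, ww // gw).mean(axis=(1, 3))
    return pooled * max_height_mm     # one height per pin, in millimetres

heights = saliency_to_pin_heights(np.random.rand(240, 320))
print(heights.shape)  # (16, 16)
```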

Cushion Interface for Smart Home Control

Demo based on Work-in-Progress

  • Yuri Suzuki (Keio University, Japan)
  • Kaho Kato (Keio University, Japan)
  • Naomi Furui (Keio University, Japan)
  • Daisuke Sakamoto (Hokkaido University, Japan)
  • Yuta Sugiura (Keio University, Japan)

In this research, we present a cushion interface for operating smart home applications. We developed a gesture recognition system using convolutional neural networks and embedded acceleration sensor arrays in the cushion cover. To evaluate the system, we conducted experiments and measured recognition accuracy.
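
A sketch of the kind of model such a system could use (sensor count, window length, and gesture set below are placeholders, not the authors' configuration):

```python
# A small 1D CNN that classifies gestures from windows of cushion-embedded
# accelerometer data (illustrative architecture only).

import torch
import torch.nn as nn

N_CHANNELS = 8     # accelerometer channels (placeholder)
WINDOW = 128       # samples per classification window
N_GESTURES = 6     # e.g. press, squeeze, swipe left/right, shake, rest

model = nn.Sequential(
    nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(64, N_GESTURES),
)

logits = model(torch.randn(4, N_CHANNELS, WINDOW))  # a batch of 4 windows
print(logits.shape)  # torch.Size([4, 6])
```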

“… and we are the creators!” Technologies as Creative Material

Demo based on Work-in-Progress

  • Sarah Matthews (University of Queensland, Australia)
  • Marie A Boden (University of Queensland, Australia)
  • Stephen Viller (University of Queensland, Australia)

Tangible embedded technology kits are increasingly being used in schools, often as a means of providing students a platform for problem solving and computational thinking. When they are incorporated in creative tasks such as open-ended design projects, embedded technologies take on the role of a design material – a medium for exploration, iteration and creation. This paper presents some early results of a video analysis of school children’s collaborative interactions with tangible, embedded technologies in an open-ended design task. We identify some of the difficulties students encounter and some of the practices they develop with these kits as they work to progress their designs. Our findings detail how children deal with the opacity of the system and how they use it as a springboard for imagination. Our study provides an opportunity to reflect on how technology kits currently resist becoming a design material.

ambienBeat: Wrist-worn Mobile Tactile Biofeedback for Heart Rate Rhythmic Regulation

Demo based on full paper; also presented as a talk

  • Kyung Yun Choi (MIT, USA)
  • Hiroshi Ishii (MIT, USA)

We present a wrist-worn mobile heart rate regulator — ambienBeat — which provides closed-loop biofeedback via tactile stimuli based on the user's heart rate (HR). We applied the principle of physiological synchronization via touch to achieve our goal of effortless regulation of HR, which is tightly coupled with mental stress levels. ambienBeat provides various patterns of tactile stimuli, which mimic the feeling of a heartbeat pulse, to guide the user's HR to resonate with its rhythmic, tactile patterns. The strength and rhythmic patterns of tactile stimulation are controlled to a level below the cognitive threshold of an individual's tactile sensitivity on their wrist so as to minimize task disturbance. Here we present an acoustically noiseless soft voice-coil actuator to render the ambient tactile stimulus, and describe the system and implementation process. We evaluated our system by comparing it to ambient auditory and visual guidance. Results from the user study show that the tactile stimulation was effective in guiding the user's HR to resonate with ambienBeat to either calm or boost the heart rate using minimal cognitive load.
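
A conceptual sketch of closed-loop pacing (our reading of the idea, not ambienBeat's controller; the offset limit is an assumed parameter):

```python
# The tactile pulse rate tracks the measured heart rate but is nudged a few
# bpm toward a target, so the stimulus still feels like a plausible heartbeat.

def next_tactile_rate_bpm(measured_hr, target_hr, max_offset=6.0):
    offset = max(-max_offset, min(max_offset, target_hr - measured_hr))
    return measured_hr + offset

print(next_tactile_rate_bpm(measured_hr=92, target_hr=70))  # 86.0 -> calming
print(next_tactile_rate_bpm(measured_hr=60, target_hr=75))  # 66.0 -> boosting
```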

LiftTiles: Constructive Building Blocks for Prototyping Room-scale Shape-changing Interfaces

Demo based on full paper; also presented as a talk

  • Ryo Suzuki (University of Colorado Boulder, USA)
  • Ryosuke Nakayama (Keio University, Japan)
  • Dan Liu (University of Colorado Boulder, USA)
  • Yasuaki Kakehi (University of Tokyo, Japan)
  • Mark D Gross (University of Colorado Boulder, USA)
  • Daniel Leithinger (University of Colorado Boulder, USA)

Large-scale shape-changing interfaces have great potential, but creating such systems requires substantial time, cost, space, and effort, which hinders the research community from exploring interactions beyond the scale of human hands. We introduce modular inflatable actuators as building blocks for prototyping room-scale shape-changing interfaces. Each actuator can change its height from 15 cm to 150 cm, actuated and controlled by air pressure. Each unit is low-cost (8 USD), lightweight (10 kg), compact (15 cm), and robust, making it well-suited for prototyping room-scale shape transformations. Moreover, our modular and reconfigurable design allows researchers and designers to quickly construct different geometries and to explore various applications. This paper contributes to the design and implementation of highly extendable inflatable actuators, and demonstrates a range of scenarios that can leverage this modular building block.

ExpandFab: Fabricating Objects Expanding and Changing Shape with Heat

Demo based on full paper; also presented as a talk

  • Hiroki Kaimoto (Shonan-Fujisawa, Japan)
  • Junichi Yamaoka (University of Tokyo, Japan)
  • Satoshi Nakamaru (Keio University, Japan & the University of Tokyo, Japan)
  • Yoshihiro Kawahara (University of Tokyo, Japan)
  • Yasuaki Kakehi (University of Tokyo, Japan)

ExpandFab is a fabrication method for creating expanding objects using foam materials. The printed objects change their shape and volume, which is advantageous for reducing printing time and transportation costs. To fabricate expanding objects, we investigated the basic principle of the expansion rate and developed materials by mixing a foam powder with an elastic adhesive. Furthermore, we developed a fabrication method using these foam materials. A user can design expandable objects using our design software and set the expansion areas on the surface. The software simulates the expansion and exports the 3D model to a 3D printer, which prints the expandable object by curing it with ultraviolet light. Finally, the user heats the printed objects, and the objects expand to a maximum of approximately 2.7 times their original size. ExpandFab allows users to prototype products that expand and morph into various shapes, such as objects that change from one shape to various shapes, and functional prototypes with electronic components. In this paper, we describe the basic principle of this technique, the implementation of the software and hardware, application examples, limitations and discussion, and future work.
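
A back-of-the-envelope sketch of the design implication (not from the paper): if the expansion factor of the cured material is known, the pre-expansion print size follows directly. Whether the reported factor is linear or volumetric depends on the material characterization, so it is treated here as a parameter to be measured.

```python
# Compute the size at which to print so the heated part reaches a target size,
# assuming a known linear expansion factor.

def print_size_mm(target_size_mm, linear_expansion_factor):
    return target_size_mm / linear_expansion_factor

print(print_size_mm(100.0, 2.7))  # print ~37 mm to obtain ~100 mm after heating
```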

Be active! Participatory Design of Accessible Movement-Based Games

Demo based on full paper; also presented as a talk

  • Georg Regal (AIT Austrian Institute of Technology GmbH, Austria)
  • David Sellitsch (AIT Austrian Institute of Technology, Austria)
  • Simone Kriglstein (AIT Austrian Institute of Technology, Austria & University of Vienna, Austria)
  • Simon Kollienz (AIT Austrian Institute of Technology, Austria)
  • Manfred Tscheligi (University of Salzburg & AIT, Austria)

Regular physical exercise is an essential factor for preventing chronic diseases. Activities to support physical education in schools have been increasingly used in recent years as a way to get young people interested in sports. However, for visually impaired students it is difficult to participate in the traditional team sports that are widely played in physical education. To overcome this issue, we developed a design toolkit consisting of building blocks that enable visually impaired students to create and play their own movement-based games. To investigate different types of building blocks and their potential to create accessible movement-based games, we conducted two game design workshops with visually impaired students. The results show that our building blocks can successfully be used by visually impaired students to empower them to become creators of movement-based games that are both accessible and engaging. By making our design process transparent, we further provide insights on how to implement a co-creation process in a school for visually impaired students.

Wearable Bits: Scaffolding Creativity with a Prototyping Toolkit for Wearable E-textiles

Demo based on full paper; also presented as a talk

  • Lee Jones (Carleton University, Canada)
  • Sara Nabil (Carleton University, Canada)
  • Amanda McLeod (Carleton University, Canada)
  • Audrey Girouard (Carleton University, Canada)

Smart garment and wearable e-textile prototypes are difficult to co-design because of the variety of expertise needed (garment design, sewing skills, hardware prototyping, and software programming). To help with this, we developed a toolkit for prototyping wearable e-textiles, named Wearable Bits, which enables co-design with non-expert users without demanding sewing, hardware or software skills. We developed a low-fidelity and medium-fidelity experience prototype of the toolkit and ran a series of workshops where non-expert users designed their own e-textile wearables using Wearable Bits. In this paper, we discuss the ideas they developed, their construction techniques, the roles individuals took on while building, and suggestions for future toolkits.

Embedding Conversational Agents into AR: Invisible or with a Realistic Human Body?

Demo based on full paper; also presented as a talk

  • Jens Reinhardt (HAW Hamburg, Germany)
  • Luca Hillen (HAW Hamburg, Germany)
  • Katrin Wolf (HAW Hamburg, Germany)

Currently, (invisible) smart speech assistants, such as Siri, Alexa, and Cortana, are used by a constantly growing number of people. Moreover, Augmented Reality (AR) glasses are predicted to become widespread consumer devices in the future. Hence, smart assistants can easily become common applications of AR glasses, which allows for giving the assistant a visual representation as an embodied agent. While previous research on embodied agents found a user preference for a humanoid appearance, research on the uncanny valley suggests that simply designed humanoids can be favored over hyper-realistic humanoid characters. In a user study, we compared agents of simple versus more realistic appearance (seen through AR glasses) versus an invisible state-of-the-art speech assistant. Our results indicate that a more realistic visualization is preferred, as it provides additional communication cues, such as eye contact and gaze, which seem to be key features when talking to a smart assistant. But if the situation requires visual attention, e.g., when being mobile or in a multitask situation, an invisible agent can be more appropriate, as it does not distract the visual focus, which can be essential during AR experiences.

DIY Fabrication of High Performance Multi-Layered Flexible PCBs / SoftMod: A Soft Modular Plug-and-Play Kit for Prototyping Electronic Systems

Demo based on Work-in-Progress

  • Mannu Lambrichts (Expertise Center for Digital Media, Belgium)
  • Jose Maria Tijerina Munoz (Expertise Centre for Digital Media, Belgium)
  • Tom De Weyer (Expertise Centre for Digital Media, Hasselt University – tUL – Flanders Make, Belgium)
  • Raf Ramakers (Flanders Make – Expertise Centre for Digital Media, Belgium)

We present a novel DIY fabrication workflow for prototyping highly flexible circuit boards using a laser cutter. As our circuits consist of Kapton and copper, they are highly conductive and thus support high-frequency signals, such as I2C. Key to our approach is a laser machine that supports both a CO2 laser and a fiber laser to precisely process Kapton and copper, respectively. We also show how the laser cutter can cure soldering paste to realize VIAs (Vertical Interconnect Access) and solder components. In contrast, previous approaches for prototyping flexible PCBs through laser cutting only considered CO2 lasers, which cannot process metals. Therefore, these approaches mainly used ink-based conductors that have a significantly higher electrical resistance than copper.

SoftMod: A Soft Modular Plug-and-Play Kit for Prototyping Electronic Systems

Demo based on full paper; also presented as a talk

  • Mannu Lambrichts (Expertise Centre for Digital Media, Belgium)
  • Jose Maria Tijerina Munoz (Expertise Centre for Digital Media, Belgium)
  • Raf Ramakers (Hasselt University, Belgium)

We present SoftMod, a novel modular electronics kit consisting of soft and flexible modules that snap together. Unlike existing modular kits, SoftMod tracks the topology of interconnected modules and supports basic plug-and-play behavior as well as advanced user-specified behavior. As such, the shape of a SoftMod assembly does not depend on the desired behavior, and various 2D and 3D electronic systems can be realized. While the plug-and-play nature of our modules stimulates play, the advanced features for specifying behavior and for making a variety of soft and flexible shapes offer a high ceiling when experimenting with novel types of interfaces, such as wearables and interactive skin and textiles.
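
An illustrative sketch of the topology-to-behavior idea (module names and the rule format are hypothetical, not SoftMod's protocol):

```python
# Track snapped-together modules as a graph and derive behavior: default
# plug-and-play pairings, with user-specified rules taking priority.

connections = {"hub": ["button_1", "led_1", "slider_1", "vibration_1"]}
default_pairs = [("button", "led"), ("slider", "vibration")]    # plug-and-play wiring
user_rules = {("button_1", "vibration_1"): "pulse"}             # overrides defaults

def derive_behavior(connections, default_pairs, user_rules):
    modules = {m for ms in connections.values() for m in ms} | set(connections)
    behavior = dict(user_rules)                                  # user rules win
    for sensor_kind, actuator_kind in default_pairs:
        for s in (m for m in modules if m.startswith(sensor_kind)):
            for a in (m for m in modules if m.startswith(actuator_kind)):
                behavior.setdefault((s, a), "follow")
    return behavior

print(derive_behavior(connections, default_pairs, user_rules))
```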

Peace: Projecting Dual-Identities on Interactive Furniture

Demo based on Pictorial

  • Sara Nabil (Newcastle University, UK & Carleton University, Canada)
  • Richard MacLeod (Fouadisms Ltd, UK)

The world is promoting inclusion and diversity more than ever before. Many people have dual-identities that they alternate between and may often blend. In our design research we explore everyday objects and the role of technology to accommodate people’s needs and personalities. Can furniture change its shape to reflect our dual-identities? Can our interior spaces reveal their hidden aesthetics when interacting with us? We designed a set of matching interactive furniture to unfold these narratives. Our Peace Table and Peace Painting change colour with proximity to reflect the dual identity of Western-Muslims. This pictorial describes our design concept and process with the aim of encouraging the HCI community to design for experiential artwork. Such interactivity can enrich and add new dimensions to the quality of living experience by merging technology into home decor in calm, ubiquitous and non-intrusive ways.

CodeAttach: Engaging Children in Computational Thinking through Physical Play Activities

Poster based on Work-in-Progress

  • Junnan Yu (University of Colorado Boulder, USA)
  • Clement Zheng (University of Colorado, Boulder, USA)
  • Mariana Aki Tamashiro (University of Colorado Boulder, USA)
  • Christopher Gonzalez-millan (University of Colorado Boulder, USA)
  • Ricarose Roque (University of Colorado Boulder, USA)

Many toys and kits have been developed in the past decade to help cultivate computational thinking in young children. However, a majority of these kits ask children to move a robot/character around a limited space, constraining what could otherwise be generative and creative learning experiences into pre-defined activities and challenges with uniform outcomes. How can we expand what children can program and how they can create code? In this work, we present CodeAttach, a learning kit designed to engage young children in computational thinking through physical play activities. CodeAttach consists of three parts: (1) an interactive hardware device, (2) a mobile application to program the device, and (3) supporting materials for different play activities. With CodeAttach, children can attach the device to the supporting materials or other everyday objects to create their own props for physical play. The device offers different inputs and outputs and allows children to change the rules of existing physical activities or create new activities by programming the device. We outline the principles guiding the design of CodeAttach, its initial development process, and insights from early playtests with young kids and expert researchers.

Everyday Life Reflection: Exploring Media Interaction with Balance, Cogito & Dott

Demo based on full paper; also presented as a talk

  • Ine Mols (Eindhoven University of Technology, Netherlands & University of Technology Sydney, Australia)
  • Elise van den Hoven (University of Technology Sydney, Australia & Eindhoven University of Technology, Netherlands)
  • Berry Eggen (Eindhoven University of Technology, Netherlands)

Reflection is of increasing interest in HCI as it has many potential benefits in design, education and everyday life. In this paper, we explore media-supported reflection through the design and deployment of three concepts. In contrast to prevalent reflective approaches that are based on system-collected data, we explore how user-created media can support personal reflection. Three interactive prototypes were developed, focusing on different modalities: Balance uses audio, Cogito uses text, and Dott uses visual media. We evaluate these concepts in an in-thewild study that is both explorative and comparative. We found that the open-ended systems primarily supported reflection during the creation of media and that the use depended on opportunity and triggers. We conclude the paper with a discussion of our findings regarding the method and the implications of our findings for the broader area of design for reflection.

Designed Interactive Toys for Children with Cerebral Palsy

Poster based on Work-in-Progress

  • Yixuan Bian (Xi’an Jiao Tong Liverpool University, China)
  • Xiaoyu Wang (Xi’an Jiaotong Liverpool University, China)
  • Dongchen Han (Xi’an Jiaotong-Liverpool University, China)
  • Jie Sun (Xi’an Jiaotong-Liverpool University, China)

Children with cerebral palsy (CP) need to go through intensive rehabilitation exercises to develop and enhance their fine motor control in daily living. However, most of them cannot persist with the regular, repetitive exercise sessions using traditional tools for a long time. To provide a playful and attractive rehabilitation environment, toys are introduced to motivate children to exercise. This study aims to develop diverse toy modules and combine them with basic LEGO blocks to support various hand and arm functional training. Joyful colors, cartoon animals, and visual and audio feedback are proposed to increase the modules’ attractiveness. Their interchangeable handles and knobs can support different levels of exercise, which improves the toy modules’ accessibility for children with CP. Preliminary user testing in the hospital suggests that the toys are warmly welcomed and easy to manipulate and play with. We plan to collect user performance data and track the toys’ long-term effect in rehabilitation training to aid therapists in evaluating individual recovery progress.

Haptic feedback in running: Is it possible for information transfer through electrical muscle signalling?

Demo based on Work-in-Progress

  • Kevin Lu (Industrial Design, Netherlands)
  • Aarnout Brombacher (Eindhoven University of Technology, Netherlands)

Designing Tangible Tools for the Creation of Personalized Visits by Museum Professionals

Poster based on Work-in-Progress

  • Stéphanie REY (Univ. Bordeaux, France & Berger-Levrault, France)
  • Nadine Couture (Estia Institute of Technology, France)
  • Celia Picard (University Toulouse, France)
  • Christophe Bortolaso (Berger-Levrault, France)
  • Mustapha Derras (Berger-Levrault, France)
  • Anke M. Brock (University Toulouse, France)

Museums aim at offering personalized visits to encourage visitors to visit more than once. Few approaches consider the specific skills of museum professionals when designing tools for this purpose. We conducted a three-step iterative and user-centered design process with 13 museum professionals from six museums. This analysis led us to a main finding: the most complicated task for museum professionals is to explore their design space, composed of all possible visitor profiles for which to create visits. We propose a visualization for this multidimensional design space and six potential interactions on this representation. In an exploration space, we classify them on two axes: the selection approach and the type of interface (GUI versus TUI). We analyze their benefits and limits and, based on a pilot study, we propose insights and questions for future design.

Build Your Own Hercules: Helping Visitors Personalize Their Museum Experience

Poster based on Work-in-Progress

  • Stéphanie REY (Univ. Bordeaux, France & Berger-Levrault, France)
  • Celia Picard (University Toulouse, France)
  • Yanis FATMI (University Toulouse, France)
  • Fanny Franco (University Toulouse, France)
  • Sarah Guilbert (University Toulouse, France)
  • Jérémy MANÉRÉ (University Toulouse, France)
  • Christophe Bortolaso (Berger-Levrault, France)
  • Mustapha Derras (Berger-Levrault, France)
  • Nadine Couture (Estia Institute of Technology, France)
  • Anke M. Brock (University Toulouse, France)

Museums compete with the entertainment industry to attract a large audience. One solution to make them more attractive is to personalize visits according to visitors’ preferences. Following a user-centered design approach with visitors and museum professionals, we designed and implemented Build Your Own Hercules. This tangible prototype helps groups of visitors or individuals choose a visit based on their characteristics and desires. A pilot study in the museum provided first insights about the ease of use, satisfaction and interest within visitor groups.

Self-Interfaces: Utilizing Real-Time Biofeedback in the Wild to Elicit Unconscious Behavior Change

Demo based on Work-in-Progress

  • Nava Haghighi (MIT, USA & MIT, USA)
  • Arvind Satyanarayan (MIT, USA)

Self-Interfaces are interfaces that intuitively communicate relevant subconscious physiological signals through biofeedback to give users insight into their behavior and assist them in creating behavior change. The human heartbeat is a good example of intuitive and relevant haptic biofeedback: it does not distract and is only felt when the heart beats fast. In this work, we discuss the design and development of a wearable haptic Self-Interface for Electrodermal Activity (EDA). EDA is a covert physiological signal correlated with high- and low-arousal affective states. We will evaluate the effectiveness of the EDA Self-Interface based on its intuitiveness, its ability to generate useful insight, whether this insight leads to behavior change, and whether the user can develop an intuitive awareness of their EDA over time when the device is removed. We hope the findings from this study will help us establish a series of guidelines for the development of other Self-Interfaces in the future.

Creating Mediated Touch Gestures with Vibrotactile Stimuli for Smart Phones

Demo based on Work-in-Progress

  • Qianhui Wei (Eindhoven University of Technology, Netherlands)
  • Min Li (NXP Semiconductors, Belgium & Eindhoven University of Technology, Netherlands)
  • Jun Hu (Eindhoven University of Technology, Netherlands)
  • Loe Feijs (Eindhoven University of Technology, Netherlands)

Mediated touch gestures are essential for delivering information in social networking. This study presents a method to create mediated touch gestures with vibrotactile stimuli for smart phones and explores the effectiveness of applying mediated touch gestures with vibrotactile stimuli when sending text and stickers in instant messaging applications. We developed a preliminary prototype to record vibration signals of touch gestures. The envelopes of the recorded signals are approximated by piecewise linear functions and then translated to MIDI parameters for generating vibrotactile stimuli. We applied mediated touch gestures in instant messaging applications as haptic icons. A user study showed that gesture traits and the contact time affected the sensation of haptic icons. The enhancement effect of touch gestures was influenced by the contact time.
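
A sketch of the general pipeline described above, with assumed details (frame size, segment count, and the toy signal are illustrative):

```python
# Record a vibration signal, extract its amplitude envelope, reduce it to a
# few breakpoints (a piecewise-linear approximation), and scale them to
# MIDI-style 0..127 values for playback on a vibrotactile actuator.

import numpy as np

def envelope(signal, frame=64):
    n = len(signal) - len(signal) % frame
    return np.abs(signal[:n]).reshape(-1, frame).max(axis=1)

def breakpoints(env, n_segments=4):
    idx = np.linspace(0, len(env) - 1, n_segments + 1).astype(int)
    return list(zip(idx.tolist(), env[idx].tolist()))

def to_midi_velocities(bps, peak):
    return [(i, int(round(127 * v / peak))) for i, v in bps]

sig = np.sin(np.linspace(0, 30, 2048)) * np.linspace(1, 0, 2048)  # toy "tap" recording
env = envelope(sig)
print(to_midi_velocities(breakpoints(env), peak=env.max()))
```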

Heart Waves: A Heart Rate Feedback System Using Water Sounds

Demo based on Work-in-Progress

  • Omid Ettehadi (OCAD University, Canada)
  • Lee Jones (Carleton University, Canada)
  • Kate Hartman (OCAD University, Canada)

Wearable devices today help people track and monitor biometric data such as heart rate. While the tracked data can help inform people about their health, many find that the way the feedback is provided adds unnecessary anxiety. College students, in particular, spend much of their time in a stressful environment, increasing their risk of mental health issues. To help with this issue, we present Heart Waves, an experimental ambient feedback system that tracks heart rate and uses water sounds to provide feedback in a stressful work environment. Heart Waves uses the sound of falling water to create a relaxing atmosphere that helps ease the stress users are going through. As the user’s heart rate goes up, the flow of water increases, and as their heart rate goes down, the flow of water decreases. The purpose of this project is to automate the processing of heart rate data so that users do not have to analyze the data themselves, and to create an ambient feedback system that adjusts to their heart rate.
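
A minimal sketch of the mapping described above (the heart rate range and pump values are placeholders):

```python
# Higher heart rate -> stronger water flow; lower heart rate -> gentler flow.

def flow_rate_percent(heart_rate_bpm, resting_bpm=60, elevated_bpm=110):
    level = (heart_rate_bpm - resting_bpm) / (elevated_bpm - resting_bpm)
    return round(100 * max(0.0, min(1.0, level)))   # clamped 0-100% pump duty

for hr in (55, 72, 95, 120):
    print(hr, "bpm ->", flow_rate_percent(hr), "% flow")
```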

Toning: New Experience of Sharing Music Preference with Interactive Earphone in Public Space

Poster based on Work-in-Progress

  • Inkyung Choi (Hanyang University, Korea, Republic of)
  • Dain Kim (Hanyang University, Korea, Republic of)

Users in their 20s are as curious about the feedback they receive after sharing music they like as about the act of sharing itself. Moreover, they wish to share their musical taste with various people, not only their acquaintances, in everyday spaces and while on the move. In this paper, we present an interactive music sharing service called Toning that lets users express their musical taste and share music with people in the same space through the most common wearable device: Bluetooth earbuds. Our design process started with a user survey and interviews, aiming to understand opinions about appropriate ways of expressing music preferences and to define design guidelines. Based on this, an early concept design was developed.

Haplós: Vibrotactile Somaesthetic Technology for Body Awareness

Demo based on Work-in-Progress

  • Diego S Maranan (University of the Philippines Open University, Philippines)
  • Jane Grant (University of Plymouth, UK)
  • John Matthias (University of Plymouth, UK)
  • Professor M Phillips (University of Plymouth, UK)
  • Sue Denham (Wotelsat UK, UK)

Inspired by somatic methodologies and neurophysiology, Haplós is a low-cost, wearable technology that applies vibrotactile patterns to the skin, can be incorporated into existing clothing and implements, and can be programmed and activated remotely. We review existing vibrotactile technologies and known uses of vibrotactile stimuli; describe the hardware, textile, and software components of Haplós; describe results from a quasi-experimental workshop to evaluate Haplós; and discuss future research and development directions.

The Prefix/Suffix Model: Data Extraction to Encourage Expressive Walking Movements through Sonification

Demo based on Work-in-Progress

  • Frank Feltham (RMIT University, Australia)
  • Lian Loke (University of Sydney, Australia)
  • James Curtis (RMIT University, Australia)

Interactive auditory feedback on physical movement activity can provide new insights into kinaesthetic awareness. Much existing work tends to emphasise corrective sonic feedback approaches to cyclic movements, either for enhancing or correcting faulty performance. Less explored is the application of aesthetic sonification for encouraging playful, creative expression of rhythmic actions such as walking. To aid the sonic interaction design process, some form of conceptual model of walking is required. We contribute a preliminary version of the Prefix/Suffix Extraction model, informed by previous work on gesture to sound-action chunks. By decomposing the footstep into a prefix-middle-suffix signal, we can control and explore various mappings of weight transfer through the articulation of the foot to sonic characteristics that may encourage the walker to play with their normal way of walking. A public installation of an interactive pressure-sensitive sound-generating surface acted as a proof-of-concept, with four different harmonic sound treatments resulting in noticeable variations in how members of the general public creatively engaged with their walking.
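
A minimal sketch of the decomposition idea (the fixed-fraction split below is an assumption, not the authors' extraction method):

```python
# Cut one footstep's pressure trace into prefix / middle / suffix and
# summarize each chunk so it can be mapped to a sound parameter.

import numpy as np

def prefix_middle_suffix(pressure, prefix_frac=0.25, suffix_frac=0.25):
    n = len(pressure)
    a, b = int(n * prefix_frac), int(n * (1 - suffix_frac))
    return pressure[:a], pressure[a:b], pressure[b:]

step = np.concatenate([np.linspace(0, 1, 20),   # heel strike (prefix)
                       np.full(40, 1.0),        # weight over the foot (middle)
                       np.linspace(1, 0, 20)])  # toe-off (suffix)

for name, chunk in zip(("prefix", "middle", "suffix"), prefix_middle_suffix(step)):
    print(name, "mean pressure:", round(float(chunk.mean()), 2))
```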

Circus Science – Designing Responsive Flying Trapeze Performance Costumes

Demo based on Work-in-Progress

  • Patrick Roche (University of St. Thomas, USA)
  • Collin J Goldbach (University of St. Thomas, USA)
  • Jeffrey A Jalkio (University of St. Thomas, USA)
  • Alix Putman (University of St. Thomas, USA)
  • AnnMarie Thomas (University of St. Thomas, USA)

We have partnered with local trapeze and circus arts instructors to combine various commercially available, wearable electronic components in the design and construction of a high tech, responsive costume for circus performers. Implementing microcontroller-based electronics in performance costumes is a novel approach that offers a new platform for enhancing a traditional circus performance with the rapidly expanding field of consumer microcontrollers and wearable electronics. We discuss the inspiration, implementation, and design challenges that led to our specific decisions in development, and what direction we hope to take the development in the future.

Rolling Pixels: Robotic Steinmetz Solids for Creating Physical Animations

Poster based on Work-in-Progress

  • Yujin Lee (Korea Advanced Institute of Science and Technology, Korea, Republic of)
  • Myeongseong Kim (Korea Advanced Institute of Science and Technology, Korea, Republic of)
  • Hyunjung Kim (KAIST, Korea, Republic of)

This article introduces Rolling Pixels, which are essentially robotic Steinmetz solids, for constructing frame-by-frame physical animations. As a bicylinder-shaped Rolling Pixel rolls back and forth or left and right, the shape and color of the top view of the pixel change repeatedly without using any additional shape- or color-changing techniques. Implemented using off-the-shelf products and technologies, Rolling Pixels are an easy-to-build, reproducible, and customizable kinetic design material. We describe the design and implementation of the current prototype of Rolling Pixels. We also illustrate the potential of Rolling Pixels as building blocks for physical animations through a set of simulated examples.

Incingarette: Blending Concepts and Crafting Animated Parables to Track Smoking

Demo based on Work-in-Progress

  • Kenny K. N. Chow (The Hong Kong Polytechnic University, China)

Previous work in HCI on personal informatics and behavior change suggests that representing data through intuitive metaphors and meaningful stories on glanceable displays should be considered to complement typical data visualization for daily user reflection and understanding. Informed by insights from social psychology, feedback on one’s behavior should (1) link behavioral data to positively or negatively valued outcomes; (2) show changes in the outcomes over time; and (3) include measures for pursuing different outcomes. Grounded in metaphor and blending theories from embodied cognition, we suggest metaphorically mapping less intuitive behavior-outcome links onto more direct cause-effect relations from seemingly unrelated yet familiar domains. A behavior and a comparable scenario are cognitively compressed into an “animated parable”. This paper describes the theoretical framework and design guidelines, and reports the development of a blended concept, “incingarette” (cigarette and incinerator), and its prototype. The work-in-progress informs updates to the design recommendations.

WraPr: Spool-Based Fabrication for Object Creation and Modification

Demo based on Work-in-Progress

  • Joanne Leong (MIT, USA)
  • Jose Martinez (Massachusetts Institute of Technology, USA)
  • Florian Perteneder (Tokyo Institute of Technology, Japan)
  • Ken Nakagaki (MIT Media Lab, USA)
  • Hiroshi Ishii (MIT, USA)

We propose a novel fabrication method for 3D objects based on the principle of spooling. By wrapping off-the-shelf materials such as thread, ribbon, tape or wire onto a core structure, new objects can be created and existing objects can be augmented with desired aesthetic and functional qualities. Our system, WraPr, enables gesture-based modelling and controlled thread deposition. We outline and explore the design space for this approach. Various examples are fabricated to demonstrate the possibility of attaining a range of physical and functional properties. The simplicity of the proposed method paves the way for a lightweight fabrication approach for generating new structures and customizing existing objects using soft materials.

Metaphors for Embodied Interaction: Device, Robot, and Friend

Poster based on Work-in-Progress

  • Kim Jingoog (University of North Carolina, USA)

Prediction of Impulsive Input on Gamepad Using Force-Sensitive Resistor

Demo based on Work-in-Progress

  • Atsuya Munakata (Keio University, Japan)
  • Yuta Sugiura (Keio University, Japan)

In this paper, we propose a method to predict impulsive input on a gamepad. We use a force-sensitive resistor to observe pressure on the gamepad button, and prediction is achieved by simple filtering processes. To evaluate our method, we conducted a user study in which users were encouraged to make impulsive inputs. The results showed that the system predicted an ON event of the button 30.82 ms in advance on average and an OFF event 29.30 ms in advance. Prediction accuracy was 97.87% for predicting ON events and 81.74% for predicting OFF events.
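
A sketch of the kind of simple filtering that can anticipate a button event (the thresholds and smoothing factor are illustrative, not the paper's values):

```python
# Predict an ON event while the smoothed force on the FSR is rising steeply
# but is still below the switch's actuation force.

def predict_on_events(force_samples, actuation_force=2.0, rise_threshold=0.15, alpha=0.5):
    smoothed_prev = force_samples[0]
    for i, f in enumerate(force_samples[1:], start=1):
        smoothed = alpha * f + (1 - alpha) * smoothed_prev   # exponential smoothing
        if (smoothed - smoothed_prev) > rise_threshold and smoothed < actuation_force:
            yield i                                          # predicted press
        smoothed_prev = smoothed

samples = [0.0, 0.1, 0.4, 0.9, 1.6, 2.3, 2.5]                # an impulsive press
print(list(predict_on_events(samples)))                      # indices before the button closes
```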

Towards Tangible Interaction in Scraping Therapy

Demo based on Work-in-Progress

  • Casey Walker (Brigham Young University, USA)
  • Michael D Jones (Brigham Young University, USA)
  • Steven J. Orrock (Orrock/Mendenhall Sports Medicine and Physical Therapy, USA)
  • Karen Carter (Orrock/Mendenhall Sports Medicine and Physical Therapy, USA)

This work in progress explores tangible interaction in the context of scraping tools commonly used in physical therapy. Physical therapy is a health profession which uses mechanical force or motion to improve a person’s mobility and physical motion. We began with an open-ended research-through-design process that resulted in a scraping tool that gives the physical therapist (PT) feedback on applied force using embedded pressure sensors and an ambient light display. At the conference, we will display a prototype interactive scraping tool and several other design artifacts. We invite participants to use the tool on a human calf simulant made from ballistics gel. We hope our work-in-progress will encourage new work exploring tangible interaction and physical therapy.

TMOVE: Multimodal Feedback Actuator for Non-visual Exploration of Virtual Lines

Demo based on Work-in-Progress

  • Pranjal Protim Borah (Indian Institute of Technology (IIT) Guwahati, India)
  • Ayaskant Panigrahi (Indian Institute of Technology (IIT) Guwahati, India & PDPM Indian Institute of Information Technology, Design and Manufacturing Jabalpur, India)
  • Keyur Sorathia (Indian Institute of Technology (IIT) Guwahati, India)

Students with visual impairment and blindness learn 3D shapes using physical models, which pose portability issues and high associated costs. Although tactile and kinesthetic feedback systems have been proposed for non-visual exploration of 3D virtual objects, such systems tend to suffer from similar shortcomings. In this research, we propose TMOVE, a low-cost handheld tactile feedback actuator to provide tactile, vibrotactile, and combined feedback for the exploration of virtual space. We conducted a preliminary study with 10 blindfolded sighted participants to compare the perceived 3D experience and the effectiveness of using the feedback modalities in the exploration of two coplanar virtual line segments. We found that all the feedback modalities are equally effective in the exploration of virtual space, and multimodal feedback offers an enhanced 3D perception of virtual objects. We believe the findings presented in this paper will be helpful for designers and researchers in developing low-cost tactile feedback systems.

Level 3 Homebase Studio, Wilkinson Building, University of Sydney campus.

Drinks and snacks.

Foyer outside Level 3 Homebase Studio, Wilkinson Building, University of Sydney campus.

Continuation of early afternoon session.

Level 3 Homebase Studio, Wilkinson Building, University of Sydney campus.

Exhibition poster art: Never odd or eveN

Expect loads of fun with a special TEI cocktail, performances, DJ and fabulous food. Dress to impress with the TEI2020 conference logo colours of red, orange, pink and yellow!

About the Art Exhibition

Curation by Karen Cochrane, Thecla Schiphorst and Deborah Turnbull Tillman.

This year’s Art Exhibition for Tangible, Embedded and Embodied Interactions is inspired by a hopeful co-mingling between art and technology with the site being our future bodies, our future selves. The idea of a palindrome, a phrase that is the same spelled forwards and backwards, nods to the idea of meeting in the middle. Transhumanism, cyborgs, engineering experiments, and designed interactive engagements have existed in past science fiction, film and literature, often pointing to a dystopic future. What we aim to present here is a collection of works by experimental artists and performers who congregate in the present but draw on ideas of the future through aesthetics of the past.

What emerges is a landscape encoded with two-way messages and filled with two-way journeys. Visitors to the gallery can lose themselves in an infinity mirror, read chalk messages from a robot, transmit portraits of themselves, travel into paintings through virtual reality, transport their mindsets with scent, a pulsating heart or projected aura, rock in tandem with a hidden collaborator and dance with distributed and autonomous robotic roller skates.

Join us in this futuristic blast from the past, where the possibilities are endless and optimistic, but not quite polished. It is a future that is Never odd or eveN.

We have the exhibition catalogue available for download.

See photos of the opening event on the Arts Track page.

Level 2 Tin Sheds Gallery and Hearth, University of Sydney.


Wednesday 12 February 2020

All activities on Wednesday take place at UTS Building 11.

Session chair: Martin Kaltenbrunner (UFG Linz, Austria)

Be active! Participatory Design of Accessible Movement-Based Games

Full paper, 10 min talk; also presented as a Demo

  • Georg Regal (AIT Austrian Institute of Technology GmbH, Austria)
  • David Sellitsch (AIT Austrian Institute of Technology, Austria)
  • Simone Kriglstein (AIT Austrian Institute of Technology, Austria & University of Vienna, Austria)
  • Simon Kollienz (AIT Austrian Institute of Technology, Austria)
  • Manfred Tscheligi (University of Salzburg & AIT, Austria)

Regular physical exercise is an essential factor for preventing chronic diseases. Activities to support physical education in schools have been increasingly used in recent years as a way to get young people interested in sports. However, for visually impaired students it is difficult to participate in the traditional team sports that are widely played in physical education. To overcome this issue, we developed a design toolkit consisting of building blocks that enable visually impaired students to create and play their own movement-based games. To investigate different types of building blocks and their potential to create accessible movement-based games, we conducted two game design workshops with visually impaired students. The results show that our building blocks can successfully be used by visually impaired students to empower them to become creators of movement-based games that are both accessible and engaging. By making our design process transparent, we further provide insights on how to implement a co-creation process in a school for visually impaired students.

Flying LEGO Bricks: Observations of Children Constructing and Playing with Programmable Matter

Full paper, 20 min talk

  • Calvin Rubens (Queen’s University, Canada)
  • Sean Braley (Queen’s University, Canada)
  • Julie Torpegaard (Aalborg University, Denmark)
  • Nicklas Lind (Aalborg University, Denmark)
  • Timothy Merritt (Aalborg University, Denmark)
  • Roel Vertegaal (Queen’s University, Canada)

In this paper, we present a case study that explores how children could learn to interact with programmable matter. Flying drone swarms enable physical visualizations of complex data and simulation of physical objects and processes, e.g., planetary movements. The swarms can be digitally controlled as an ensemble, as a form of (sparse) “programmable matter”. We worked with the toy company LEGO® to design and evaluate a “build and fly” experience with 240 children in a public exhibition. The children decorated a bendable handheld controller with LEGO® bricks and then used this controller to animate the flight of a 10-drone swarm. Results indicate that children enjoyed the constructive play and performance aspects of the system. Four main patterns of player behavior emerged, which we discuss in relation to possible improvements to the system. We provide implications for the design of programmable matter systems for supporting child play experiences.

Towards Designing Bodily Integrated Play

Full paper, 20 min talk

  • Florian Floyd Mueller (RMIT, Australia)
  • Jonathan Marquez (RMIT, Australia)
  • Tuomas Kari (RMIT, Australia)
  • Yan Wang (RMIT, Australia)
  • Yash Dhanpal Mehta (RMIT, Australia)
  • Zhuying Li (RMIT, Australia)
  • Rakesh Patibanda (RMIT, Australia)
  • Josh Andres (RMIT, Australia)

There is an increasing trend in utilizing interactive technology for bodily integrations, such as additional limbs and ingestibles. Prior work on bodily integrated systems has mostly examined them from a productivity perspective. In this article, we suggest examining this trend also from an experiential, playful perspective, as we believe that these systems offer novel opportunities to engage the human body through play. Hence, we propose that there is an opportunity to design “bodily integrated play”. By relating to our own and others’ work, we present an initial set of design strategies for bodily integrated play, aiming to inform designers on how they can engage with such systems to facilitate playful experiences, so that ultimately, people will profit from bodily play’s many physical and mental wellbeing benefits even in a future where machine and human converge.

Tangible Play and Children with ASD in Low-Resource Countries: A Case Study

Full paper, 10 min talk

  • Amani I Soysa (Swinburne University of Technology, Australia)
  • Abdullah Al Mahmud (Swinburne University of Technology, Australia)

Tangible User Interfaces (TUIs) can bridge real-world physical objects with the digital world, which is beneficial for children with ASD. However, at present, most TUIs have been developed for children in affluent countries. Hence, this paper presents the evaluation of a TUI designed to support children with ASD in a low-resource country. The main objective of this study is to explore the initial usability of the proposed prototype among children with ASD and identify potential improvements to enhance the usability of the intervention. The preliminary evaluations were conducted with 20 Sri Lankan children with ASD and their special education teachers. This study identified four lessons for designing TUIs: including audio prompts to enhance tangible interactions, designing the structure of the tangibles so that fingers do not touch the iPad, adding appropriate helper cues in graphical interchange format, and avoiding multiple tangibles with similar properties. The findings of this study lead to several design guidelines for developing affordable TUIs for children with ASD.

Social Movements: A Case Study in Dramaturgically-Driven Sound Design for Contemporary Dance Performance to Mediate Human-Human Interaction

Full paper, 10 min talk

  • Ryan K Ingebritsen (University of Illinois at Urbana-Champaign, USA)
  • Christopher B Knowlton (Rush University Medical Center, USA)
  • Hugh Sato (Northbrook Public Library, USA)
  • Erica Mott (School of the Art Institute of Chicago, USA)

We present the iterative design and final implementation of a real-time full-body control system of the sound score by performers in an immersive contemporary dance performance. Digital musical instruments for dance require different considerations than for music, particularly in contemporary, non-proscenium and participatory audience contexts. Arising from dramaturgical research around social movement and civic participation in the digital age, this case study presents a fusion of design choices that consider artistic themes, performer interaction and audience experience. Three generations of sound and technical design are presented, with intermittent performances, challenges and learnings at each stage. We believe that the collaborative evolution of compositional choices, computer vision techniques, sound mapping, performer interaction and staging reveals important insights into developing interaction systems for kinesthetic empathy. Rather than placing the focus on human-computer interaction, our final production employed technology as a bridge to encourage human-human interaction around themes of occupation, resistance and resilience.

Towards Experiencing Eating as Play

Full paper, 20 min talk

  • Florian Floyd Mueller (RMIT, Australia)
  • Tuomas Kari (RMIT, Australia)
  • Peter Arnold (RMIT, Australia)
  • Rohit Ashok Khot (RMIT, Australia)
  • Yan Wang (RMIT, Australia)
  • Yash Dhanpal Mehta (RMIT, Australia)
  • Zhuying Li (RMIT, Australia)
  • Jonathan Marquez (RMIT, Australia)

There is an increasing trend in HCI to combine eating and technology. We highlight the potential of interactive technology to support an experiential perspective on eating, in particular, how interactive technology can support experiencing eating as play. To understand this, we reflect on four playful interactive eating systems we designed and two other works to articulate five strategies: make eating challenging, break cultural norms, design across eating stages, reduce eating autonomy, and playfully extend the social aspect. For each, we also include practical implementation options to provide designers with initial guidance on how they can begin to support experiencing eating as play. Ultimately, with our work, we aim to facilitate a future where eating is more playful.

Level 0 Room 405

Drinks and snacks

Level 0 Foyer

Session chair: Rohit Ashok Khot (RMIT, Australia)

MuscleSense: Exploring Weight Sensing using Wearable Surface Electromyography (sEMG)

Full paper, 10 min talk

  • Chin Guan Lim (National Taiwan University, Taiwan & National Taiwan University, Taiwan)
  • Chin Yi Tsai (National Taiwan University, Taiwan & National Taiwan University, Taiwan)
  • Mike Y. Chen (National Taiwan University, Taiwan & National Taiwan University, Taiwan)

Strength training improves overall health, well-being, physical appearance, and sports performance. There are four major factors that affect training efficacy in a training session: exercise type, number of repetitions, movement velocity, and workload. Prior research has used wearable sensors to detect exercise type, number of repetitions, and movement velocity while training. However, detecting workload remains constrained to instrumented exercise equipment, such as smart exercise machines or RFID-tagged free weights. This paper presents MuscleSense, an approach that estimates exercise workload by using wearable Surface Electromyography (sEMG) sensors and regression analysis. We evaluated the accuracy of several regression models and the effects of sensor placement through a 20-person user study. Results showed that MuscleSense achieved an accuracy of 0.68 kg (root mean square error, RMSE) in sensing workload using both forearm and arm sensors and support vector regression (SVR).
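
An illustrative sketch of this kind of workload regression on synthetic data (not the MuscleSense dataset; channel count, features, and hyperparameters are assumptions):

```python
# Extract a simple RMS feature per sEMG channel and fit a support vector
# regressor to estimate workload in kilograms.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def rms_features(emg_window):                    # emg_window: (channels, samples)
    return np.sqrt((emg_window ** 2).mean(axis=1))

# Synthetic training set: sEMG amplitude grows with the lifted weight.
weights_kg = rng.uniform(2, 20, size=200)
windows = rng.normal(0, weights_kg[:, None, None] * 0.01, size=(200, 4, 256))
X = np.array([rms_features(w) for w in windows])

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, weights_kg)
print("predicted:", model.predict(X[:3]).round(2), "actual:", weights_kg[:3].round(2))
```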

AromaCue – A Scent Toolkit To Cope with Stress using the 4-7-8 Breathing Method

Full paper, 10 min talk

  • Zilan Lin (Keio University, Japan)
  • Kai Kunze (Keio University, Japan)
  • Atsuro Ueki (Keio University, Japan)
  • Masa Inakage (Keio University, Japan)

In this paper, we present AromaCue, an initial design for a scent-based toolkit to cope with stressful situations using scent conditioning. The AromaCue toolkit consists of two parts: a breath training device using multiple stimuli and a wearable scent-emitting device (with a stress ball as an activator). Scent can trigger emotional memories. In the initial design, we utilized the properties of various scents as retrieval cues for conscious breathing. In a comparative experiment, eight participants showed a significant heart rate decrease several minutes after a stressor (the Stroop test) when a scent cue was present, but not without it. A one-week user study revealed significant improvement in Depression Anxiety and Stress Scale (DASS-21) scores. Therefore, AromaCue can help users cope with stress.

ForceStamps: Fiducial Markers for Pressure-sensitive Touch Surfaces to Support Rapid Prototyping of Physical Control Interfaces

Full paper, 10 min talk; also presented as a Demo

  • Changyo Han (University of Tokyo, Japan)
  • Katsufumi Matsui (University of Tokyo, Japan)
  • Takeshi Naemura (University of Tokyo, Japan)

We present ForceStamps, fiducial markers for supporting rapid prototyping of physical control interfaces on pressure-sensitive touch surfaces. We investigate marker design options for supporting various physical controls, focusing on creating dedicated footprints and maintaining structural stability. ForceStamps can be persistently tracked on surfaces along with force information and other attributes. Designers without knowledge of electronics can rapidly prototype physical controls by attaching mechanisms to ForceStamps, while manipulating the haptic feedback with buffer materials. The created control widgets can be spatially configured on the touch surface to make an interface layout. We showcase a variety of example controls created with ForceStamps. In addition, we report on our analysis of a two-day musical instrument design workshop to explore the affordances of ForceStamps for making novel instruments with diverse interaction designs.

SoftMod: A Soft Modular Plug-and-Play Kit for Prototyping Electronic Systems

Full paper, 10 min talk; also presented as a Demo

  • Mannu Lambrichts (Expertise Centre for Digital Media, Belgium)
  • Jose Maria Tijerina Munoz (Expertise Centre for Digital Media, Belgium)
  • Raf Ramakers (Hasselt University, Belgium)

We present SoftMod, a novel modular electronics kit consisting of soft and flexible modules that snap together. Unlike existing modular kits, SoftMod tracks the topology of interconnected modules and supports basic plug-and-play behavior as well as advanced user-specified behavior. As such, the shape of a SoftMod assembly does not depend on the desired behavior, and various 2D and 3D electronic systems can be realized. While the plug-and-play nature of our modules stimulates play, the advanced features for specifying behavior and for making a variety of soft and flexible shapes offer a high ceiling when experimenting with novel types of interfaces, such as wearables and interactive skin and textiles.
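
The abstract does not describe how topology tracking is implemented; as a loose illustration of the general idea (a host representing snapped-together modules as a graph and deriving a default plug-and-play behavior from it), the following minimal Python sketch uses hypothetical module names and wiring rules and does not reflect SoftMod's actual protocol.

    # Illustrative sketch only: a graph of snapped-together modules, with a
    # default rule such as "a button drives every LED it can reach".
    from collections import defaultdict

    class Assembly:
        def __init__(self):
            self.modules = {}              # module id -> module type ("button", "led", ...)
            self.links = defaultdict(set)  # adjacency list of snapped-together modules

        def connect(self, a, a_type, b, b_type):
            """Record that module a and module b have been snapped together."""
            self.modules[a], self.modules[b] = a_type, b_type
            self.links[a].add(b)
            self.links[b].add(a)

        def reachable(self, start):
            """All modules reachable from `start` through snapped connections."""
            seen, stack = set(), [start]
            while stack:
                node = stack.pop()
                if node not in seen:
                    seen.add(node)
                    stack.extend(self.links[node] - seen)
            return seen

        def plug_and_play_targets(self, source_id, target_type):
            """Default behavior: a source module drives every reachable module of a type."""
            return [m for m in self.reachable(source_id)
                    if m != source_id and self.modules[m] == target_type]

    # Usage: a button snapped to a battery, which is snapped to two LEDs.
    asm = Assembly()
    asm.connect("btn1", "button", "bat1", "battery")
    asm.connect("bat1", "battery", "led1", "led")
    asm.connect("bat1", "battery", "led2", "led")
    print(asm.plug_and_play_targets("btn1", "led"))  # ['led1', 'led2'] (order may vary)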

Embedding Conversational Agents into AR: Invisible or with a Realistic Human Body?

Full paper, 10 min talk; also presented as a Demo

  • Jens Reinhardt (HAW Hamburg, Germany)
  • Luca Hillen (HAW Hamburg, Germany)
  • Katrin Wolf (HAW Hamburg, Germany)

Currently, (invisible) smart speech assistants, such as Siri, Alexa, and Cortana, are used by a constantly growing number of people. Moreover, Augmented Reality (AR) glasses are predicted to become widespread consumer devices in the future. Hence, smart assistants can easily become common applications of AR glasses, which allows for giving the assistant a visual representation as an embodied agent. While previous research on embodied agents found a user preference for a humanoid appearance, research on the uncanny valley suggests that simply designed humanoids can be favored over hyper-realistic humanoid characters. In a user study, we compared agents with a simple versus a more realistic appearance (seen through AR glasses) against an invisible state-of-the-art speech assistant. Our results indicate that a more realistic visualization is preferred, as it provides additional communication cues, such as eye contact and gaze, which seem to be key features when talking to a smart assistant. But if the situation requires visual attention, e.g., when being mobile or in a multitasking situation, an invisible agent can be more appropriate, as it does not distract the visual focus, which can be essential during AR experiences.

EGuide: Investigating different Visual Appearances and Guidance Techniques for Egocentric Guidance Visualizations

Full paper, 20 min talk

  • Maximilian Dürr (University of Konstanz, Germany)
  • Rebecca Weber (University of Konstanz, Germany)
  • Ulrike Pfeil (University of Konstanz, Germany)
  • Harald Reiterer (University of Konstanz, Germany)

Mid-air arm movements are important for various activities. However, common resources for their self-directed practice require practitioners to divide their focus between an external source (e.g., a video screen) and their own movement. Past research found benefits of egocentric guidance visualizations compared to common resources, but there is limited evidence about how such visualizations should look and behave. EGuide supports the investigation of different egocentric visualizations for the guidance of mid-air arm movements. We compared two visual appearances for egocentric guidance visualizations that differ in their shape (look), and three guidance techniques that differ in how they guide a user (behavior). For visualizations with a continuously moving guidance technique, our results suggest higher movement accuracy for a realistic shape than for an abstract one. For user experience and preference, our results favor visualizations with an abstract shape and a guidance technique that highlights important postures without pausing at them.

CoDa: Collaborative Data Interpretation Through an Interactive Tangible Scatterplot

Full paper, 20 min talk

  • Annemiek Veldhuis (Eindhoven University of Technology, Netherlands)
  • Rong-Hao Liang (Eindhoven University of Technology, Netherlands)
  • Tilde Bekker (Eindhoven University of Technology, Netherlands)

Tangibles can model abstract structures. One educational subject where this can be utilized is instruction on data visualization interpretation. Data physicalizations, tangible representations of data, offer graspable handles for users to manipulate data visualizations directly so that they can better understand what information they hold. However, investigations of the applicability of interactive data physicalizations in educational settings are still sparse. In this paper, we explore how students reason with an interactive tangible scatterplot through a collaborative data interpretation tool, CoDa. We report the design, development, and user experiences of an exploratory study in which 11 students, in groups of 2 to 4, completed a data analysis task with CoDa. The qualitative results provide insights into the process of data interpretation, how interaction with the tangibles influenced these interpretations, how the system aided collaboration, and the overall user experience. We believe the results and implications offer a step towards nurturing future educational applications of interactive data physicalizations.

Level 0 Room 405

Meal and drinks

Level 0 Foyer

Also, ACM SIGCHI volunteering information session (in Level 0 Room 405).

Session chair: Katrin Wolf (Hamburg University of Applied Sciences, Germany)

Erfahrung & Erlebnis: Understanding the Bodily Play Experience through German Lexicon

Full paper, 20 min talk

  • Florian Floyd Mueller (RMIT, Australia)
  • Zhuying Li (RMIT, Australia)
  • Rohit Ashok Khot (RMIT, Australia)
  • Rakesh Patibanda (RMIT, Australia)
  • Sebastiaan Pijnappel (RMIT, Australia)
  • Josh Andres (RMIT, Australia)
  • Louise Petersen Matjeka (RMIT, Australia)
  • Bob Jarvis (RMIT, Australia)
  • Jonathan Marquez (RMIT, Australia)
  • Yan Wang (RMIT, Australia)

Bodily play systems are becoming increasingly prevalent, with research aiming to understand the associated player experience. We argue that a more nuanced lexicon describing bodily play experience can be beneficial to drive the field forward. We provide game designers with two German words to communicate two different aspects of experience: “Erfahrung”, referring to experience in which one is actively engaged and from which one gains knowledge; and “Erlebnis”, referring to a tacit experience often translated as “lived experience”. We use these words to articulate a suite of design strategies for bodily play experiences by referring to past design work. We conclude by discussing these two aspects of experience in conjunction with two previously established perspectives on the human body. We believe this more nuanced lexicon can provide designers with a clearer understanding of bodily play, allowing them to guide players in gaining the many benefits of such experiences.

Designing Ritual Artifacts for Technology-Mediated Relationship Transitions

Full paper, 20 min talk

  • Sara Klüber (University of Würzburg, Germany)
  • Diana Löffler (University of Siegen, Germany)
  • Marc Hassenzahl (University of Siegen, Germany)
  • Ilona Nord (University of Würzburg, Germany)
  • Jörn Hurtienne (University of Würzburg, Germany)

Rituals are ubiquitous but not commonplace, help people to make sense of their life, and cultivate personal or social meaning. Although secularization and digitalization impact the occurrence of formal rituals, the need for marking life’s transitions remains unchanged. New rituals emerge, such as marking relationship status by hanging love locks on bridges. Tangible technologies hold great potential for augmenting, changing, or enhancing ritual practices which often involve enactments and symbolic props. In this paper, we analyze individual stories of hanging love locks and derive six pointers for designing technology-mediated relationship transition rituals. We applied the pointers in the design of El Corazón, a tangible artifact for relationship transition rituals. The results of an evaluation with 20 sweethearts show that relationship rituals can be designed deliberately, that tangibles can shape ritual experiences and that technology-mediated rituals can provide people with new means of coping with relationship uncertainty.

Designing With Ritual Interaction: A Novel Approach to Compassion Cultivation Through a Buddhist-Inspired Interactive Artwork

Full paper, 20 min talk

  • Kristina Mah (University of Sydney, Australia)
  • Lian Loke (University of Sydney, Australia)
  • Luke Hespanhol (University of Sydney, Australia)

Interactive public interfaces are opportunities for designers to affect how people relate to one another. We believe that traditional ritual can inspire a novel approach to the design of digital experience in public space. Ritual has been shown to support social cohesion, and we argue that it can be used as a design strategy to encourage the cultivation of qualities like compassion in public space. We created our own interactive artwork, Wish Happiness, inspired by methods of compassion cultivation from Tibetan Mahayana Buddhism. Through a concept-driven design research approach, we outline the set of design strategies we employed to translate key principles of Buddhist ritual and practice into the secular setting of a festival. We observed that compassion-like qualities, a positive state of mind and a sense of social harmony, were produced through interaction with the system, providing encouragement for future research into ritual interaction for compassion cultivation.

Transient Relics: Temporal Tangents to an Ancient Virtual Pilgrimage

Full paper, 10 min talk; also presented as a Demo

  • Angelo Fraietta (UNSW, Australia)

This paper examines the creation of a temporary virtual relic through an interactive soundscape in the context of a religious pilgrimage known as the Stations of the Cross. The paper examines the history of the rite and its transformation from a physical pilgrimage to a virtual one. It examines the phenomenon of iconic relics, which in some cases have a reckoned value equivalent to that of the physical objects they represent. It also examines both the conceptual and legal implications of embodying sound into tangible objects, resulting in their treatment as protected relics. Finally, it describes the creation of an artwork in which religious pilgrims manipulate interactive sonic balls that communicate with other networked sonic devices, in an attempt to correlate metaphors of human behaviours (such as play, humiliation, and mobs) into a sonic relic of the historical narrative of Christ taunted by Roman soldiers.

Tangible Interfaces and Interactions in Sci-Fi Movies: A Glimpse at the Possible Future of TUIs through Fictional Tangible Systems

Full paper, 10 min talk

  • Victor Cheung (Simon Fraser University, Canada)
  • Alissa N. Antle (Simon Fraser University, Canada)

Science-Fiction (Sci-Fi) movies have long been a frontier for showcasing futuristic computer interfaces and their associated interactions. Unconstrained by technological limitations, they are free to depict the most imaginative systems, including augmenting objects with attributes that are not yet possible in reality. We present a case study of Sci-Fi movies in which tangible objects are part of these systems, and examine how they illustrate concepts of Tangible User Interfaces (TUIs). We provide three examples of tangible systems and one that deviates considerably (a holographic system), and analyze them using a well-established interaction model (MCRpd). We found that TUIs in movies exhibit various levels of the model’s characteristics and demonstrate an inclusive and diverse context by combining interaction modalities and catering to audience needs. We argue that these aspects provide valuable lessons and implications for designing future TUIs, and we hope to broaden the design space by initiating discussions on the fascinating worlds in Sci-Fi movies.

Level 0 Room 405

Drinks and snacks

Level 0 Foyer

Country Centred Design

Australian Aboriginal peoples are the oldest living and longest continuing culture, within the driest continent on Earth. Ancient technology innovation developed by Aboriginal peoples, such as the boomerang, fish traps, spinifex resin, message sticks, fibre work, spears and woomeras, watercraft, and stone tools, reveals underlying technology design and development methodologies that reflect a unified approach and value system. Aboriginal social cohesion, well-being, environmental sustainability, culture, and spirituality underpin the foundation of such innovation and have been developed through systems of Indigenous governance, commonly understood as Lore (otherwise known as The Dreaming or Jukupurra, and akin to Law). Culture has created the framework for this society and, in turn, the myriad science and technology developments over millennia. It is now a critical time to reflect and to initiate a new wave of technologies designed and developed through a Code of Ethics that embodies the principles of social and environmental sustainability: Caring for Country, Caring for Kin. This future of ethical technology would reflect the systemic change required to address a fractured political, economic, and social system by adopting Australian Aboriginal peoples' Lore. Herein, we discuss why and how the development of a new Code of Ethics for technology development can be informed by Indigenous design principles and governance.

Angie Abdilla is the CEO of Old Ways, New, a Sydney-based consultancy that delivers service design and digital product development. The organisation taps into Indigenous Knowledge Systems, drawing upon tens of thousands of years of culture, research, iterative design, and technological innovation.

Level 0 Room 405