Program Overview

Sun, 18 March

All studios, as well as lunch/coffee breaks and registration on Sunday, will take place at Open Lab (Valhallavägen 79, 114 27 Stockholm).
The Graduate Student Consortium will take place in the KTH CSC Library (Lindstedtsvägen 3). Following registration at Open Lab, a student volunteer will take GSC participants to the venue. The Arts Track Performance will take place at KTH main campus in KTH R1 Experimental Performance Space (Drottning Kristinas väg 51) – the Reactor Hall. See the Venue page or the TEI 2018 Google Map for additional information.

  • 8.00 – Studio & GSC Registration

    Time: 8.00 – 9.00, Location: Open Lab

  • 9.00 – Studios

      Time: 9.00 – 18.00, Location: Open Lab

    • S1: Sociomateriality: Infrastructuring and Appropriation of Artifacts

      Tom Jenkins
      Vasiliki Tsaknaki
      Karey Helms
      Ludvig Elblaus
      Nicolai B. Hansen

      Novel materials and innovative applications can sometimes outweigh a reflective perspective on the roles that objects and materials can play in social life. In this Studio, we want to bring together researchers and practitioners who are interested in exploring design outcomes from a sociomaterial perspective. By having prototypes at the center of the Studio activities, we intend to create prompted speculative fictions that link the material outcomes of design practice to social agency and cultural effects. This studio will offer an opportunity to examine how objects might participate in social spheres as well as act as material bridges to their design process. We will do this through both hands-on examination of design objects, and inquiry into the infrastructuring and appropriation of these artifacts. The themes that will be examined are agency, material participation, and cultural performance of things. We encourage participants to bring their own prototypes.

    • S2: Designing eTextiles for the Body: Shape, Volume & Motion

      Rachel Freire
      Paul Strohmeier
      Cedric Honnet
      Jarrod Knibbe
      Sophia Brueckner

      Our clothing is not flat, but rather conforms and adapts to our bodies. In this hands-on workshop, participants will experiment with and create 3D eTextile garments, while discussing the rich history, current state and possible future directions of wearables.
      Through garment construction and rapid prototyping, we will explore how to integrate eTextiles into volumetric, tailored garments that better conform to the shape of the human body, and better respond to its movements. We will show examples of connectors and sensors, and discuss the affordances and limitations of various textiles.
      A short masterclass will introduce a range of techniques for garment design and construction, ensuring the workshop is suitable for all skill levels. We will include a brief history of wearables and eTextiles, and an overview of recent innovations within HCI and fashion. We encourage people to bring existing projects and ideas, as well as their own materials and preferred microcontrollers.

    • S3: Deformable Controllers: Fabrication and Design to Promote Novel Hand Gestural Interaction Mechanisms

      Victor Cheung
      Alexander K. Eady
      Audrey Girouard

      Build a unique deformable controller that lets you bend, twist, and even stretch! In this studio you will learn to use deformation sensors as an organic and tangible means of human-computer interaction, and you will gain hands-on experience in using these sensors to build deformable controllers. The goal of this studio is to provide a platform to share resources and transfer knowledge of deformable sensors/materials, to foster creativity in using deformable devices as input controllers that harness the wide range of gestures that deformation enables, and to move beyond recreating touch/click inputs in deformable device interactions.

    • S4: Kinetic Body Extensions for Social Interactions

      Kate Hartman
      Boris Kourtoukov
      Erin Lewis

      How can we create wearable devices to amplify, extend, or subvert our existing body language? This studio focuses on in-person, “real life” social interactions. It invites participants to explore ways of physically extending their own expressivity via wearable electronics and structural textile design using a prototyping method developed by OCAD University’s Social Body Lab. Through the use of sensors, servo motors, and pleated and folded textile forms, participants will learn how to create kinetic wearable body extensions that expand and contract in response to intuitive body movements. We will experiment with kinetic structures that vary in size, form, complexity, and placement on the body and experiment with triggers including pressure, flexion, light, and muscle activity. Ranging from the subtle to the absurd, we will prototype new methods for extending and enhancing our physical social interactions.

    • S5: Internet of Tangibles: Exploring the Interaction-Attention Continuum

      Leonardo Angelini
      Elena Mugellini
      Omar Abou Khaled
      Nadine Couture
      Elise van den Hoven
      Saskia Bakker

      There is increasing interest in the HCI research community in designing richer user interactions with the Internet of Things (IoT). This studio will explore the design of tangible interaction with the IoT, what we call the Internet of Tangibles. In particular, we aim to investigate the full interaction-attention continuum, with the purpose of designing tangible IoT interfaces that can switch between peripheral interactions that do not disrupt everyday routines and focused interactions that support the user’s reflection. This investigation will be conducted through hands-on activities in which participants prototype tangible IoT objects, starting with a paper prototyping phase supported by design cards, followed by an Arduino prototyping phase. We encourage participation from everyone interested in designing interactions for the Internet of Things, from both academia and industry and regardless of background. The studio will also be an opportunity for networking with other researchers working in the field.

    • S6: Tracking Well-Being – Design Explorations Through Object Theatre

      Andreas Heiberg Skouby
      Merja Ryöppy
      Robb Mitchell

      Explore and get acquainted with subjective well-being through Object Theatre For Design. Tracking well-being is an expanding field of research where the challenge is not only the technological readiness to record habits of e.g. sleep, activity and diet, but to go beyond numbers and capture users’ subjective well-being. In this workshop we welcome participants from a range of disciplines to engage hands-on with design and theatre, to capture user perceptions of well-being in tangible form. We introduce Object Theatre For Design to explore different techniques for relating to physical artifacts. Participants will explore how to become a medium for object interactions and express object functionalities by developing a new understanding of the artifact. We will bring tracking devices to capture activity data, and a set of carefully selected everyday objects and building materials to support the exploratory design process.

  • 9.00 – Graduate Student Consortium

    Time: 9.00 – 17.00, Location: KTH CSC Library

    Ellen Yi-Luen Do, Stephen Brewster, Ian Oakley

  • 18.00 & 21.00 – Arts Track Performance

    Time: 18.00 (first) & 21.00 (second), Location: Reactor Hall

    • Op 1254: Music for Neutrons, Networks and Solenoids using a Restored Organ in a Nuclear Reactor

      Leif Handberg, KTH Royal Institute of Technology
      Ludvig Elblaus, KTH Royal Institute of Technology
      Chris Chafe, Stanford University

      In this paper, an installation is presented that connects Stanford and Stockholm through a one-of-a-kind combination of instrument and venue: the Skandia Wurlitzer theatre organ (Wurlitzer serial no.1254) situated in the KTH R1 Experimental Performance Space, a disused nuclear reactor. A continuous stream of musical data, audio, and video between the two places explored the capabilities of the digital to play with the concept of presence and embodiment, virtuality and the physical. In the installation, a series of performances presented new pieces written especially for this setting. The pieces were performed by musicians in Stanford, mediated in real-time, allowing them to play together with the theatre organ in Stockholm, temporarily fusing the two venues to create one ensemble, one audience, in one space.

Mon, 19 March

All Monday program events will take place at Nymble at KTH Royal Institute of Technology.
For a map of individual rooms and accessibility access, please visit the Venue page. For directions, please visit the TEI 2018 Google Map.

  • 8.30 – Registration

    Time: 8.30 – 9.30, Location: Puben

  • 9.30 – Conference Opening

    Time: 9.30 – 10.30, Location: Main Hall

  • 10.00 – The Materiality of Interaction: Mikael Wiberg

    Time: 10.00 – 10.30, Location: Main Hall

    Mikael Wiberg

    A new approach to interaction design that moves beyond representation and metaphor to focus on the material manifestations of interaction. Smart watches, smart cars, the Internet of things, 3D printing: all signal a trend toward combining digital and analog materials in design. Interaction with these new hybrid forms is increasingly mediated through physical materials, and therefore interaction design is increasingly a material concern. In this book, Mikael Wiberg describes the shift in interaction design toward material interactions. He argues that the “material turn” in human-computer interaction has moved beyond a representation-driven paradigm, and he proposes “material-centered interaction design” as a new approach to interaction design and its materials. He calls for interaction design to abandon its narrow focus on what the computer can do and embrace a broader view of interaction design as a practice of imagining and designing interaction through material manifestations. A material-centered approach to interaction design enables a fundamental design method for working across digital, physical, and even immaterial materials in interaction design projects. Wiberg looks at the history of material configurations in computing and traces the shift from metaphors in the design of graphical user interfaces to materiality in tangible user interfaces. He examines interaction through a material lens; suggests a new method and foundation for interaction design that accepts the digital as a design material and focuses on interaction itself as the form being designed; considers design across substrates; introduces the idea of “interactive compositions”; and argues that the focus on materiality transcends any distinction between the physical and digital.

  • 10.30 – Coffee & Studio Posters

    Time: 10.30 – 11.00, Location: Puben

  • 11.00 – Session 1: Shape Changing and Moving Interfaces

      Time: 11.00 – 12.00, Location: Main Hall

      Tanja Döring

    • Working with an Autonomous Interface: Exploring the Output Space of an Interactive Desktop Lamp

      Diana Nowacka, UCL
      Katrin Wolf, Hamburg University of Applied Sciences
      Enrico Costanza, UCL Interaction Centre
      David Kirk, Northumbria University

      Increasing sophistication and ubiquity of digital devices is creating potential for the development of new kinds of actuated interfaces. In this paper, we explore the design space around movement as a form of gestural communication for information output, in simple actuated desktop devices. We were curious as to how people might envision interacting with autonomous technology in the office. Accordingly, we focused our attentions on one prevalent desktop object, an interactive lamp, with three actuated joints, which allowed us to explore the interaction space of such devices. We invited 13 participants to design and enact movements with the lamp to communicate 20 simple messages. We explored a subset of these generated gestures, using the lamp as a personal cueing device in an office setting with 14 new participants. We present our qualitative findings from both studies that let users imagine the usage of an interactive desktop lamp through actuation.

    • Morphology Extension Kit: A Modular Robotic Platform for Physically Reconfigurable Wearables

      Sang-won Leigh, MIT
      Timothy Denton, MIT Media Lab
      Kush Parekh, Rhode Island School of Design
      William S Peebles, MIT Media Lab
      Magnus Johnson, MIT Media Lab
      Pattie Maes, MIT Media Lab

      Various forms of wearable robotics challenge the notion of the human body, in that the robots render the acquired capabilities in physical forms. However, the majority of such systems are designed for specific purposes, whereas rapidly changing environments pose a diverse set of problems that are difficult to solve with a single interface. To address this, we propose a modular hardware platform that allows its users or designers to build and customize wearable robots. The process of building an augmentation is simply to connect actuator and sensor blocks and attach them to the body. The current list of designed components includes servomotor modules and sensor modules, which can be programmed to incorporate additional electronics for desired sensing capabilities. Our electrical and mechanical connector designs can be extended to utilize any motors within the afforded power, size, and weight constraints. We also show how our platform can be used in various applications, in addition to how the proposed design enables designers, and potentially machines, to generate alternative designs.

    • TwistBlocks: Pluggable and Twistable Modular TUI for Armature Interaction in 3D Design

      Meng Wang, Tsinghua University
      Kehua Lei, Tsinghua University
      Zhichun Li, Tsinghua University
      Haipeng Mi, Tsinghua University
      Yingqing Xu, Tsinghua University

      The use of armatures is a convenient way of deforming and animating 3D digital models. However, interacting with an armature is usually time-consuming, and often requires professional skills. Tangible interfaces, such as building blocks, while having improved the accessibility of digital construction, are still lacking in flexibility and present difficulties in dealing with curved armatures. This paper introduces TwistBlocks, a pluggable and twistable modular TUI that improves the accessibility of 3D modeling and animation through physical armature interaction. TwistBlocks is capable of creating complex armatures with dense branches, and supports a high DOF (Degree of Freedom) in physical manipulation. In addition, a set of software tools is provided for novice users to easily create, rig, and animate models. The global-posture sensing scheme can also measure the rotation and movement of the physical armature, and enables interaction between multiple models.

  • 12.00 – Lunch

    Time: 12.00 – 13.30, Location: Matsalen

  • 13.30 – Session 2: Shape Changing Textiles and Interactive Materials

      Time: 13.30 – 15.00, Location: Main Hall

      Ian Oakley

    • HäirIÖ: Human Hair as Interactive Material

      Christine Dierk, UC Berkeley
      Sarah Sterman, UC Berkeley
      Molly Nicholas, UC Berkeley
      Eric Paulos, UC Berkeley

      Human hair is a cultural material, with a rich history displaying individuality, cultural expression and group identity. It is malleable in length, color and style, highly visible, and embedded in a range of personal and group interactions. As wearable technologies move ever closer to the body, and embodied interactions become more common and desirable, hair presents a unique and little-explored site for novel interactions. In this paper, we present an exploration and working prototype of hair as a site for novel interaction, leveraging its position as something both public and private, social and personal, malleable and permanent. We develop applications and interactions around this new material in HäirIÖ: a novel integration of hair-based technologies and braids that combine capacitive touch input and dynamic output through color and shape change. Finally, we evaluate this hair-based interactive technology with users, including the integration of HäirIÖ within the landscape of existing wearable and mobile technologies.

    • Integrating Textile Materials with Electronic Making: Creating New Tools and Practices

      Irene Posch, TU Wien
      Geraldine Fitzpatrick, TU Wien

      We introduce and discuss the design and use of new tools for electronic textile making. Electronic textiles, or eTextiles, are increasingly produced and used in experimental interfaces, wearables, interior design, as well as in the education and maker cultures. However, the field relies on tools specific to either the textile or the electronic domain, neglecting the distinct requirements, and potentials, of their intersection. To address this gap, we explored the design of new tools, targeted at specific needs and use cases of electronic textile making and the materials used. Three resulting prototypes have been evaluated through use in both our own practice and among a group of experts in the field. Our findings show the importance of specialized tools for routines essential to the field of eTextiles, their role in the emergence of new practices, as well as in the understanding of the discipline.

    • ShapeTex: Implementing Shape-Changing Structures in Fabric for Wearable Actuation

      Jiachun Du, Eindhoven University of Technology
      Panos Markopoulos, Eindhoven University of Technology
      Qi Wang, Eindhoven University of Technology
      Marina Toeters, Eindhoven University of Technology
      Ting Gong, Ting Gong Studio

      Research in smart textiles and garments has mostly focused on integrating sensing technology. In order to make garments that are truly interactive it is also essential to develop technologies for actuating smart garments and textiles. This paper introduces ShapeTex, a thermal shape changing fabric that uses laminate thermal expansion to actuate textiles. We present the design process and rationale for ShapeTex; we explain the fabrication process we have developed for making ShapeTex accessible to fashion designers and interaction designers. Based on co-creation sessions with designers we discuss requirements derived from this material. Finally we present a number of concept prototypes created to explore and illustrate the potential applications of ShapeTex.

    • Beyond LED Status Lights – Design Requirements of Privacy Notices for Body-worn Cameras

      Marion Koelle, University of Oldenburg
      Katrin Wolf, Hamburg University of Applied Sciences
      Susanne Boll, University of Oldenburg

      Privacy notices aim to make users aware of personal data gathered and processed by a system. Body-worn cameras currently lack suitable design strategies for privacy notices that announce themselves and their actions to secondary and incidental users, such as bystanders, when they are being used in public. Hypothesizing that the commonly used status LED is not optimal for this use case, because it is not sufficiently understandable, noticeable, secure and trustworthy, we explore design requirements of privacy notices for body-worn cameras. Following a two-step approach, we contribute incentives for design alternatives to status LEDs: starting from 8 design sessions with experts, we discuss 8 physical design artifacts, as well as design strategies and key motives. Finally, we derive design recommendations from the proposed solutions, which we back with an evaluation by 12 UX & HCI experts.

  • 15.00 – Coffee

    Time: 15.00 – 15.30, Location: Hyllan

  • 15.30 – Visions of the Now: Anna Lundh

    Time: 15.30 – 16.30, Location: Main Hall

    Anna Lundh

    In September 1966, the festival and congress Visioner av Nuet (“Visions of the Present”) took place in Stockholm. The event aimed to investigate the impact of technology on humanity, society and art. Computing was only in its infancy, yet discussing its consequences and exploring what it would mean to make use of it artistically was considered paramount at that time. Almost half a century later, artist Anna Lundh began a research effort into the 1966 festival and produced an updated version, to reconsider its original concerns in a world fully immersed in the technology that in 1966 was called “the new”. Lundh’s festival, Visions of the Now, took place in May 2013 and gathered over thirty international artists, musicians, theorists and scientists, in a series of lectures, panels, open discussions, art and music performances. This artistic experiment has also been documented in a multi-volume archive box, recently published by Sternberg Press.

    Anna Lundh is an artist and PhD candidate in the KTD program (Konstfack/Royal Institute of Technology). Lundh’s work investigates cultural phenomena, societal agreements, and how ideological shifts take place, often taking technology and our experience of time as points of departure. This transdisciplinary practice includes video, installation, web-based work, interactive experiments, text and performance. Her work has been exhibited in Sweden at Moderna Museet, Bonniers Konsthall, Tensta Konsthall, and GIBCA; and internationally in Norway, Holland, Denmark, Latvia and the US – primarily in New York City art organizations including The New Museum, The Kitchen, ExitArt, Apexart, and Performa.

  • 16.30 – Demo session 1: Textile, light, and shape changing interfaces

      Time: 16.30 – 18.00, Location: Hyllan

    • Working with an Autonomous Interface: Exploring the Output Space of an Interactive Desktop Lamp

      Diana Nowacka, UCL
      Katrin Wolf, Hamburg University of Applied Sciences
      Enrico Costanza, UCL Interaction Centre
      David Kirk, Northumbria University

      Increasing sophistication and ubiquity of digital devices is creating potential for the development of new kinds of actuated interfaces. In this paper, we explore the design space around movement as a form of gestural communication for information output, in simple actuated desktop devices. We were curious as to how people might envision interacting with autonomous technology in the office. Accordingly, we focused our attentions on one prevalent desktop object, an interactive lamp, with three actuated joints, which allowed us to explore the interaction space of such devices. We invited 13 participants to design and enact movements with the lamp to communicate 20 simple messages. We explored a subset of these generated gestures, using the lamp as a personal cueing device in an office setting with 14 new participants. We present our qualitative findings from both studies that let users imagine the usage of an interactive desktop lamp through actuation.

    • TwistBlocks: Pluggable and Twistable Modular TUI for Armature Interaction in 3D Design

      Meng Wang, Tsinghua University
      Kehua Lei, Tsinghua University
      Zhichun Li, Tsinghua University
      Haipeng Mi, Tsinghua University
      Yingqing Xu, Tsinghua University

      The use of armatures is a convenient way of deforming and animating 3D digital models. However, interacting with an armature is usually time-consuming, and often requires professional skills. Tangible interfaces, such as building blocks, while having improved the accessibility of digital construction, are still lacking in flexibility and present difficulties in dealing with curved armatures. This paper introduces TwistBlocks, a pluggable and twistable modular TUI that improves the accessibility of 3D modeling and animation through physical armature interaction. TwistBlocks is capable of creating complex armatures with dense branches, and supports a high DOF (Degree of Freedom) in physical manipulation. In addition, a set of software tools is provided for novice users to easily create, rig, and animate models. The global-posture sensing scheme can also measure the rotation and movement of the physical armature, and enables interaction between multiple models.

    • HäirIÖ: Human Hair as Interactive Material

      Christine Dierk, UC Berkeley
      Sarah Sterman, UC Berkeley
      Molly Nicholas, UC Berkeley
      Eric Paulos, UC Berkeley

      Human hair is a cultural material, with a rich history displaying individuality, cultural expression and group identity. It is malleable in length, color and style, highly visible, and embedded in a range of personal and group interactions. As wearable technologies move ever closer to the body, and embodied interactions become more common and desirable, hair presents a unique and little-explored site for novel interactions. In this paper, we present an exploration and working prototype of hair as a site for novel interaction, leveraging its position as something both public and private, social and personal, malleable and permanent. We develop applications and interactions around this new material in HäirIÖ: a novel integration of hair-based technologies and braids that combine capacitive touch input and dynamic output through color and shape change. Finally, we evaluate this hair-based interactive technology with users, including the integration of HäirIÖ within the landscape of existing wearable and mobile technologies.

    • Integrating Textile Materials with Electronic Making: Creating New Tools and Practices

      Irene Posch, TU Wien
      Geraldine Fitzpatrick, TU Wien

      We introduce and discuss the design and use of new tools for electronic textile making. Electronic textiles, or eTextiles, are increasingly produced and used in experimental interfaces, wearables, interior design, as well as in the education and maker cultures. However, the field relies on tools specific to either the textile or the electronic domain, neglecting the distinct requirements, and potentials, of their intersection. To address this gap, we explored the design of new tools, targeted at specific needs and use cases of electronic textile making and the materials used. Three resulting prototypes have been evaluated through use in both our own practice and among a group of experts in the field. Our findings show the importance of specialized tools for routines essential to the field of eTextiles, their role in the emergence of new practices, as well as in the understanding of the discipline.

    • ShapeTex: Implementing Shape-Changing Structures in Fabric for Wearable Actuation

      Jiachun Du, Eindhoven University of Technology
      Panos Markopoulos, Eindhoven University of Technology
      Qi Wang, Eindhoven University of Technology
      Marina Toeters, Eindhoven University of Technology
      Ting Gong, Ting Gong Studio

      Research in smart textiles and garments has mostly focused on integrating sensing technology. In order to make garments that are truly interactive it is also essential to develop technologies for actuating smart garments and textiles. This paper introduces ShapeTex, a thermal shape changing fabric that uses laminate thermal expansion to actuate textiles. We present the design process and rationale for ShapeTex; we explain the fabrication process we have developed for making ShapeTex accessible to fashion designers and interaction designers. Based on co-creation sessions with designers we discuss requirements derived from this material. Finally we present a number of concept prototypes created to explore and illustrate the potential applications of ShapeTex.

    • zPatch: Hybrid Resistive/Capacitive eTextile Input

      Paul Strohmeier, University of Copenhagen
      Jarrod Knibbe, University of Copenhagen
      Sebastian Boring, University of Copenhagen
      Kasper Hornbæk, University of Copenhagen

      We present zPatch: an eTextile patch for hover, touch, and pressure input, using both resistive and capacitive sensing. zPatches are made by layering a piezo-resistive material between silver-plated Ripstop, and embedding it in non-conductive fabric to form a patch. zPatches can be easily ironed onto most fabrics, in any location, enabling easy prototyping or ad-hoc modifications of existing garments. We provide open-source resources for building and programming zPatches and present measures of the achievable sensing resolution of a zPatch. A pressure-based targeting task demonstrated that users could reliably hit pressure targets at up to 13 levels, given appropriate feedback. We demonstrate that the hybrid sensing approach reduces false activations and helps distinguish between gestures. Finally, we present example applications in which we use zPatches for controlling a music player, text entry and gaming input.

    • COLORISE: Shape- and Color-Changing Pixels with Inflatable Elastomers and Interactions

      Juri Fujii, Keio University
      Takuya Matsunobu, Keio University
      Yasuaki Kakehi, Keio University

      We propose a new method for a shape- and color-changing display called COLORISE. Our COLORISE system has inflatable shape-changing pixels that can change their colors without using any light-emitting devices. The array of modules enables various color patterns, making full use of the characteristics of the material. Each pixel also has a touch-sensing function that enables users to interact with it intuitively. This paper describes the design and mechanism of our system, explores interactions with users, and presents technical evaluations of the proposed pixel modules.

    • Plux: Exploring Light Settings through Hybrid Control

      Tom van Rooij, Eindhoven University of Technology
      Saskia Bakker, Eindhoven University of Technology

      Connected lighting solutions are replacing traditional lightbulbs in the home environment and offer unlimited numbers of light settings, varying in color, brightness, saturation and more. Interaction design for such systems is challenging: simple switches no longer suffice, and asking users to specify all parameters of all lightbulbs separately is time-consuming and requires them to understand the desired outcome. This paper presents Plux, a tangible light controller which employs a hybrid control approach. Plux generates light settings of pseudo-randomized color palettes, while users control the saturation and brightness directly. Plux is intended to enable serendipitous exploration of new light settings using a peripheral interaction design approach. A field deployment in which 5 people used Plux for 3 weeks in their homes revealed that using Plux easily became a part of the everyday routine. Plux’s hybrid control approach was particularly valuable for users who struggle to determine the desired light setting.

    • Screenprinting and TEI: supporting engagement with STEAM through DIY fabrication of smart materials

      Stacey Kuznetsov, Arizona State University
      Piyum Fernando, SANDS Group at the School of Arts, Media and Engineering
      Emily Ritter, SANDS Group at the School of Arts, Media and Engineering
      Cassandra Barrett, SANDS Group at the School of Arts, Media and Engineering
      Jennifer Weiler, Arizona State University
      Marissa Rohr, School of Arts Media Engineering

      This paper focuses on manual screenprinting as a DIY fabrication technique for embedding interactive behavior onto a range of substrates such as paper, fabric, plastic, wood, or vinyl. We frame screenprinting as a process that operates at the intersection of art, technology, and material science and iteratively examine its potential in two STEAM contexts. We conducted youth and adult workshops whereby participants worked with our low-cost thermochromic, UV-sensitive, and conductive screenprinting inks to develop a range of concepts and final projects. Our findings highlight several unique features of screenprinting: it affords a low barrier to entry for smart material fabrication, supports a collaborative maker practice, and scaffolds creative engagement with STEAM concepts. By being widely accessible and substrate-agnostic, screenprinting presents exciting opportunities for TEI: DIY fabrication of smart materials in domains such as fine arts, information visualization, and slow technology; and bridging diverse disciplines through STEAM screenprinting initiatives at youth and adult levels.

    • W.O.U.S.: Widgets of Unusual Size

      Zann Anderson, Brigham Young University
      Michael Jones, Brigham Young University
      Kevin Seppi, Brigham Young University

      Recent work in tangible interfaces, including widget sets like .NET Gadgeteer and Phidgets, has enabled prototyping of rich physical interaction at a handheld or tabletop scale. But it remains unclear how participants respond to physical widgets at larger scales. What kinds of interaction would larger widgets enable, and what kinds of systems – if any – can or should be built with them? We built unusually-sized widgets, or “mega-widgets”, in order to explore this territory. We present the results of two iterations of building mega-widgets and accompanying user studies designed to help understand participants’ reactions to mega-widgets and probe possible applications. Responses indicated, among other things, a correlation between widget size and the perceived size or importance of what it might control. Mega-widgets were also perceived as increasing the precision of user input control and providing a fun and playful element. We hope that knowledge gained from this exploratory work can help lay groundwork for further exploration of widgets at larger scales.

    • Tailor-made Accessible Computers: An Interactive Toolkit for Iterative Co-Design

      Florian Güldenpfennig, TU Wien

      An increasing number of people with little experience in technology desire to use the Internet and web-related services such as social networking or online information research. However, the way most computers are designed today does not allow many older people to participate in this key technology, due to their lack of computer literacy compared with younger generations. Age-related impairments and disabilities can further complicate or prevent the use of conventional computers. To remove existing barriers, we created a series of tailor-made computers, which met specific needs such as improved accessibility or particular aesthetics. To accomplish this, we used a co-design toolkit that we created (a) to provide senior users with early tangible experiences of their future systems and (b) to iteratively convert them into the final implementation. In this paper, we demonstrate both the toolkit and the resulting designs of accessible Internet computers.

  • 16.30 – Work in Progress Session 1

      Time: 16.30 – 18.00, Location: Hyllan

    • The Cuebe: Facilitating Playful Early Intervention for the Visually Impaired

      Peter Fikar, TU Wien
      Florian Güldenpfennig, TU Wien
      Roman Ganhör, TU Wien

      Cortical Visual Impairment in children is a severe issue caused by prenatal injury to the brain, affecting timely development during childhood. Therapists work with affected children to foster general development, to improve their learning, and to train sensory skills. In order to support effective and playful practices in Early Intervention, we designed The Cuebe and present a working prototype. It is a tangible device that can detect and project colors, allowing therapists to “magnify” those colors and create a vast variety of playful interactions for and with children suffering from low vision. The exploration of the design domain and the iterative development of The Cuebe were driven by a co-design approach involving four therapists and 12 affected children. We describe use cases from the field, illustrate The Cuebe’s potential in Early Intervention sessions, and discuss further improvements and directions for future development.

    • Soft-bodied Fidget Toys: A Materials Exploration

      Peter Cottrell, UC Santa Cruz
      April Grow, UC Santa Cruz
      Katherine Isbister, University of California Santa Cruz

      We present an exploration of e-textile/soft materials that can be used to capture fidget traces, while providing the touch sensations that fidgeters report seeking out in everyday fidget objects [3,4,5]. We created two soft-bodied “sampler” objects with a range of smart fidgeting affordances, which we describe in this paper, along with a general outline of the range of properties explored. This work extends exploration of a novel design space introduced in [4], toward the end goal of creating smart fidget objects that aid self-regulation. We include an overview of our design process, present some preliminary insights about materials that support this design space, and conclude with current and future directions for the work.

    • NewsThings: Exploring Interdisciplinary IoT News Media Opportunities via User-Centred Design

      John Mills, University of Central Lancashire
      Mark Lochrie, University of Central Lancashire
      Tom Metcalfe, Thomas Buchanan
      Peter Bennett, University of Bristol

      Utilising a multidisciplinary and user-centred product and service design approach, ‘NewsThings’ explores the potential for domestic and professional internet of things (IoT) objects to convey journalism, media and information. In placing news audiences and industry at the centre of the prototyping process, the project’s web-connected objects explore how user requirements may best be met in a perceived post-digital environment. Following a research-through-design methodology and utilising a range of tools – such as workshops, cultural probes, market research and long-term prototype deployment with the public and industry – NewsThings aims to generate design insights and prototypes that could position the news media as active participants in the development of IoT products, processes and interactions. This work-in-progress paper outlines the project’s approach, methods, and initial findings – up to and including the pre-deployment phase – and focuses on novel insights around user engagement with news, and the multidisciplinary team’s responses to them.

    • Mechamagnets: Tactile Mechanisms with Embedded Magnets

      Clement Zheng, University of Colorado
      Ellen Yi-Luen Do, National University of Singapore, ATLAS Institute

      This paper presents Mechamagnets, a technique to rapidly prototype tactile mechanisms for tangible interfaces. We demonstrate how to embed different passive tactile mechanisms in physical systems through a combination of magnets and digitally fabricated parts. We also discuss how DIY materials and 3-axis magnetometers can instrument Mechamagnet interfaces into functional prototypes.

    • The Evolving Design of Tangibles for Graph Algorithmic Thinking

      Andrea Bonani, Free University of Bozen-Bolzano
      Vincenzo Del Fatto, Free University of Bozen-Bolzano
      Rosella Gennari, Free University of Bozen-Bolzano

      Algorithmic thinking is at the core of computational thinking. Tangible interactive solutions can help children develop algorithmic thinking skills. This paper focusses on exploratory research concerning tangibles for graph algorithmic thinking in primary and middle schools. By following an action-research process, the tangibles evolved through prototyping and action studies. The paper overviews their evolution and delves into its most recent action: an ecological study with 8 middle school children and 5 primary school children using tangibles for graph algorithmic thinking. It ends by reflecting on results and future work.

    • The Artistic Potential of Tactile Vision Interfaces: A First Look

      Antal Ruhl, Avans University of Applied Sciences
      Alwin de Rooij, Tilburg University
      Michel van Dartel, Avans University of Applied Sciences

      Interfaces that subvert, substitute, or augment the relationship between acting and sensing make interesting tools for experimentation in artistic research. Amongst them, tactile vision interfaces, which enable people to experience “seeing with their skin”, have particular appeal. However, surprisingly few such interfaces are used by artists. In the present work we take a first look at how the artistic potential of tactile vision interfaces could be unlocked. On the basis of related artistic work, we argue that an accessible, generalizable and learnable interface is needed to enable artists to explore the possibilities of tactile vision. Subsequently, we present ongoing research that revolves around the development of such a tactile vision interface. A case study is presented that informs the development of a new prototype of this interface that overcomes two key limitations in its purpose to unlock the artistic potential of tactile vision.

    • Embodied Interactions with E-Textiles and the Internet of Sounds for Performing Arts

      Sophie Skach, Queen Mary University of London
      Anna Xambó, Queen Mary University of London
      Luca Turchet, Queen Mary University of London
      Ariane Stolfi, Queen Mary University of London
      Rebecca Stewart, Queen Mary University of London
      Mathieu Barthet, Queen Mary University of London

      This paper presents initial steps towards the design of an embedded system for body-centric sonic performance. The proposed prototyping system allows performers to manipulate sounds through gestural interactions captured by textile wearable sensors. The e-textile sensor data control, in real-time, audio synthesis algorithms working with content from Audio Commons, a novel web-based ecosystem for re-purposing crowd-sourced audio. The system enables creative embodied music interactions by combining seamless physical e-textiles with web-based digital audio technologies.

    • Designing a Smart Reading Environment with and for Children

      Pedro Ribeiro, Rhine-Waal University of Applied Sciences
      Anna Michel, University of Applied Science
      Ido Iurgel, Rhine-Waal University of Applied Sciences
      Christian Ressel, Rhine-Waal University of Applied Sciences
      Cristina Sylla, University of Minho
      Wolfgang Mueller, Univ. of Education Weingarten

      In this paper, we present an ongoing project named STREEN (Story Reading Environmental Enrichment). The project explores a concept of augmented reading focusing on a smart environment that is able to automatically trigger digital media enrichments depending on the reading performance and the narrative. By creating an engaging and immersive reading experience, STREEN has the potential to support early readers. In order to understand how the story reading activities can be enriched through digital media, as well as the effect of such enrichments on the reading experience, the design and development of STREEN follows a Design Based Research approach, involving teachers and primary school children along the design process. In this paper, we describe the first steps towards the construction of a smart reading environment for primary school children.

    • Kniwwelino: A Lightweight and WiFi Enabled Prototyping Platform for Children

      Valerie Maquil, LIST
      Christian Moll, LIST
      Lou Schwartz, LIST
      Johannes Hermen, LIST

      Nowadays, computational thinking skills are considered fundamental for our future daily life, and many initiatives and tools are being created to foster these skills. In this paper, we present the Kniwwelino, a new platform for prototyping physical computing projects based on WiFi. The novelty of our solution lies in the use of a WiFi chip on a small, extendable board, programmable via a block-based visual programming language, making the platform compact, low-cost, WiFi-enabled, and accessible to children. This paper presents the design rationale and implementation of the platform as well as two simple example projects making use of the new WiFi-based functionalities.

    • CRISPEE: A Tangible Gene Editing Platform for Early Childhood

      Clarissa Verish, Wellesley College
      Amanda Strawhacker, Tufts University
      Marina Bers, Tufts University
      Orit Shaer, Wellesley College

      We present CRISPEE, a novel tangible user interface designed to engage young elementary school children in bioengineering concepts. Using CRISPEE, children assume the role of a bioengineer to create a genetic program that codes for a firefly’s bioluminescent light. This is accomplished through sequencing tangible representations of BioBricks, which code for the primary colors of light (red, green, and blue) to be turned on or off. The interface and curricular supplement expose children in early elementary school to concepts traditionally taught much later in school curricula through playful interaction and exploration. We discuss CRISPEE’s concept and design, and share findings from its preliminary evaluation with children and adults.

    • Exploring Around-Device Tangible Interactions for Mobile Devices with a Magnetic Ring

      Victor Cheung
      Audrey Girouard

      We present our initial work on using a magnetic ring as a tangible input control to support around-device interactions for mobile devices, aiming to address touchscreen issues such as occlusion and imprecision while at the same time providing tangibility to the interaction. This input mechanism allows users to use the surface on which the mobile device is placed as an extended input space, with a rotatory motion as the main form of interaction. Our technique requires no calibration, no modifications to the mobile device, and no external power for the ring, which also functions as an accessory item (a finger ring) when not in use. We discuss our design criteria, prototype implementation and illustrative applications, and directions for future work.

    • Investigation of Touch Interfaces Using Multilayered Urushi Circuit

      Koshi Ikegawa, University of Tsukuba
      Shuhei Aoyama, University of Tsukuba
      Shogo Tsuchikiri
      Takuto Nakamura
      Yuki Hashimoto
      Buntarou Shizuki, University of Tsukuba

      Urushi (Japanese lacquer) is a natural resin paint with electrical insulating capability. By using it as a base material and coating material for electronic circuits, it is possible to construct a circuit with an elegant appearance and feel. It is also possible to build a multilayered electronic circuit by using urushi as insulation layers. In this study, we investigate techniques to construct touch interfaces using a multilayered electronic circuit (urushi circuit). First, we fabricated an urushi circuit with a touch electrode consisting of two layers. To improve its appearance, we then fabricated urushi circuits in which all touch electrodes are arranged on the top layer and all wires are hidden in the bottom layer. Moreover, as an extension of the touch interface, we built a grid of touch electrodes that realizes two-dimensional touch sensing.

    • Tangible Interaction in the Dentist Office

      Frode Guribye, University of Bergen
      Tor Gjøsæter, Independent

      This paper presents the design efforts involved in making a system for supporting haptic communication between dentist and patient during dental treatment. We describe Grasp Live, a haptic interaction technology consisting of a tangible stone-like object connected to a vibro-tactile feedback device.

    • Paper Circuitry and Projection Mapping: An Interactive Textbook Approach to Veterinary Education

      Margaret Cook, Texas A&M University
      Jinsil Seo, Texas A&M University
      Michelle Pine, Texas A&M University
      Annie Sungkajun, Texas A&M University

      We present an interactive book platform that was initially developed for veterinary students. This platform provides a simple tangible book interface that users can touch, flip through, and read, while experiencing a dynamic content display driven by a projector. In this paper, we present an interactive book which specifically covers cattle (bovine) laminitis, a painful and widely prevalent disease in cattle. Our project aims to support students’ understanding of the multi-faceted nature of bovine laminitis through the use of an interactive book interface. We implemented the unique approach of using paper circuitry to create the electrical pathway from an actual book to the computer. Book pages with embedded paper circuitry become an interactive surface that visualizes the progression of bovine laminitis through its stages of disease. Future development of the work will include developing content from other disciplines for the same platform.

    • Designing an Expandable Illuminated Ring to Build an Actuated Ring Chart

      Maxime Daniel, ESTIA
      Guillaume Rivière, ESTIA
      Nadine Couture, ESTIA

      Data physicalizations are growing popular in many societal domains, which indicates a strong potential for fostering public engagement through public exhibition (e.g. in train stations, in airports, on roundabouts, and in enterprises). We focus on dynamic physical charts for visualizing renewable energy forecasts in public spaces. To make the charts readable from any point of view around them, we propose a physical ring chart inspired by stone cairns. To build this physical chart, we designed an expandable and stackable illuminated ring. In this paper, we describe the design process and the limitations of the first prototype of such a ring.

  • 18.00 – Performance: Opto-Phono-Kinesia

    Time: 18.00 – 19.00, Location: Main Hall

    Steve Gibson, Northumbria University

    Opto-Phono-Kinesia (OPK) is an audio-visual performance piece in which all media elements are controlled by the body movements of a single performer. The title is a play on a possible synesthetic state involving connections between vision, sound and body motion. Theoretically, for a person who experiences this state, a specific colour could trigger both a sound and a body action. This synesthetic intersection is simulated in OPK by simultaneity of body movement, and audio-visual result. Using the Gesture and Media System 3.0 motion-tracking system, the performer can dynamically manipulate an immersive environment using two small infrared trackers. The project employs a multipart interface design based on a formal model of increasing complexity in visual-sound-body mapping, and is therefore best performed by an expert performer with strong spatial memory and advanced musical ability. OPK utilizes the “body as experience, instrument and interface” [1] for control of a large-scale environment. 1. ACM. TEI Arts Track website. 2017. Retrieved October 11, 2018 from https://tei.acm.org/2018/arts-track/

Tue, 20 March

All Tuesday program events except for the Arts Reception and Conference Dinner will take place at Nymble at KTH Royal Institute of Technology. For a map of individual rooms and accessibility access, please visit the Venue page. The Arts Reception and Conference Dinner will take place at Kulturhuset. For directions, please visit the TEI 2018 Google Map.

  • 8.30 – Registration

    Time: 8.30 – 9.30, Location: Puben

  • 9.30 – Student Design Challenge

      Time: 9.30 – 10.30, Location: Hyllan

    • An ambient communication system that sensitizes for the own loudness in working spaces

      Christina Blank, University of Applied Sciences Osnabrück
      Elena Gimmel, University of Applied Sciences Osnabrück
      Sebastian Winter, University of Applied Sciences Osnabrück
      Neele Rittmeister, University of Applied Sciences Osnabrück

      Today’s office spaces tend to change into more open working environments to increase collaboration between co-workers. Removing visual barriers to improve communication also causes acoustic problems, making noise pollution a growing issue. To address this problem, we introduce “Sone”, an ambient communication system that aims to reduce noise pollution by controlling the Wi-Fi signal. Sone is capable of either strengthening or blocking the signal based on the noise that is recorded. Our intention is to sensitize people to their own volume at work by affecting something as important as internet access.

    • Digitus Touch, Tattoo-Based Bodily Interaction

      Robbert Troost, Södertörn University
      Anna Tunek, The School of Natural Sciences, Technology and Environmental Studies
      Fredrik Otterstål, Södertörns högskola
      Erik Gustavsson, Södertörns Högskola

      With the future of ubiquitous computing in mind, we set out to design a prototype that would keep the power in our hands, literally. Inspired by recent developments in bodily interaction, microprocessors and epidermal electronics, Digitus Touch is our vision of tattoo-based technology that will allow users to interact with the devices in their lifestyle all through a simple tap of the finger. It is highly customizable in both its look and functionality thanks to the proposed mobile companion app. Whether to enhance your workout schedule, enable you to create music on the fly, or make your presentations run smoother, DT can be applied to great effect. This design proposal shares our vision of the future, as well as our process to create the initial working prototype.

    • Pain-catcher: Experiencing Pain in the future

      Dina Yassine, The School of Natural Sciences, Technology and Environmental Studies
      Maria Laura D’have, The School of Natural Sciences, Technology and Environmental Studies
      Olga Germanovica, The School of Natural Sciences, Technology and Environmental Studies
      Sepideh Abbaszadeh, The School of Natural Sciences, Technology and Environmental Studies

      We developed a concept of a future pain detection and manipulation device, based on receiving and blocking pain signals transmitted through the spinal cord to the brain. The proposed design is a system of two connected wearables called Pain-catcher: a patch and a wristband. In order to materialize our concept, we built a physical prototype that demonstrates the functions of transmitting and receiving signals between the wearables. Subsequently we created a video prototype to highlight our design implications and to demonstrate how the concept could be set within a use case.

    • Designing an Intuitive Dynamic Telepresence System with Tangible Interface

      Graham Smith, University College Dublin
      Sam Nemeth, Northumbria University

      The team is designing an intuitive telepresence system that offers a dynamic user experience by means of a tangible control with haptic feedback. The team uses video prototyping as a tool to understand the strengths and weaknesses of the design. The authors show the first working prototype physically, along with:
      a. a video prototype enacting the functionality and user experience of a TUI
      b. a video in which test persons play out the experience of the system

    • Interactive Event Map

      Nicolaas Bijman, KTH
      Jiayao Yu, KTH
      André Josefsson, CSC
      Jim Tolman, KTH Royal Institute of Technology

      The interactive event map gives event organizers a novel way to promote their events to people in public spaces. Users physically interact with the installation to receive information about events that are occurring at different venues in the city. Our intended target group are people in public spaces, with a particular focus on travelers and visitors that are new to the city. We think they will be most interested in participating in events that the map can inform them about. We strived to create an aesthetic tangible user interface using computational bits and wooden materials. We reflect upon the feedback we received from our users during a public event where we exhibited the event map.

    • ICOS: Interactive Clothing System

      Hans Brombacher, Eindhoven University of Technology
      Selim Haase, Eindhoven University of Technology

      New techniques are being developed and are influencing our daily life and environment. These influences occur across several interaction-attention fields. The challenge for designers is to develop new products that use these techniques to improve the user experience. We designed ICOS, a smart, interactive and adaptive clothing rack that presents the advantages of machine learning when implemented in our daily lives. The machine learning was developed and implemented within a clothing rack that uses weather data (type of weather and temperature) and the frequency of previously selected clothes to advise the user, in order to improve and enrich the user’s daily routine of picking clothes.

    • Aero: a smart ventilation interface

      Martina Nina Eriksson, Umea Institute of Design
      Shibashankar Sahoo, Umeå University
      Carolyn Wegner, Umea Institute of Design

      Our current understanding of the translation of subjective data into numbers in user interfaces has limited our capacity to perceive the intrinsic meaning associated with data and its source. Here, we envision Aero, a tangible interaction framework that connects the user and the data source by combining data physicalization and tangible manipulation. We illustrate the application of this framework through a case study in a critical context: an anesthesia ventilator machine, where the connection between the data source (the patient) and the user (the nurse) is vital. This interface enables users to monitor, communicate, and manipulate tangible data in real time, thereby establishing a deeper connection between the data source and the user. Lastly, we analyze the challenges, limitations and future opportunities of this approach in this professional context through two prototypes: fabric and jellyfish.

    • Post Food: Looking at Sustainability Through Design Futures

      Simran Chopra, Northumbria University

      Current food consumption patterns are unsustainable. They are a result of the influence of politics, economics and sociocultural constructs. Food is an everyday, mundane matter that we need to decide on three or four times a day. Rather than being an informed choice, this decision is built on the complexity of demand and supply, governed by the principles of the industrial revolution. In a post-digital, “post food” scenario, we propose a fictional technology called ‘Essen’. ‘Essen’ has the capacity to sustain a human being without the need to eat food. In presenting it, our aim is to question the trajectories of food through design fiction, to understand current food practices better and to broaden our thinking on sustainable food futures.

    • Embodiment of machine olfaction: The Braitenberg Nose

      Marc van Almkerk, Kungliga Tekniska Högskolan
      Rui Li, Kungliga Tekniska Högskolan
      Nicola Marcon, Kungliga Tekniska Högskolan

      There is increasing attention towards machine olfaction, and many studies show useful applications of this type of technology, both in industry and in everyday life. To further research in this area from an interaction point of view, this paper presents the Braitenberg Nose, an artifact that explores how olfactory devices can be implemented in our lives in a tangible form. Users can present various smells to the nose, which, based on their chemical composition and intensity, responds with different movements of its nostrils and the sides of its nasal bridge.

  • 9.30 – Graduate Student Consortium Poster Session

      Time: 9.30 – 10.30, Location: Hyllan

    • Exploring the Potential of Data Physicalization for STEM Learning

      Sarah Hayes, Dept. of Media Communications, Cork Institute of Technology

      This paper presents an overview of the research that I have conducted as part of my PhD studies. The aim of this research is to explore the potential of data physicalization to enhance and improve STEM learning. Specifically, this research is focused on exploring the ways in which physical data representations can be used to clarify dense scientific, technological, engineering or mathematical concepts, and support novel interactions to increase user engagement and learning. My aim is to achieve this through the design, deployment and evaluation of a series of data physicalizations representing STEM concepts and datasets.

    • Designing for Instructed Physical Training

      Laia Turmo Vidal, Dept of Informatics and Media, Uppsala University

      The use of technology to assist in instructed physical training in collocated, social settings remains underexplored. In this short paper, I present an overview of my PhD research, which focuses on designing to support these activities and opening up their design space by taking into consideration the spatial, physical and social contexts in which they unfold. Following a Research Through Design approach, I conduct a series of design explorations to investigate how to ecologically design technological support for in-the-moment instructed physical training.

    • Understanding and Designing Embodied Experiences Through Mid-air Tactile Stimulation

      Dario Pittera, SCHI Lab, Department of Informatics, University of Sussex

      Rendering touch in multimedia technology remains a challenge. Touch is a complex system, and there are many aspects to take into account when trying to render it (e.g. the compliance of an object, its weight, orientation, geometric properties and forces on the skin). This is especially true for VR, where touch is an important factor in achieving embodiment in virtual environments. Recently, new tactile technology has been developed: mid-air devices, capable of delivering tactile feedback without coming into contact with the skin. One contribution of the doctoral research described in this paper is to overcome design challenges and create immersive experiences by applying psychological principles and paradigms, exploiting the advantages of mid-air technology. We designed possible embodied interaction scenarios in mid-air and physical touch. Findings from this research point to opportunities for designing new immersive experiences. Future work will involve different parts of the body and different tactile properties (e.g. thermal stimulation).

    • The Interactive Carpet – Smart Textile Interface for Children on Autism Spectrum Disorder

      Yulia Zhiglova, School of Digital Technologies, Tallinn University

      This paper discusses the socializing potential of a smart-textile-based interface and presents the design and planned study method of the Interactive Carpet prototype, designed in cooperation with Autism Spectrum Disorder (ASD) specialists. The future exploratory study will shed light on the prototype’s potential effect on the social skills of children with ASD, and on how its physical properties and multi-sensory feedback could promote better interaction between a caregiver and a child with ASD.

    • In-Forming Textile: Investigating Shape Change and Material Memory through an Embodied Approach

      Alice Rzezonka, School of Design and Art, University of Wuppertal

      Human material interaction is embedded in an experience-based, embodied context that is difficult to verbalize and abstract. However, new computational composites are often developed and presented in a laboratory setting, absent of human embodied involvement. This research project aims at exploring the link between textile shape change and material memory through an embodied approach. This is done by incorporating a performative collective of dancers and artists into the development process of a shape-changing textile membrane, while at the same time collecting and analyzing data about the process. It is expected that the rich insights gained will yield a new perspective on the relationship between memory, material, movement and body.

    • Creating and Staging Interactive Costumes

      Michaela Honauer, HCI Group, Bauhaus-Universität Weimar

      A growing part of HCI is research on wearables, e-textiles and performances with technology. However, conventional theatre productions are bound to traditionally grown structures and predefined production processes that do not allow for in-house creation of interactive costumes. This research project addresses this issue and investigates how this cultural sector and traditional costume design could engage with contemporary e-textile and wearable technologies.

    • Unfolding an Industrial Design Approach to Physical Computing

      Clement Zheng, ATLAS Institute, University of Colorado

      Industrial designers working to develop tangible interactive systems require new skills to integrate technology into their prototypes. My research investigates how to facilitate physical computing for industrial design students in a manner that is in line with the practices and values of the profession. Different toolkits have been developed through this research to scaffold industrial designers as they make tangible interactive artifacts. By investigating these toolkits in the wild, I aim to develop an approach that informs educators as they structure industrial design programs to support students in designing tangible interactive systems.

  • 10.30 – Session 3: Toys, Talk and Play

      Time: 10.30 – 12.00, Location: Main Hall

      Jörn Hurtienne

    • Smart Toys Design Opportunities for Measuring Children’s Fine Motor Skills Development

      Svetlana Mironcika, Amsterdam University of Applied Sciences
      Antoine de Schipper, Amsterdam University of Applied Sciences
      Annette Brons, Hogeschool van Amsterdam
      Huub Toussaint, Hogeschool van Amsterdam
      Ben Kröse, Amsterdam University of Applied Sciences
      Ben Schouten, Technische Universiteit Eindhoven, Amsterdam University of Applied Sciences

      Smart tangible toys, designed for hand manipulation, can transform fine motor skills assessment into enjoyable activities that children can play (partially) unsupervised. Such toys can support school teachers and parents in the early detection of deficiencies in children’s motor skills development, as well as in objectively monitoring the progress of skills development over time. To make a game enjoyable for children with different skill levels, these smart toys could offer adaptive gameplay. In this paper we describe the design and deployment of a sensor-equipped digital board game, which we use to explore the potential of smart toys for fine motor skills assessment in children.

    • ClassBeacons: Designing Distributed Visualization of Teachers’ Physical Proximity in the Classroom

      Pengcheng An, Eindhoven University of Technology
      Saskia Bakker, Eindhoven University of Technology
      Sara Ordanovski, Department of Experimental Psychology & Helmholtz Institute
      Ruurd Taconis, Eindhoven University of Technology
      Berry Eggen, Eindhoven University of Technology

      To create a learner-centered environment, teachers are nowadays expected to be more mindful of their proximity distribution: how they spend time in different locations of the classroom with individual learners. However, feedback on this is typically given to teachers only by experts after classroom observation. In this paper we present the design and evaluation of ClassBeacons, a novel ambient information system that visualizes teachers’ physical proximity through tangible devices distributed over the classroom. An expert review and a field evaluation with eight secondary school teachers were conducted to explore the potential value of such a system and gather user experiences. Results revealed rich insights into how the system could influence teaching and learning, as well as how a distributed display can be seamlessly integrated into teachers’ routines.
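      To make the notion of proximity distribution concrete, it can be thought of as dwell time accumulated per classroom zone from a stream of timestamped teacher positions. The minimal Python sketch below is only an illustration of that idea, with hypothetical sample data; it is not the ClassBeacons implementation.

        # Hypothetical sketch: accumulate a teacher's dwell time per classroom
        # zone from (timestamp, zone) samples; ClassBeacons makes such a
        # distribution visible through tangibles placed around the room.
        from collections import defaultdict

        def dwell_time_per_zone(samples):
            """samples: list of (timestamp_seconds, zone_id), sorted by time."""
            totals = defaultdict(float)
            for (t0, zone), (t1, _) in zip(samples, samples[1:]):
                totals[zone] += t1 - t0  # credit the interval to its starting zone
            return dict(totals)

        # Example: 0-120 s near zone "A", then 120-300 s near zone "B"
        print(dwell_time_per_zone([(0, "A"), (120, "B"), (300, "B")]))
        # {'A': 120.0, 'B': 180.0}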

    • Evolving Tangibles for Children’s Social Learning through Conversations: Beyond TurnTalk

      Rosella Gennari, Free University of Bozen-Bolzano
      Alessandra Melonio, Free University of Bozen-Bolzano
      Mehdi Rizvi, Free University of Bozen-Bolzano

      Social learning curricula teach children social norms for managing conversations, such as the norm of not overlapping in talking turns. Interactive tangible objects (briefly, tangibles) can help teachers in the scaffolding of such norms. Such tangibles should be created for the specific social learning contexts of their users, and evolve according to their requirements. An ideal design process for tangibles for children’s conversations is meta-design, based on action research: evolving tangible prototypes are developed with natural materials and easy-to-use microelectronic components; tangibles are adopted by their users in ecological studies, and their usage is reflected on with designers to steer design directions or uncover design possibilities, which are then developed and used again by users. This paper reports on such an evolutionary design process concerning tangibles for children’s conversations. It shows how new design ideas emerged by having users adopt design solutions and moving designers into ecological settings.

    • Come and Play: Interactive Theatre for Early Years

      Roma Patel
      Holger Schnädelbach
      Boriana Koleva

      The convergence of theatre and digital technologies can play a valuable role in theatre for early years (TEY), but how an audience of under-5s experiences and engages with these spaces is largely unexplored. We present an interactive performance installation and demonstrate how concepts from early years practice, in particular schemas (children’s repeated play patterns), can be used as a design framework. We integrated sensors and microcontrollers into objects, puppets and scenography, and invited eight groups of very young children and their grown-ups to explore the performance. We discuss how schemas are useful as a design and analysis tool in TEY, how schemas need to be expanded to include multi-sensory interactions with hybrid physical-digital objects, and how designers need to consider the roles of adults who scaffold interaction between very young children and their surroundings.

  • 12.00 – Lunch

    Time: 12.00 – 13.30, Location: Matsalen

  • 13.30 – Session 4: Sensing in Virtual and Augmented Reality

      Time: 13.30 – 15.00, Location: Main Hall

      Stacey Kuznetsov

    • Sensory VR: Smelling, Touching, and Eating Virtual Reality

      Daniel Harley, York University
      Alexander Verni, Ryerson University
      Mackenzie Willis, Ryerson University
      Ashley Ng, Ryerson University
      Lucas Bozzo, Ryerson University
      Ali Mazalek, Ryerson University

      We present two proof of concept sensory experiences designed for virtual reality (VR). Our experiences bring together smell, sound, taste, touch, and sight, focusing on low-cost, non-digital materials and on passive interactions. We also contribute a design rationale and a review of sensory interactions, particularly those designed for VR. We argue that current sensory experiences designed for VR often lack a broader consideration of the senses, especially in their neglect of the non-digital. We discuss some implications of non-digital design for sensory VR, suggesting that there may be opportunities to expand conceptions of what sensory design in VR can be.

    • “You Better Eat to Survive”: Exploring Cooperative Eating in Virtual Reality Games

      Peter Arnold, RMIT University, LMU
      Rohit Ashok Khot, RMIT University
      Florian Mueller, RMIT University

      “You Better Eat to Survive” is a two-player virtual reality game that involves eating real food to survive and ultimately escape from a virtual island. We sense the eating actions of players by analyzing chewing sounds captured by a low-cost microphone attached to each player’s cheek. Our interest in using cooperative eating as a way of interacting in virtual reality is driven by the possibilities of creating a cross-modal gameplay experience that benefits social interactions. A user study with 22 players showed that eating real food improved players’ feeling of presence, challenged trust dependencies and made the survival aspect of the game feel more “real”. We use these insights to articulate three design themes that can guide designers in creating virtual reality games that incorporate cooperative eating. Ultimately, our work aims to guide design thinking towards using underexplored interaction methods in virtual reality games, thereby reiterating the post-digital design theme of TEI 2018.

    • Situated Game Level Editing in Augmented Reality

      Gary Ng, KAIST
      Joon Gi Shin, KAIST
      Alexander Plopski, Nara Institute of Science and Technology
      Christian Sandor, Nara Institute of Science and Technology
      Daniel Saakes, KAIST

      Level editors let end-users create custom levels and content within a given video game. In this paper, we explore the concept and design of augmented reality game level editors. These new types of editors are not only spatial and embodied, but also situated, as they enable users to tailor games to the unique characteristics and emotional value of their own space. We present the design and implementation of a prototype level editor that runs on the Microsoft HoloLens. The editor enables users to add virtual content in their homes and add interactions through spatial trigger-action game-logic programming. We had pairs of students create games with the prototype and play each other’s games. They reported that the games are fun to make, play, and watch others play. Based on the design and evaluation, we propose guidelines for augmented reality game-authoring tools for end users.
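      For readers unfamiliar with trigger-action game logic, the sketch below shows one minimal way such rules can be represented in code: a predicate over the game state paired with an action to apply when it fires. It is an illustration under assumptions, not the authors’ HoloLens implementation, and the zone and object names are hypothetical.

        # Illustrative trigger-action rule representation (not the authors' code).
        from dataclasses import dataclass
        from typing import Callable, Dict, List

        @dataclass
        class Rule:
            trigger: Callable[[Dict], bool]   # predicate over the current game state
            action: Callable[[Dict], None]    # mutation applied when the trigger fires

        def step(state: Dict, rules: List[Rule]) -> None:
            # Evaluate every rule once against the current state.
            for rule in rules:
                if rule.trigger(state):
                    rule.action(state)

        # Hypothetical rule: spawn an object when the player enters a zone.
        rules = [Rule(trigger=lambda s: s["player_zone"] == "kitchen",
                      action=lambda s: s.setdefault("spawned", []).append("monster"))]
        state = {"player_zone": "kitchen"}
        step(state, rules)
        print(state)  # {'player_zone': 'kitchen', 'spawned': ['monster']}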

    • Ani-Bot: A Modular Robotics System Supporting Creation, Tweaking, and Usage with Mixed-Reality Interactions

      Yuanzhi Cao, Purdue University
      Zhuangying Xu, Purdue University
      Terrell Glenn, Purdue University
      Ke Huo, Purdue University
      Karthik Ramani, Purdue University

      Ani-Bot is a modular robotics system that allows users to control their DIY robots using Mixed-Reality Interaction (MRI). The system takes advantage of MRI to enable users to visually program the robot through the augmented view of a Head-Mounted Display (HMD). In this paper, we first explain the design of the Mixed-Reality (MR) ready modular robotics system, which allows users to perform MRI as soon as they finish assembling the robot. Then, we elaborate on the augmentations provided by the MR system in the three primary phases of a construction kit’s lifecycle: Creation, Tweaking, and Usage. Finally, we demonstrate Ani-Bot with four application examples and evaluate the system with a two-session user study. The results of our evaluation indicate that Ani-Bot successfully embeds MRI into the lifecycle (Creation, Tweaking, Usage) of DIY robotics and shows strong potential for delivering an enhanced user experience.

  • 15.00 – Demo Session 2: Kids, Art, Music, Theatre, Virtual and Augmented Reality

      Time: 15.00 – 16.30, Location: Hyllan

    • Smart Toys Design Opportunities for Measuring Children’s Fine Motor Skills Development

      Svetlana Mironcika, Amsterdam University of Applied Sciences
      Antoine de Schipper, Amsterdam University of Applied Sciences
      Annette Brons, Hogeschool van Amsterdam
      Huub Toussaint, Hogeschool van Amsterdam
      Ben Kröse, Amsterdam University of Applied Sciences
      Ben Schouten, Technische Universiteit Eindhoven, Amsterdam University of Applied Sciences

      Smart tangible toys, designed for hand manipulation, can transform fine motor skills assessment into enjoyable activities that children can play (partially) unsupervised. Such toys can support school teachers and parents in the early detection of deficiencies in children’s motor skills development, as well as in objectively monitoring the progress of skills development over time. To make a game enjoyable for children with different skill levels, these smart toys could offer adaptive gameplay. In this paper we describe the design and deployment of a sensor-equipped digital board game, which we use to explore the potential of smart toys for fine motor skills assessment in children.

    • Evolving Tangibles for Children’s Social Learning through Conversations: Beyond TurnTalk

      Rosella Gennari, Free University of Bozen-Bolzano
      Alessandra Melonio, Free University of Bozen-Bolzano
      Mehdi Rizvi, Free University of Bozen-Bolzano

      Social learning curricula teach children social norms for managing conversations, such as the norm of not overlapping in talking turns. Interactive tangible objects (briefly, tangibles) can help teachers in the scaffolding of such norms. Such tangibles should be created for the specific social learning contexts of their users, and evolve according to their requirements. An ideal design process for tangibles for children’s conversations is meta-design, based on action research: evolving tangible prototypes are developed with natural materials and easy-to-use microelectronic components; tangibles are adopted by their users in ecological studies, and their usage is reflected on with designers to steer design directions or uncover design possibilities, which are then developed and used again by users. This paper reports on such an evolutionary design process concerning tangibles for children’s conversations. It shows how new design ideas emerged by having users adopt design solutions and moving designers into ecological settings.

    • Come and Play: Interactive Theatre for Early Years

      Roma Patel
      Holger Schnädelbach
      Boriana Koleva

      The convergence of theatre and digital technologies can play a valuable role in theatre for early years (TEY), but how an audience of under-5s experiences and engages with these spaces is largely unexplored. We present an interactive performance installation and demonstrate how concepts from early years practice, in particular schemas (children’s repeated play patterns), can be used as a design framework. We integrated sensors and microcontrollers into objects, puppets and scenography, and invited eight groups of very young children and their grown-ups to explore the performance. We discuss how schemas are useful as a design and analysis tool in TEY, how schemas need to be expanded to include multi-sensory interactions with hybrid physical-digital objects, and how designers need to consider the roles of adults who scaffold interaction between very young children and their surroundings.

    • Sensory VR: Smelling, Touching, and Eating Virtual Reality

      Daniel Harley, York University
      Alexander Verni, Ryerson University
      Mackenzie Willis, Ryerson University
      Ashley Ng, Ryerson University
      Lucas Bozzo, Ryerson University
      Ali Mazalek, Ryerson University

      We present two proof of concept sensory experiences designed for virtual reality (VR). Our experiences bring together smell, sound, taste, touch, and sight, focusing on low-cost, non-digital materials and on passive interactions. We also contribute a design rationale and a review of sensory interactions, particularly those designed for VR. We argue that current sensory experiences designed for VR often lack a broader consideration of the senses, especially in their neglect of the non-digital. We discuss some implications of non-digital design for sensory VR, suggesting that there may be opportunities to expand conceptions of what sensory design in VR can be.

    • Situated Game Level Editing in Augmented Reality

      Gary Ng, KAIST
      Joon Gi Shin, KAIST
      Alexander Plopski, Nara Institute of Science and Technology
      Christian Sandor, Nara Institute of Science and Technology
      Daniel Saakes, KAIST

      Level editors let end-users create custom levels and content within a given video game. In this paper, we explore the concept and design of augmented reality game level editors. These new types of editors are not only spatial and embodied, but also situated, as they enable users to tailor games to the unique characteristics and emotional value of their own space. We present the design and implementation of a prototype level editor that runs on the Microsoft HoloLens. The editor enables users to add virtual content in their homes and add interactions through spatial trigger-action game-logic programming. We had pairs of students create games with the prototype and play each other’s games. They reported that the games are fun to make, play, and watch others play. Based on the design and evaluation, we propose guidelines for augmented reality game-authoring tools for end users.

    • Ani-Bot: A Modular Robotics System Supporting Creation, Tweaking, and Usage with Mixed-Reality Interactions

      Yuanzhi Cao, Purdue University
      Zhuangying Xu, Purdue University
      Terrell Glenn, Purdue University
      Ke Huo, Purdue University
      Karthik Ramani, Purdue University

      Ani-Bot is a modular robotics system that allows users to control their DIY robots using Mixed-Reality Interaction (MRI). The system takes advantage of MRI to enable users to visually program the robot through the augmented view of a Head-Mounted Display (HMD). In this paper, we first explain the design of the Mixed-Reality (MR) ready modular robotics system, which allows users to perform MRI as soon as they finish assembling the robot. Then, we elaborate on the augmentations provided by the MR system in the three primary phases of a construction kit’s lifecycle: Creation, Tweaking, and Usage. Finally, we demonstrate Ani-Bot with four application examples and evaluate the system with a two-session user study. The results of our evaluation indicate that Ani-Bot successfully embeds MRI into the lifecycle (Creation, Tweaking, Usage) of DIY robotics and shows strong potential for delivering an enhanced user experience.

    • Through the Glance Mug: A Familiar Artefact to Support Opportunistic Search in Meetings

      Ahmet Börütecene, Koç University
      İdil Bostan, Koç University
      Ekin Akyürek, Koç University
      Alpay Sabuncuoğlu, Koç University
      İlker Temuzkuşu, Koç University
      Çağlar Genç, Koç University
      Tilbe Göksun, Koç University
      Oğuzhan Özcan, Koç University

      During collocated meetings, the spontaneous need for information, called opportunistic search, might arise while conversing. However, using smartphones to look up information might be disruptive, disrespectful or even embarrassing in social contexts. We propose an alternative instrument for this practice: Glance Mug, a touch-sensitive mug prototype that listens to the conversation and displays browsable content-driven results on its inner screen. We organized 15 pairs of one-to-one meetings between students to gather user reflections. The user study revealed that the mug has the potential for supporting instant search and affords sufficient subtlety to conceal user actions. Yet, it provoked some anxiety for the users in maintaining eye contact with their partners. Our main contributions are the context-aware mug concept tested in a real-life setting and the analysis through Hornecker and Buur’s Tangible Interaction Framework that discusses its design space, and its impact on the users and social interaction.

    • Tquencer: A Tangible Musical Sequencer Using Overlays

      Martin Kaltenbrunner, University of Art and Design
      Jens Vetter, University of Art and Design

      This article discusses the design and implementation of the Tquencer, a tangible musical sequencer device. The tangible interaction paradigm for musical sequencers has been explored previously through the arrangement of simple tangible tokens within time-based physical constraints. While this initial approach allows for the creation of basic rhythmic or melodic loops, it sometimes lacks the musical depth required for professional live performance. Therefore we introduce several additional physical design elements that allow the development of musical complexity, while maintaining the overall simplicity of tangible interaction. These tangible elements include single-step, multi-step and resizable tokens, which can be arranged to trigger musical events, multi-step sound effects and more complex sequence patterns. Furthermore, the device allows the dynamic assignment of the content and behavior of physical tokens through dedicated configuration tokens. We also introduce tangible overlays, which allow the handling and arrangement of multiple musical configurations in various layers. This token configuration and layer management is facilitated by an additional control step, which allows for the overall tangible configuration of a musical setup. Apart from its primary musical application scenario, the Tquencer also provides a generic tangible controller platform for time-based sequencing applications.

    • Real-time Recognition of Guitar Performance Using Two Sensor Groups for Interactive Lesson

      Yejin Shin, Korea University of Technology and Education
      Jemin Hwang, Korea University of Technology and Education
      Jeonghyeok Park, KOREATECH
      Soonuk Seol, Korea University of Technology and Education

      Accurately recognizing a guitarist’s performance is challenging compared with other instruments, because a guitar player typically plays several notes at once and uses both hands in different ways. In this paper, we propose a sensor-based guitar with two groups of sensors: one group recognizes the fingering positions of the fretting hand, and the other detects the guitar strings played by the picking hand. We design an embedded system for accurate sensing and propose a data analysis mechanism to precisely determine the played pitch and the duration of notes from the sensed data. We realize our scheme as a high-quality prototype that detects a guitarist’s performance with accuracy sufficient for transcribing a performance. We also present real application examples, such as a rhythm game for interactive lessons and a music sharing feature with user-created musical scores.
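      As background on why combining the two sensor groups is enough to determine pitch: once the plucked string and the pressed fret are known, the sounding note follows from the open-string tuning plus the fret offset. The sketch below illustrates this arithmetic for standard tuning; it is only an illustration of the principle, not the authors’ analysis code.

        # Illustration (not the paper's implementation): pitch from string + fret.
        OPEN_STRING_MIDI = {6: 40, 5: 45, 4: 50, 3: 55, 2: 59, 1: 64}  # E2 A2 D3 G3 B3 E4

        def midi_pitch(string: int, fret: int) -> int:
            """MIDI note for a given string (1 = high E, 6 = low E) and fret."""
            return OPEN_STRING_MIDI[string] + fret

        print(midi_pitch(6, 3))  # 43 -> G2: low E string pressed at the 3rd fret
        print(midi_pitch(1, 0))  # 64 -> E4: open high E string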

    • Loominary: Crafting Tangible Artifacts from Player Narrative

      Anne Sullivan, University of Central Florida
      Joshua McCoy, UC Davis
      Sarah Hendricks
      Brittany Williams

      While game narrative provides a story for the player to experience, the moment-to-moment decisions made by the player are just as important to the experience. These decisions make up a personal narrative that the player creates through their choices and actions within the game. These stories describe the player’s experience, and they are the stories that often get shared and retold by the player. However, these narratives are rarely captured by the game; instead, they rely on the player to memorize and retell them. In response to this, we designed Loominary, a game platform that plays Twine games using a rigid heddle table-top loom as a controller. Not only does this provide a new method of interacting with a game, but it also records each of the player’s choices in a tangible object created by interacting with the game. Loominary is a working prototype, and in this paper we discuss the design considerations and areas for improvement.

    • Interactive and Open-Ended Sensory Toys: Designing with Therapists and Children for Tangible and Visual Interaction

      Florian Güldenpfennig, TU Wien
      Peter Fikar, TU Wien
      Roman Ganhör, TU Wien

      Treating neurological conditions like cerebral visual impairment (CVI) and related disabilities is a complex challenge where the needs of the affected persons have to be considered individually. It is also commonly agreed that stimulating the body’s senses, as part of early intervention programs, is a crucial activity in therapy. With this paper, we add to the literature on how tangible and embodied interaction can facilitate such stimulation of the body and provide engaging experiences for children with (multiple) disabilities. Our report entails a detailed description of a co-design process involving early intervention specialists and affected children over the course of six months and multiple prototype iterations. According to our participants, the strengths of the resulting products or therapeutic toys are their open-endedness and versatile applicability, meeting individual needs and making therapeutic sessions both enriching and fun for the children.

    • MOD: A Portable Instrument for Mixing Analog and Digital Drawing for Live Cinema

      Ali Momeni, Carnegie Mellon University

      This paper describes the design and fabrication of MOD (Mobile Object for Drawing), a portable instrument for combining analog and digital drawing. MOD is intended for live performance and content creation efforts that mix common analog drawing interfaces (i.e. paper, transparency, pencil, marker) with digital cameras (webcams, scientific imaging cameras, digital magnifiers and microscopes), custom software (for keying, thresholding, looping, layering) and digital projectors. The iteration of the instrument described here combines all of these components into a single portable, battery-powered package that embeds the computation on a small Linux computer, includes a small laser projector, and integrates custom tactile controllers. The intended uses of this instrument include experimental performance and rapid content creation; the instrument is intended to be suitable for formal (concert hall, theater) and informal (street performance, busking, parade, protest) settings, as well as classrooms and maker spaces.

    • SWAY – Designing for Balance and Posture Awareness

      Simon Asplund, Royal Institute of Technology / KTH
      Martin Jonsson, Södertörn University

      This paper presents the SWAY prototype that encourages people to explore aspects around balance and posture in a playful way. The prototype senses small movements and shifts in posture using a Kinect sensor, and maps these movements to the tilting of a platform holding a set of marbles, and to haptic feedback in the form of vibrations. The prototype provides an interactive experience focusing on building body awareness with a particular focus on balance and posture. The design inquiry provided new insights with respect to reinforcement of bodily experiences and how different modalities affect the guiding of attention.

    • Hard and Soft Tangibles: Mixing Multi-touch and Tangible Interaction in Scientific Poster Scenarios

      Alexandre Gomes de Siqueira, Clemson University, Louisiana State University
      Brygg Ullmer, Clemson University, Louisiana State University
      Chris Branton, Louisiana State University, Drury University
      Mark Delarosa, Louisiana State University, UC Irvine
      Miriam Konkel, Clemson University

      Research on tangible user interfaces commonly focuses on tangible interfaces acting alone or in comparison with multi-touch or graphical interfaces. In contrast, hybrid approaches can be seen as the norm for established “mainstream” interaction paradigms. This paper proposes interfaces that support complementary use of physical and virtual interaction modalities with tangible and multi-touch interactions. We describe prototypes involving capacitively-sensed surfaces combined with laser-cut and 3D-printed dial-like tangibles. We employ them in the context of several computationally-mediated scientific poster scenarios, which we argue to have a number of attractions for such deployments. We present two platforms: one templated, the other highly customizable, supporting both tangible and multi-touch interactions across horizontal and vertical displays. We explore users’ interaction modality choice among the options presented, draw lessons and challenges from posters developed by students, and consider future directions.

  • 15.00 – Work in Progress Session 2

      Time: 15.00 – 16.30, Location: Hyllan

    • Augmented Metacognition: Exploring Pupil Dilation Sonification to Elicit Metacognitive Awareness

      Alwin de Rooij, Tilburg University
      Hanna Schraffenberger, Tilburg University
      Mathijs Bontje, Tilburg University

      Metacognitive awareness enables people to make conscious decisions about their own cognitions, and adapt to meet task performance goals. Despite the role of metacognition in task performance, technologies that effectively augment metacognition are scarce. We explore a novel approach to augment metacognition based on making the eye’s pupil dilations, which associate with a variety of cognitions, audible via sonification in real-time. In this exploratory study, we investigated whether pupil dilation sonification can elicit metacognitive awareness. Our findings suggest that correlations between a variety of cognitions, e.g., attentional focus and depth of thinking, and sounds generated by the sonification can emerge spontaneously and by instruction. This justifies further research into the use of pupil dilation sonification as a means to augment metacognitive abilities.
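      One simple way to picture how such a sonification could work is a direct mapping from measured pupil diameter to an audible pitch, so that dilation is heard as rising tone. The sketch below is a speculative illustration of that mapping only; the ranges are assumptions and it is not the study’s implementation.

        # Speculative sketch: map pupil diameter (mm) to a tone frequency (Hz)
        # by linear scaling, so larger pupils sound higher.
        def pupil_to_frequency(diameter_mm: float,
                               d_min: float = 2.0, d_max: float = 8.0,
                               f_min: float = 220.0, f_max: float = 880.0) -> float:
            d = min(max(diameter_mm, d_min), d_max)      # clamp to a plausible range
            ratio = (d - d_min) / (d_max - d_min)
            return f_min + ratio * (f_max - f_min)

        print(pupil_to_frequency(3.5))  # 385.0 Hz for a mid-small pupil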

    • Designing Bimanual Tangible Interaction for Stroke Survivors

      Mikko Kytö, Aalto University
      David McGookin, Aalto University
      Wilfried Bock, Aalto University
      Héctor Caltenco, Lund University
      Charlotte Magnusson, Lund University

      Stroke is a significant cause of long-term disability, impairing the motor function of over 10 million people every year, primarily on one side of the body. Whilst effective rehabilitation exercises can help recover and maintain some of the affected motor function, stroke survivors often do not carry out enough of these, instead relying on their ‘good’ side to carry out tasks. However, this leads to poor recovery, limiting the ability to carry out everyday bimanual tasks (such as dressing or cooking). We present work that seeks to support stroke survivors in engaging in bimanual rehabilitation through interaction with augmented tangible objects that can be used to control everyday devices. Through a user-centered design process, we uncovered how bimanual rehabilitation can be supported. This led to the development of the ActivSticks device, which allows bimanual rehabilitation and interaction with other devices and services.

    • MacroScope: First-Person Perspective in Physical Scale Models

      Dorothé Smit, University of Salzburg
      Martin Murer, University of Salzburg
      Thomas Grah, University of Salzburg
      Vincent van Rheden, University of Salzburg
      Manfred Tscheligi, University of Salzburg & AIT

      Traditionally, architects and designers have used scale models to explore, communicate, and evaluate concepts and ideas in large-scale, spatial projects. These scale models offer the user a bird’s-eye perspective and often tangible ways of interacting with the model. Nevertheless, they lack a realistic, first-person view of the effects that the design solutions have in the space. In this paper, we explore MacroScope, a tool that aims to support collaborative spatial design by providing a real-time, 360-degree first-person perspective in a physical scale model by means of a virtual reality head-mounted display. We reflect on its potential uses in collaborative creative processes, before describing next steps for the development of MacroScope.

    • SENSE-SEAT: Challenging Disruptions in Shared Workspaces Through a Sensor-Based Seat

      Nils Ehrenberg, M-ITI
      José Luís Silva, ISCTE – Instituto Universitário de Lisboa, ISTAR-IUL, Madeira-ITI
      Pedro Campos, Madeira-ITI, Universidade da Madeira

      Creative industries’ workers are becoming more prominent as countries move towards economies based on intellectual work. Consequently, the workplace needs to be reconfigured so that creativity and productivity can be better promoted in shared workspaces. We report on a study based on diaries, interviews and probes with 8 creative industries’ professionals at a co-working space, with the goal of understanding the advantages and disadvantages of such spaces and the causes of cognitive disruptions. Findings indicate that temperature, noise and coworkers’ requests are the main causes of disruptions in the work process. These insights are used to inform the design process of SENSE-SEAT, a seat with embedded sensors and tangible actuators, as a contribution to reimagining the role of tangible and embedded interaction in intelligent furniture. We are currently at a prototyping stage, with 3D prints and 3D renders; we explain the design process and outline the early results.

    • EMS Painter: Co-creating Visual Art using Electrical Muscle Stimulation

      Ashley Colley, University of Lapland
      Aki Leinonen, University of Lapland
      Meri-Tuulia Forsman, University of Lapland
      Jonna Häkkilä, University of Lapland

      We present a work in progress that enables an audience to influence a painter through electrical muscle stimulation (EMS) in order to co-create visual artworks. Two sets of EMS pads are positioned on the painter’s arm such that stimulation pulses, triggered on a collocated tablet interface by audience members, cause deviations in the ongoing brushstroke. LEDs co-located with the EMS pads provide additional visual feedback to the audience on stimulation delivery. In an initial trial, rather than focusing on co-creation, the audience amused themselves by mischievously diverting the painter from his work.

    • Mobile, Exercise-agnostic, Sensor-based Serious Games for Physical Rehabilitation at Home

      Ana Vasconcelos, Fraunhofer Portugal AICOS
      Francisco Nunes, Fraunhofer Portugal AICOS
      Alberto Carvalho, Fraunhofer Portugal AICOS
      Catarina Correia, Fraunhofer Portugal AICOS

      Serious games can improve the physical rehabilitation of patients with different conditions. By monitoring exercises and offering feedback, serious games promote the correct execution of exercises outside the clinic. Nevertheless, existing serious games are limited to specific exercises, which reduces their practical impact. This paper describes the design of three exercise-agnostic games, that can be used for a multitude of rehabilitation scenarios. The developed games are displayed on a smartphone and are controlled by a wearable device, containing inertial and electromyography sensors. Results from a preliminary evaluation with 10 users are discussed, together with plans for future work.

    • Feeling Virtual Worlds: An Exploration into Coupling Virtual and Kinaesthetic Experiences

      Joey Campbell, University of Bristol
      Trevor Hogan, Cork Institute of Technology
      Mike Fraser, University of Bristol

      In this paper we describe an exploratory study that incorporates the design, implementation and study of a system that utilises virtual reality, tangible interaction and force feedback. Our approach is to design a VR system that incorporates a moveable tangible interface (a wheelchair), which overlaps seamlessly with a 3D counterpart in the virtual world. The user interacts with the virtual environment by pushing the physical wheelchair, which simultaneously controls the virtual avatar. In the virtual world we place objects that, once collided with, trigger force feedback by stopping the physical wheelchair. We discuss the design rationale and technical implementation, and conclude by describing the next phase of this work in progress.

    • Your Body of Water: A Display that Visualizes Aesthetic Heart Rate Data from a 3D Camera

      Lee Jones, Carleton University
      Paula Gardner
      Nick Puckett

      We present the design and implementation of Your Body of Water, a display that wirelessly gathers heart rate data using a 3D camera and then visualizes the viewer’s heart rate as water. As heart rate goes up the water gets livelier (with larger and faster waves) and as heart rate goes down the water gets calmer. The purpose of the display is to use aesthetic biofeedback data to help participants reflect on their felt bodily experience. The device went through system critique using somaesthetic appreciation design heuristics, and we describe the design themes that arose from those critiques.
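      The heart-rate-to-water mapping described above can be pictured as a simple scaling of wave parameters with the measured beats per minute. The following sketch only illustrates that idea; the parameter ranges are assumptions and it is not the installation’s code.

        # Illustrative mapping (assumptions, not the installation's code):
        # higher heart rate -> larger and faster waves.
        def heart_rate_to_waves(bpm: float,
                                bpm_rest: float = 50.0, bpm_max: float = 120.0):
            """Return (amplitude, speed) scaling factors for a water simulation."""
            level = (min(max(bpm, bpm_rest), bpm_max) - bpm_rest) / (bpm_max - bpm_rest)
            amplitude = 0.2 + 0.8 * level   # calm baseline swell even at rest
            speed = 0.5 + 1.5 * level       # waves travel faster as heart rate rises
            return amplitude, speed

        print(heart_rate_to_waves(60))   # (~0.31, ~0.71): gentle water
        print(heart_rate_to_waves(110))  # (~0.89, ~1.79): lively water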

    • Charged Utopia VR: Exploring Embodied Sense-making in the Virtual Space

      Rosa van der Veen, RISE Interactive Umeå, Eindhoven University of Technology
      Jeroen Peeters, RISE Interactive
      Ambra Trotto, Umeå University, RISE Interactive

      This paper reports on preliminary results of a design research project that explores how spaces in virtual reality may be designed to build on qualities of embodied sense-making. The project forms a basis for the exploration of an ethical dimension to interactions in virtual reality. This publication focuses on identifying qualities of embodied sense-making in an existing physical space, the interactive exhibition Charged Utopia, and transposing these qualities into a virtual interactive space. The translation of the qualities is done through three main themes: Physical Movement, Resistance and Ambiguity. We present the design research process to describe how these themes were identified and transposed. We conclude with reflections that sketch ways in which we might capitalise on the opportunities offered by a virtual space, while respecting human skills in embodied sense-making.

    • Designing the Interaction with the Internet of Tangible Things: a Card Set

      Leonardo Angelini, University of Applied Sciences and Arts Western Switzerland (HES-SO)
      Elena Mugellini, University of Applied Sciences and Arts Western Switzerland (HES-SO)
      Nadine Couture, ESTIA
      Omar Abou Khaled, University of Applied Sciences and Arts Western Switzerland (HES-SO)

      Current interactions for the Internet of Things are often constrained behind a screen. With the Internet of Tangible Things (IoTT) we aim at promoting the design of richer interactions, embodied in physical IoT objects. To this purpose, we propose a card set for the design of tangible interaction with IoT objects, which contains 8 cards for tangible interaction properties and 8 for IoT properties, in order to explore how tangible properties can be exploited for enhancing the interaction with IoT objects. We tested the card set in a dedicated workshop, observing that participants were able to explore most of the tangible and IoT properties. To complement the IoT card set, a hardware prototyping toolkit with examples for each of the 8 tangible properties is currently under development.

    • Sew-Flow: A Craft Interpretation of a Digital Game

      Kinneret Itzhak
      Kamila Kantek
      Dafna Levi
      Shoval Nir
      Shachar Geiger
      Michal Rinott

      We present “Sew-Flow”, a craft interpretation of the digital game “Flow Free”, which is played on smartphones and computers. In Sew-Flow the screen is replaced by a fabric interface that is played through a process of sewing with conductive yarn. Feedback on the game play is given by thermochromic colors that gradually appear. We lay out the motivations for slowing down the game experience and moving it into the material world. The crafting of the game elements and the game experience are described in detail, as well as the technological implementation. Insights from initial trials are described, showing that children and adults perceived the game in different ways. Future directions involve solving a number of challenges involving the thermochromic materials and conductive yarn.

    • Full Body Interaction beyond Fun: Engaging Museum Visitors in Human-Data Interaction

      Swati Mishra, Indiana University-Purdue University Indianapolis
      Francesco Cafaro, Indiana University-Purdue University Indianapolis

      Engaging museum visitors in data exploration using full-body interaction is still a challenge. In this paper, we explore four strategies for providing entry-points to the interaction: instrumenting the floor; forcing collaboration; implementing multiple body movements to control the same effect; and, visualizing the visitors’ silhouette beside the data visualization. We discuss preliminary results of an in-situ study with 56 museum visitors at Discovery Place, and provide design recommendations for crafting engaging Human-Data Interaction experiences.

    • Prototyping and Simulating Complex Systems with Paper Craft and Augmented Reality: An Initial Investigation

      Seokbin Kang
      Leyla Norooz
      Virginia Byrne
      Tamara Clegg
      Jon Froehlich

      We present early work developing an Augmented Reality (AR) system that allows young children to design and experiment with complex systems (e.g., bicycle gears, human circulatory system). Our novel approach combines low-fidelity prototyping to help children represent creative ideas, AR visualization to scaffold iterative design, and virtual simulation to support personalized experiments. To evaluate our approach, we conducted an exploratory study with eight children (ages 8-11) using an initial prototype. Our findings demonstrate the viability of our approach, uncover usability challenges, and suggest opportunities for future work. We also distill additional design implications from a follow-up participatory design session with children.

    • Flyables: Exploring 3D Interaction Spaces for Levitating Tangibles

      Pascal Knierim, Human-Centered Ubiquitous Media
      Thomas Kosch, Human-Centered Ubiquitous Media
      Alexander Achberger, Institute for Visualization and Interactive Systems
      Markus Funk, TU Darmstadt

      Recent advances in technology and miniaturization allow the building of self-levitating tangible interfaces. This includes flying tangibles, which extend the mid-air interaction space from 2D to 3D. While a number of theoretical concepts for interaction with levitating tangibles have previously been investigated, user-centered evaluation of the proposed interaction modalities has attracted only minor attention in prior research. We present Flyables, a system that adjusts flying tangibles in 3D space to enable interaction between users and levitating tangibles. Interaction concepts were evaluated in a user study (N = 17), using quadcopters as operable levitating tangibles. Three different interaction modalities were evaluated to collect quantitative data and qualitative feedback. Our findings show which interaction modalities users prefer when using Flyables. We conclude our work with a discussion and directions for future research within the domain of human-drone interaction.

    • TUIst: A Collaborative and Computationally Enhanced Game Board

      Trisha Garcia, Wellesley College
      Shannon Brown, Wellesley College
      Siona Dev, Wellesley College

      TUIst is an interactive game board that aims to improve collaboration and team building. Based on the familiar and popular game of Twister, TUIst is an interface of multiple circles with lights and sounds mapped to sensors, creating an audible and visual reaction when an object (most likely a foot or hand) is placed on the corresponding circle. The game controller allows for audience intervention and participation: the audience will be able to change the rules of how the game is played (for example, making players place their right hand on any red circle), which increases the difficulty of the game. Our hope is to have this board in popular and busy spaces such as campus centers. Our goal with this surface is to create an interface that helps foster engagement in communities.

    • Crimson Wave: Shining a Light on Menstrual Health

      Margaret Flemings, Wellesley College
      Shanzay Kazmi, Wellesley College
      Rachel Pak, Wellesley College
      Orit Shaer, Wellesley College

      Crimson Wave is a sensor-based wearable that generates information on its user’s menstrual cycle. Basal Body Temperature (BBT) is an individual’s resting temperature, which fluctuates slightly depending on the individual’s current menstrual stage. Crimson Wave tracks the user’s BBT through the wearable and then visually displays the data on a separate smart mirror. Specifically, the mirror lights up a corresponding color depending on where the user is in their menstrual cycle. Crimson Wave offers a novel method of keeping track of one’s health and integrating menstrual data seamlessly into daily life. The combination of live and aggregated personalized data helps optimize the day-to-day lives of those experiencing menstrual cycles. This project is helpful for people who are interested in being more informed about their cycles, especially when it is irregular. In this paper, we describe the concept behind Crimson Wave, as well as its implementation and iteration.
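      As a rough illustration of the kind of logic that can drive such a display: basal body temperature typically rises by a few tenths of a degree Celsius after ovulation, so a naive classifier can choose a mirror color from the current reading relative to a personal baseline. The simplified sketch below is hypothetical (thresholds and colors are assumptions), not the Crimson Wave algorithm.

        # Simplified, hypothetical sketch: pick a mirror color from basal body
        # temperature (BBT) relative to a personal follicular-phase baseline.
        def phase_color(bbt_c: float, baseline_c: float, rise_threshold_c: float = 0.3):
            """Return (phase, rgb) for a smart-mirror light based on BBT in Celsius."""
            if bbt_c >= baseline_c + rise_threshold_c:
                return "luteal (post-ovulation)", (255, 140, 0)   # warm orange
            return "follicular (pre-ovulation)", (0, 120, 255)    # cool blue

        baseline = 36.4
        print(phase_color(36.5, baseline))  # follicular, cool blue
        print(phase_color(36.8, baseline))  # luteal, warm orange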

  • 18.30 – Art Reception & Conference Dinner

      Time: 18.30 – 23.00, Location: Kulturhuset

    • Digitally Enchanted Wear: a Novel Approach in the Field of Dresses as Dynamic Digital Displays

      Rebecca Kleinberger, MIT
      Alisha Panjwani

      We introduce the term Digital Dresses as Dynamic Displays as an emergent field in the domains of Wearable Computing and Embodied Interactions. This recent approach consists of turning clothing into visual – and sometimes audiovisual – displays to enable novel forms of interaction between the wearer, the viewer, the tangible clothing and the embedded content. In this context, we present Enchanted Wearable, a new, optimized, low-cost approach to creating Digital Dresses as Dynamic Displays. Enchanted Wearable is a technologically embellished and augmented garment containing a portable rear dome projection system that transforms the clothing fabric into a blank canvas displaying audiovisual content. With this system, we create a new form of expression through clothing to reflect the identity, personality, emotions and inner states of the wearer. In this paper we first present the growing field of Digital Dresses as Dynamic Displays, then we survey and analyse prior art in this field using a specific list of characteristics: display technology, wearability, interactivity, brightness, and context. Finally, we present the design and technology behind our new Enchanted Wearable system and explain how it brings new perspectives to the field.

    • [pain]Byte: Chronic Pain And Biomedical Engineering Through The Lens Of VR

      Genevieve Smith-Nunes, ReadySaltedCode CIC, University of Roehampton
      Camilla Neale, University of Worcester
      Alex Shaw, GlastonBridge Software

      [pain]Byte looks at the world of chronic pain: the invisible disability of chronic spinal pain, manifested and represented through data-driven dance (classical ballet) and virtual reality (VR). It enables the non-sufferer audience to ‘see’ the hidden nature and challenges of chronic pain, linked to the benefits of biomedical engineering and implanted technology. The body as analogue is represented through the digital of the wearables and the virtual of the VR experience, humanising implanted technology and exposing the invisible nature of chronic pain for audiences. In our exhibit, people can watch the VR and interact with the biometric sensors and our single-Kinect motion capture. A recording of the ballet will be projected.

    • The Cloakroom

      Ella Dagan, University of California Santa Cruz, New York University

      This paper presents the concept and instances of implementation of a documentary embodied art installation named ‘The Cloakroom’. The Cloakroom is an interactive aesthetic experience which is made out of multiple interpersonal relationship stories and their connection to objects. People are invited to embody a relationship by literally donning a jacket and going through the motions of finding things in its pockets. The objects they find are then used as triggers to play pre-recorded stories, bringing analogue artifacts to meet the digital content. The use of pockets highlights the physical intersection between tangibles and wearables.

    • Nettle: An Exploration of Communication Interface Design for Older Adults

      Audrey Fox, Parsons School of Design

      Telecommunication with family and friends is often offered as a solution for aging adults facing social isolation. While strengthening existing ties is important, it fails to address the importance of spontaneous community interactions. This paper presents Nettle, a system that is designed to build casual human connection into one’s daily routine. Nettle is based on the artist’s alternate vision of smart home design where interfaces are playful, based on familiar household forms and warmly inviting. The audience will observe a performance where Nettle fosters a spontaneous spoken conversation for an older woman alongside the process of her making a pot of tea.

    • The Shared Individual

      Asreen Rostami, Stockholm University
      Emma Bexell, Bombina Bombast
      Stefan Stanisic, Bombina Bombast

      The Shared Individual is a live collaborative Mixed-Reality Performance in which a group of audience members can observe themselves through an individual’s point of view. In this performance, a performer shares her view with audience members by wearing a head-mounted camera and streaming live video. By wearing head-mounted displays, audience members can see themselves and follow the performer’s instructions to “occupy” her body and become her. This instruction, in the form of performance, is designed to help the audience sync with the performer in three stages: visual synchronization, physical synchronization and emotional synchronization.

    • Au Clair de la Lune on Gramophone – For Édouard-Léon Scott and László Moholy-Nagy – (1860 / 1923 / 2015)

      Kazuhiro Jo, YCAM, Kyushu University

      “Au Clair de la Lune on Gramophone – For Édouard-Léon Scott and László Moholy-Nagy – (1860/1923/2015)” is a work that realizes the provocative idea of Moholy-Nagy, “a record without prior acoustic information” (1923), with the help of mature vinyl audio recording technology and current personal fabrication tools. This paper explains the body of the work as well as its background and characteristics, with references.

    • Objektivisering: Text Physicalization and Self-introspective Post-digital Objecthood

      Marinos Koutsomichalis, Norwegian University of Science and Technology

      Objektivisering is an experimental system for the computational generation of 3D-printable models. The system pivots on Natural Language Understanding technologies and big 3D data. It processes arbitrary text input, generating an array of words and phrases that summarise its meaning. These are then used as queries to retrieve 3D data from Thingiverse. Finally, the resulting bag of models is concatenated, producing original 3D-printable designs in a computational fashion. Although concretely physical, the resulting artefacts also incorporate the cybernetic encodings of their own making. In this way, they celebrate a kind of objecthood that is hybrid, post-digital, and self-introspective at a structural level, concretely accelerating materialist aesthetics.
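
      Editorial note: the pipeline described above (summarise text, query a 3D-model repository, concatenate the retrieved meshes) can be illustrated with a minimal Python sketch. The keyword extraction, the fetch_models helper and the repository lookup are placeholders of our own, not the authors' implementation; only the mesh concatenation relies on a real library call (trimesh.util.concatenate).

      # Illustrative sketch of a text -> keywords -> meshes -> single model pipeline.
      # fetch_models() is a hypothetical stand-in for querying a repository such as
      # Thingiverse; it is NOT the authors' code.
      from collections import Counter
      import trimesh  # real library: pip install trimesh

      STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is", "it", "that"}

      def summarise(text: str, k: int = 5) -> list[str]:
          """Crude keyword extraction: the k most frequent non-stopword tokens."""
          tokens = [t.strip(".,;:!?").lower() for t in text.split()]
          counts = Counter(t for t in tokens if t and t not in STOPWORDS)
          return [word for word, _ in counts.most_common(k)]

      def fetch_models(keyword: str) -> list[trimesh.Trimesh]:
          """Hypothetical repository lookup; replace with a real API client."""
          raise NotImplementedError("query your 3D-model repository here")

      def objectify(text: str) -> trimesh.Trimesh:
          meshes = []
          for keyword in summarise(text):
              meshes.extend(fetch_models(keyword))
          # Concatenate the retrieved meshes into one printable object.
          return trimesh.util.concatenate(meshes)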

    • Deep Wear: a case study of collaborative design between Human and Artificial Intelligence

      Natsumi Kato, University of Tsukuba
      Hiroyuki Osone, University of Tsukuba
      Daitetsu Sato, University of Tsukuba
      Naoya Muramatsu, University of Tsukuba
      Yoichi Ochiai, University of Tsukuba

      Deep neural network (DNN) applications are now increasingly pervasive and powerful. However, fashion designers are lagging behind in leveraging this increasingly common technology, and DNNs are not yet a standard part of fashion design practice, either for clothes patterns or as prototyping tools. In this paper, we present DeepWear, a method that uses deep convolutional generative adversarial networks for clothes design. The DNNs learn the features of a specific brand's clothes and generate images; patterns are then drafted from the generated images, and an author creates clothes based on them. We evaluated this system by comparing the perceived credibility of clothes actually sold on the market with our clothes. As a result, we found it is possible to make clothes that look like actual products from the generated images. Our findings have implications for collaborative design between machine and human intelligence.
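
      Editorial note: for readers unfamiliar with the generator half of a deep convolutional GAN (DCGAN), the PyTorch sketch below shows the standard architecture the abstract refers to. The layer sizes and image resolution are generic defaults, not the values used in DeepWear.

      # Minimal DCGAN generator in PyTorch (generic sizes, not DeepWear's own).
      import torch
      import torch.nn as nn

      class Generator(nn.Module):
          def __init__(self, z_dim: int = 100, feat: int = 64, channels: int = 3):
              super().__init__()
              self.net = nn.Sequential(
                  # latent vector z -> 4x4 feature map
                  nn.ConvTranspose2d(z_dim, feat * 8, 4, 1, 0, bias=False),
                  nn.BatchNorm2d(feat * 8), nn.ReLU(True),
                  nn.ConvTranspose2d(feat * 8, feat * 4, 4, 2, 1, bias=False),
                  nn.BatchNorm2d(feat * 4), nn.ReLU(True),
                  nn.ConvTranspose2d(feat * 4, feat * 2, 4, 2, 1, bias=False),
                  nn.BatchNorm2d(feat * 2), nn.ReLU(True),
                  nn.ConvTranspose2d(feat * 2, feat, 4, 2, 1, bias=False),
                  nn.BatchNorm2d(feat), nn.ReLU(True),
                  # -> 64x64 RGB image of a garment-like texture
                  nn.ConvTranspose2d(feat, channels, 4, 2, 1, bias=False),
                  nn.Tanh(),
              )

          def forward(self, z: torch.Tensor) -> torch.Tensor:
              return self.net(z)

      # Sampling: 16 candidate "designs" from random latent codes.
      g = Generator()
      images = g(torch.randn(16, 100, 1, 1))  # shape: (16, 3, 64, 64)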

    • Op 1254: Music for Neutrons, Networks and Solenoids using a Restored Organ in a Nuclear Reactor

      Leif Handberg, KTH Royal Institute of Technology
      Ludvig Elblaus, KTH Royal Institute of Technology
      Chris Chafe, Stanford University

      In this paper, an installation is presented that connects Stanford and Stockholm through a one-of-a-kind combination of instrument and venue: the Skandia Wurlitzer theatre organ (Wurlitzer serial no.1254) situated in the KTH R1 Experimental Performance Space, a disused nuclear reactor. A continuous stream of musical data, audio, and video between the two places explored the capabilities of the digital to play with the concept of presence and embodiment, virtuality and the physical. In the installation, a series of performances presented new pieces written especially for this setting. The pieces were performed by musicians in Stanford, mediated in real-time, allowing them to play together with the theatre organ in Stockholm, temporarily fusing the two venues to create one ensemble, one audience, in one space.

    • Eidolon360 – A VR experience

      Beverley Hood, University of Edinburgh
      Tom Flint, Edinburgh Napier University

      Eidolon360 is a virtual reality artwork and experience that is interacted with through VR headsets. The viewer, reclining on a bed within the exhibition space, experiences a 360° film shot within a medical simulation centre that mimics clinical hospital locations, such as operating theatres and hospital wards. The reclining viewer inhabits the point of view of the resuscitation manikin Resusci Anne, set within a resuscitation training room. A medic (actress Pauline Goldsmith) approaches Resusci Anne and tenderly recounts her origin story: an intriguing tale of a mysterious drowned young woman, found in Paris in the late 1880s, who became the face of CPR (cardiopulmonary resuscitation) as Resusci Anne and has since been revived by over 300 million people worldwide. The film attempts to present an emotionally resonant anecdote as an immersive experience, scrutinizing the overlaps between real life and simulation.

    • Embodisuit: A Wearable Platform for Embodied Knowledge

      Sophia Brueckner, University of Michigan
      Rachel Freire, Rachel Freire Studio

      The Embodisuit allows its wearer to map signals onto different places on their body. Informed by embodied cognition, the suit receives signals from an IoT platform, and each signal controls a different haptic actuator on the body. Knowledge is experienced ambiently, without necessitating the interpretation of symbols by the conscious mind. The suit empowers wearers to reconfigure the boundaries of their selves, strengthening their connection to the people, places, and things that are meaningful to them. It both critiques and offers an alternative to current trends in wearable technology: most wearables harvest data from their users to be sent and processed elsewhere, whereas the Embodisuit flips this paradigm so that data is taken in through the body instead. Furthermore, we believe that changing the way people live with data will change the type of data that people create.
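
      Editorial note: one plausible way to wire an IoT feed to body-mapped actuators, roughly in the spirit the abstract describes, is an MQTT subscription whose topics map to actuator channels. The topic names, broker address and drive_actuator stub below are illustrative assumptions, not the Embodisuit's actual platform.

      # Illustrative sketch: mapping IoT topics to haptic actuators via MQTT.
      # Topic names, broker address, and drive_actuator() are assumptions.
      import paho.mqtt.client as mqtt

      TOPIC_TO_ACTUATOR = {
          "home/front_door": "left_shoulder",
          "weather/wind_speed": "upper_back",
          "family/heart_rate": "chest",
      }

      def drive_actuator(location: str, intensity: float) -> None:
          """Hypothetical hardware driver: set vibration strength at a body site."""
          print(f"{location}: {intensity:.2f}")

      def on_message(client, userdata, msg):
          location = TOPIC_TO_ACTUATOR.get(msg.topic)
          if location is not None:
              value = float(msg.payload.decode())        # assume numeric payloads
              drive_actuator(location, min(value / 100.0, 1.0))

      client = mqtt.Client()                             # paho-mqtt 1.x-style client
      client.on_message = on_message
      client.connect("broker.example.org")               # placeholder broker
      for topic in TOPIC_TO_ACTUATOR:
          client.subscribe(topic)
      client.loop_forever()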

    • The Bronze Key: Performing Data Encryption

      Susan Kozel, Malmö University
      Ruth Gibson, Coventry University
      Bruno Martelli, Independent Artist

      The Bronze Key art installation is the result of performative re-materialisations of bodily data. This collaborative experiment in data encryption expands research into practices of archiving and critical discourses around open data. It integrates bodily movement, motion capture and Virtual Reality (VR) with a critical awareness of data trails and data protection. A symmetric cryptosystem was enacted producing a post-digital cipher system, along with archival artefacts of the encryption process. Material components for inclusion in the TEI Arts Track include: an audio file of text to speech of the raw motion capture data from the original movement sequence on cassette tape (The Plaintext), a 3D printed bronze shape produced from a motion captured gesture (The Encryption Key), and a printed book containing the scrambled motion capture data (The Ciphertext).
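
      Editorial note: as a concrete reminder of what a symmetric cryptosystem does to such data, the sketch below encrypts a stand-in motion-capture text dump with a symmetric key using the Python cryptography library. The sample data is invented, and the piece's actual cipher system is not specified in the abstract.

      # Minimal symmetric encryption of motion-capture data (illustrative only;
      # the artwork's own cipher system is not documented here).
      from cryptography.fernet import Fernet

      # Stand-in for the raw motion-capture text ("The Plaintext").
      plaintext = b"frame 0001: hip 0.021 -0.480 1.250; spine 0.018 -0.122 1.530"

      key = Fernet.generate_key()              # the symmetric key ("The Encryption Key")
      cipher = Fernet(key)

      ciphertext = cipher.encrypt(plaintext)   # the scrambled data ("The Ciphertext")

      # Only the holder of the same key can recover the original movement data.
      assert cipher.decrypt(ciphertext) == plaintext
      print(ciphertext[:40], b"...")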

    • Tracking, Animating, and 3D Printing Elements of the Fine Arts Freehand Drawing Process

      Piyum Fernando, SANDS Group at the School of Arts, Media and Engineering
      Jennifer Weiler, Arizona State University
      Stacey Kuznetsov, Arizona State University
      Pavan Turaga, Arizona State University

      Dynamic elements of traditional drawing processes, such as the order of compilation and the speed, length, and pressure of strokes, can be as important as the final art piece because they can reveal the technique, process, and emotions of the artist. In this paper, we present an interactive system that unobtrusively tracks the freehand drawing process (the movement and pressure of the artist’s pencil) on a regular easel. The system outputs the captured information as 2D video renderings and 3D-printed sculptures. We also present a summary of findings from a user study with six experienced artists who created multiple pencil drawings using our system. The resulting digital and physical outputs revealed vast differences in drawing speeds, styles, and techniques. At the TEI Arts Track, we expect attendees to engage in lively discussion around the analog, digital, and tangible aspects of our exhibit. We believe that such a discussion will be critical not only in shaping the future of our work, but also in understanding novel research directions at the intersection of art and computation.

    • The Screaming Sun

      Mithru Vigneshwara, New York University

      Digital devices usually contain pre-programmed constraints and behaviours, defined by the architect of the device. Analogue devices, on the other hand, are bound by the properties of the components connected to them; controlling these components and external physical properties will in turn control the output of the device. Many devices are built to perceive physical phenomena and react to certain stimuli. The Screaming Sun is an analogue noise-making instrument that experiments with light as a stimulus. It is a solar-powered interactive artifact that is meant to be played with and explored. The instrument, much like a living being, thrives on light. The performer plays the instrument by influencing the physical properties of the light that shines on it, not by pushing buttons or turning knobs.

    • Twinkle: A Flying Lighting Companion for Urban Safety

      Honghao Deng, Harvard University
      Jiabao Li, Harvard University
      Allen Sayegh, Harvard University
      Sebastian Birolini, University of Bergamo
      Stefano Andreani, Harvard University

      Current city lighting systems leave many areas uncovered, which induces perceptions of unsafety and facilitates crime. Adding ubiquitous surveillance intrudes on privacy and cannot take real-time action. Cold, lifeless light shines in the darkness, trapping people in the solitude of silence. These absences motivated us to create Twinkle: luminous, transformative creatures that inhabit light posts. They are curious aerial animals attracted by human activity. During the daytime, they rest on urban light posts, expanding their solar panels to charge. At night, they interact with individuals walking on the street in their own way, based on their distinct personalities. Twinkles are an indirect approach to improving urban safety without surveillance. We envisage a future in which appliances go beyond machines and become our companions.

    • Synesthetic Experience in STRATIC

      Vygandas Simbelis, KTH – Royal Institute of Technology
      Anders Lundström, KTH

      How do we humanize digital interactive technology? One way is through our experience with technology. With S T R A T I C we present several post-digital concepts to discuss the relationship of the digital to our human lives. We emphasize the synesthetic experience, along with other aesthetic experiences and materiality issues, through manifestations of the digital in the physical world, tangible approaches to sonic performance, and the exposure of the internal logics of technological processes.
      In this paper, we propose exhibiting our work both as an art installation and via a live performance. We regard it as highly relevant to the topic of the TEI Arts Track exhibition: post-digital materiality at the intersection of the analog and the digital, and its tangible aspects.

    • Thou and I: Exploring Expressive Digital Interaction with Interactive Characteristic Wigs

      Young Suk Lee, Indiana University

      My project involves the construction of a series of experimental wearable digital technologies as expressive wigs. These wigs are designed to explore the interactions of the wearers and the spectators as the wig moves in elaborate ways to highlight the realms of aesthetic value and sensory experience. The participants are provoked, they are engaged, they are compelled into a state of awareness and curiosity while the digital object constitutes novel experiences that coincide with anthropomorphic interactivity. Furthermore, the expressive value of a digital object is turned inwards as it becomes an extension of the self and a source of multi-sensory communication. Ultimately, the digital object seeks to open a social dialogue on the aesthetic purpose of technology by encouraging people to extend their imagination through bodily interaction. Here the digital object enables the wearers to create engaging narratives and provocative expressions.

    • Beacon: Exploring Physicality in Digital Performance

      Anna Weisling, Georgia Institute of Technology
      Anna Xambó, Queen Mary University of London

      Live performances which involve digital technology often strive toward clear correspondences between distinct media modes, particularly those works which combine audio and video. Often, the process of creating and executing such performances involves mapping schemes which are encased within the digital system, producing content which is tightly synchronized but with relationships which can feel rigid and unexpressive. Within this paper we present a collaborative process between visualist and musician, which builds toward a method for promoting co-creativity in multimedia performance and prioritizes the performer’s physical presence and interaction with digital content. Through the development of two autonomous systems, a novel physical interface and an interactive music system, we summarize our creative process of co-exploration of system capabilities, and extended periods of experimentation and exploration. From this experience, we offer an early-stage framework for approaching engaging digital audiovisual relationships in live performance settings.

    • Marching Cubes Made Tangible

      Jesse Jackson, University of California, Irvine

      Drawing inspiration from a computer algorithm of the same name, Marching Cubes Made Tangible leverages 3D printing to make the virtual world physical. In the 1980s, researchers devised an algorithm for generating computer graphics from medical scan data that featured an underlying language of faceted cubes. Marching Cubes Made Tangible translates this virtual procedure into interactive installations, which are assembled from a modular set of 3D printed components. By enacting the algorithm in the real world, this project generates dialogue about the ways in which information technologies create the building blocks of contemporary culture.
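
      Editorial note: for context, the original algorithm extracts a triangle mesh from volumetric (e.g. medical scan) data. The sketch below does this with the scikit-image implementation on a synthetic volume; it illustrates the algorithm the installation references, not the artist's fabrication workflow.

      # Extracting an isosurface from volume data with marching cubes
      # (scikit-image implementation; synthetic sphere volume as example data).
      import numpy as np
      from skimage import measure

      # Synthetic scalar field: distance from the centre of a 64^3 volume.
      x, y, z = np.mgrid[-32:32, -32:32, -32:32]
      volume = np.sqrt(x**2 + y**2 + z**2)

      # Vertices and triangular faces of the isosurface at radius 20.
      verts, faces, normals, values = measure.marching_cubes(volume, level=20.0)
      print(verts.shape, faces.shape)   # (N, 3) vertices, (M, 3) faces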

    • SKIN – Embodied Navigation through WiFi Traffic using Vibro-Tactile Feedback

      Bas van den Boogaard
      Vibeke Thorhauge Stephensen
      Louise Ørsted Jensen
      Karina Lindegaard Aae Jensen
      Stefan Engelbrecht Nielsen
      Markus Löchtefeld

      In today’s world, radio waves are an integral element of our daily lives, yet they are mostly hidden from human perception. In this paper we present SKIN, a wireless wearable system that creates a new interface between the human body and WiFi signals through sensory augmentation. SKIN translates nearby WiFi data transmissions into a vibrotactile experience to alter the perception of the space around the human body. It is designed as a tool for examining how an embodied experience of these invisible waves affects our relation both to the physical environment and to the technology emitting this insensible dimension. We present the design and implementation, as well as the results of two exhibitions in which SKIN was worn by 17 participants, demonstrating how SKIN helps to interpret, and alters the perception of, the built environment.
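
      Editorial note: a rough sense of the mapping described above (nearby WiFi traffic to vibration intensity) is given by the sketch below, which counts 802.11 frames sniffed with scapy per time window and scales the count to a vibration level. The interface name and the set_vibration stub are assumptions, and monitor-mode capture requires suitable hardware and privileges.

      # Illustrative mapping of sniffed WiFi traffic to a vibration level.
      # Interface name and set_vibration() are placeholders; requires a WiFi
      # adapter in monitor mode and root privileges.
      from scapy.all import sniff, Dot11

      WINDOW_SECONDS = 1.0
      MAX_FRAMES = 200          # frame count that saturates the actuator

      def set_vibration(level: float) -> None:
          """Hypothetical actuator driver: 0.0 (off) to 1.0 (full strength)."""
          print(f"vibration: {level:.2f}")

      while True:
          # Capture 802.11 frames for one window, then map count -> intensity.
          frames = sniff(iface="wlan0mon", timeout=WINDOW_SECONDS,
                         lfilter=lambda pkt: pkt.haslayer(Dot11))
          set_vibration(min(len(frames) / MAX_FRAMES, 1.0))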

    • I Want To: Interactive Installation for Understanding our Desire

      Laewoo Kang, Cornell University

      ‘I Want To’ is an interactive installation controlled by live Twitter messages. One hundred custom designed wooden toys, a television screen and speakers comprise the installation. The system extracts public Twitter messages that start with ‘I want to.’ The expression ‘want to’ becomes ‘have to’, and the newly composed sentence is displayed on the television screen while also being vocalized through speakers. With each “I have to” phrase, the wooden toys respond by marching in unison. This installation gives the audience an opportunity to explore our hopes and desires as unconscious internalizations of external expectations and social norms.
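
      Editorial note: the core text transformation is simple enough to show directly: keep tweets that begin with “I want to” and reword them as “I have to”. The tweet source below is stubbed out with invented examples, since the live Twitter stream is outside the scope of this sketch.

      # The 'I want to' -> 'I have to' rewrite at the heart of the installation
      # (our own illustrative reconstruction; the live Twitter stream is stubbed).
      PREFIX = "i want to"

      def rewrite(tweet: str):
          """Return the reworded sentence, or None if the tweet doesn't qualify."""
          if tweet.lower().startswith(PREFIX):
              return "I have to" + tweet[len(PREFIX):]
          return None

      def incoming_tweets():
          """Placeholder for a live Twitter stream."""
          yield "I want to sleep for a week"
          yield "I want to quit my job and travel"

      for tweet in incoming_tweets():
          sentence = rewrite(tweet)
          if sentence:
              print(sentence)    # in the installation: display, speak, march the toys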

    • Rewilding Wearables – Sympoeitic Interfaces for Empathic Experience of Other-than-human Entities

      Patricia Flanagan
      Raune Frankjaer

      This project involves a series of walks in rewilded environments mediated by a wearable interface that enables the interlocutor to perceive the environment from an alien perspective. The aim is to foster empathy for other-than-human entities and promulgate holistic and biodiverse ecologies. Technocrafting the prosthetic device from organic and electronic materials, blending traditional with digital techniques, creates devices that the authors term ‘cyborganic’. The focus of this paper is a device that sits as if grafted around the human head and appears to come to life, embodied with its own sense of ‘agency’. This paper describes the first- and second-generation prototypes of this device and its current configuration as an aid for the empathetic experience of insects in rewilded spaces. To conclude, we describe a testing methodology developed in Aarhus, based on a series of walks after which users engage in semi-structured interviews to evaluate their experience.

    • Code{strata} Sonifying Software Complexity

      Denez Thomas
      Nicolas Harrand
      Bruno Bossis
      Benoit Baudry

      Code{strata} is an interdisciplinary collaboration between art studies researchers (Rennes 2) and computer scientists (INRIA, KTH). It is a sound installation: a computer system unit made of concrete that sits on a wooden desk. The purpose of this project is to question the opacity and apparent simplicity of the high-level interfaces used in daily gestures. It takes the form of a 3D sonification of a full software trace collected while performing a copy-and-paste command in a simple text editor. The user may hear, through headphones, a poetic interpretation of what happens in a computer behind the graphical interface. The sentence “Copy and paste” is played back in as many pieces as there are nested functions called during the execution of the command.
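
      Editorial note: the playback rule stated above (split the spoken phrase into as many pieces as there are function calls in the trace) can be sketched as follows. The trace format and audio handling are simplified assumptions, not the installation's own software.

      # Illustrative version of the playback rule: split an audio buffer for the
      # phrase "Copy and paste" into as many pieces as there are calls in a trace.
      import numpy as np

      def split_phrase(audio: np.ndarray, call_trace: list) -> list:
          """Cut the audio into len(call_trace) equal segments, one per call."""
          n_pieces = max(len(call_trace), 1)
          return np.array_split(audio, n_pieces)

      # Toy example: 1 second of silence at 44.1 kHz and a three-call trace.
      audio = np.zeros(44_100)
      trace = ["editor.copy", "clipboard.write", "editor.paste"]
      pieces = split_phrase(audio, trace)
      print(len(pieces), [len(p) for p in pieces])   # 3 segments, 14700 samples each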

    • Let’s Fake News

      Léon McCarthy

      ‘Let’s Fake News’ is an interactive media art installation that forces participants to realize that anyone can create fake news and may even find joy in doing so. The artwork addresses the conference themes as an interactive installation that challenges ideas around ‘post-truth’, creating an experience that engages with digital representations of the discursive interactions.

Wed, 21 March

All Wednesday program events except for the Arts Track Performance will take place at Nymble at KTH Royal Institute of Technology.
For a map of individual rooms and accessibility information, please visit the Venue page.
The Arts Track Performance will take place at KTH main campus in KTH R1 Experimental Performance Space (Drottning Kristinas väg 51) – the Reactor Hall.
For directions, please visit the TEI 2018 Google Map.

  • 8.30 – Registration

    Time: 8.30 – 9.00, Location: Puben

  • 9.00 – Coffee

    Time: 9.00 – 9.30, Location: Puben

  • 9.30 – TEI Open Panel

    Time: 9.30 – 10.30, Location: Hyllan

  • 10.30 – Session 5: Evaluating Tangible Interactions

      Time: 10.30 – 12.00, Location: Main Hall

      Michal Rinott

    • Evaluating Learning with Tangible and Virtual Representations of Archaeological Artifacts

      Christina Pollalis, Wellesley College
      Elizabeth Minor, Wellesley College
      Lauren Westendorf, Wellesley College
      Whitney Fahnbulleh, Wellesley College
      Isabella Virgilio, Wellesley College
      Andrew Kun, University of New Hampshire
      Orit Shaer, Wellesley College

      Digital technological advances offer new methods of representing physical objects in tangible and virtual forms. This study compares the learning outcomes of 61 students as they interacted with a set of ancient Egyptian sculptures using three increasingly popular educational technologies: the HoloLens AR headset, a 3D model-viewing website (SketchFab), and plastic-extrusion 3D prints. We explored how differences in interaction styles affect the learning process, quantitative and qualitative learning outcomes, and critical analysis.

    • The Impact of Tangible Props on Gaming Performance and Experience in Gestural Interaction

      Daniel Reinhardt, Julius-Maximilians-Universität Würzburg
      Jörn Hurtienne, Julius-Maximilians-Universität

      There is a widely held belief in the TEI and games communities that using tangible props enhances player performance and gaming experience. Building on prior research in gaming, we studied people playing a video game using a gestural interface without props, with an incomplete tangible prop, or with a complete tangible prop. Interaction with the tangibles was judged to be more natural, led to higher flow experience and was preferred, but there were no differences in performance, enjoyment, autonomy, presence or mental effort. There was a tendency for the incomplete tangible prop in particular to be judged as less intuitive and for people to feel less competent in gameplay. In contrast to prior research, the results show that tangible interaction is not always more beneficial than gestural interaction and that, especially for incomplete tangibles, any beneficial effects may be domain-dependent.

    • Multimodal Effects of Color and Haptics on Intuitive Interaction with Tangible User Interfaces

      Diana Löffler, University of Würzburg
      Robert Tscharn, University of Würzburg
      Jörn Hurtienne, Julius-Maximilians-Universität

      Tangible User Interfaces (TUIs) allow users to sense and manipulate digital information through physical objects. Although haptic properties are emphasized, TUIs are presented in and perceived through multiple modalities. Visual properties such as color, in particular, shape users’ expectations about the relation between tangibles and the abstract data they represent and control. Viewing TUIs as multisensory percepts, we present an empirical study that quantifies the benefits of explicitly designing with color for intuitive interaction. In a cross-cultural experiment, 75 participants (Germans and Japanese) matched tangible objects of different colors, sizes, weights or temperatures with abstract words. The results indicate that multimodal representations increase efficiency, effectiveness and user satisfaction, but only if colors and haptic properties evoke congruent associations. Incongruently designed tangibles led to a 17% increase in response speed, 28% lower accuracy and 61% lower preference ratings compared to congruently designed tangibles.

    • Being in the Sky: Framing Tangible and Embodied Interaction for Future Airliner Cockpits

      Catherine Letondal, University of Toulouse – ENAC
      Jean-Luc Vinot, University of Toulouse, ENAC
      Sylvain Pauchet, University of Toulouse – ENAC, Astrolab
      Caroline Boussiron, Ingenuity i/o
      Stéphanie Rey, University of Toulouse – ENAC, Berger-Levrault
      Valentin Becquet, University of Toulouse – ENAC
      Claire Lavenir, Intactile Design

      In order to contribute to a design for future airliner cockpits that can address the limitations of touch-based interfaces, we analyze tangible dimensions of cockpit activity based on observations and pilot interviews. Working from these data, and using TEI theory and concepts from phenomenology, we discuss the implications of our findings for tangible design. We found that the status of sensation in perception, the required level of control in actions, the safety issues of using physical objects, and the restricted mode of externalization raise challenges for tangible design. Accordingly, we discuss key concepts for the design of the future cockpit, such as the use of a protected space where interaction may involve compressed externalization, rhythmic structures and customized context-aware adaptations.

  • 12.00 – Lunch

    Time: 12.00 – 13.30, Location: Matsalen

  • 13.30 – Session 6: Theory, Community, and Frameworks

      Time: 13.30 – 15.00, Location: Main Hall

      Jelle Saldien

    • Cohousing IoT: Design Prototyping for Community Life

      Tom Jenkins, Georgia Tech

      Cohousing IoT uses research through design to both probe alternative modes of living and prototype speculative domestic Internet of Things devices. These prototype technologies are informed by a public design process that works in two ways. First, it imagines alternatives to existing arrangements of things and devices in the home; and second, it produces prototypes that argue for new roles for internet-connected things that both support and sustain the social life of a cohousing community.

    • Through the Glance Mug: A Familiar Artefact to Support Opportunistic Search in Meetings

      Ahmet Börütecene, Koç University
      İdil Bostan, Koç University
      Ekin Akyürek, Koç University
      Alpay Sabuncuoğlu, Koç University
      İlker Temuzkuşu, Koç University
      Çağlar Genç, Koç University
      Tilbe Göksun, Koç University
      Oğuzhan Özcan, Koç University

      During collocated meetings, the spontaneous need for information, called opportunistic search, might arise while conversing. However, using smartphones to look up information might be disruptive, disrespectful or even embarrassing in social contexts. We propose an alternative instrument for this practice: Glance Mug, a touch-sensitive mug prototype that listens to the conversation and displays browsable, content-driven results on its inner screen. We organized one-to-one meetings between 15 pairs of students to gather user reflections. The user study revealed that the mug has potential for supporting instant search and affords sufficient subtlety to conceal user actions. Yet, it provoked some anxiety for users in maintaining eye contact with their partners. Our main contributions are the context-aware mug concept tested in a real-life setting and an analysis through Hornecker and Buur’s Tangible Interaction Framework, which discusses its design space and its impact on users and social interaction.

    • Focus Framework: Tracking Prototypes’ Back-Talk

      Markus Schilling, Simon Fraser University
      Ron Wakkary, Simon Fraser University, Eindhoven University of Technology
      William Odom, Simon Fraser University

      This paper presents an analytical approach that we call the focus framework. The framework aids the analysis of the intended and unintended design attributes that emerge within a project’s design process. It helps to reveal how prototypes and decision making interact to shape the final design features, and makes visible the trajectory of central design attributes and unexplored alternatives. In this paper, we report on the framework and its development by way of a retrospective analysis of a tangible light installation we designed, known as the Urban Data Posts. We see the potential for designers to use the focus framework as a post-mortem tool to retrospectively analyze their own work and thus inform their design practice. The knowledge gained through the analysis can then be applied more generatively in future projects.

    • Understanding Transformations through Design: Can Resilience Thinking Help?

      Rosa van der Veen, RISE Interactive Umeå, Eindhoven University of Technology
      Viola Hakkerainen, Stockholm Resilience Centre
      Jeroen Peeters, RISE Interactive
      Ambra Trotto, Umeå School of Architecture, RISE Interactive

      The interaction design community increasingly addresses how digital technologies may contribute to societal transformations. This paper aims to understand a transformation ignited by a particular constructive design research project. This transformation is discussed and analysed using resilience thinking, an established approach within sustainability science. By creating a common language between these two disciplines, we begin to identify what kind of transformation took place, what factors played a role in the transformation, and which transformative qualities played a role in creating these factors. Our intention is to set out how the notion of resilience might provide a new perspective for understanding how constructive design research may produce results that have a sustainable social impact. The findings point towards ways in which these two different perspectives on transformation – the analytical perspective of resilience thinking and the generative perspective of constructive design research – may become complementary in both igniting and understanding transformations.

  • 15.00 – Coffee

    Time: 15.00 – 15.30, Location: Hyllan

  • 15.30 – Closing

    Time: 15.30 – 16.15, Location: Main Hall

  • 18.00 – Arts Track Performance

    Time: 18.00 (Doors Open), 19.00 (first) & 20.00 (second), Location: Reactor Hall

    • Op 1254: Music for Neutrons, Networks and Solenoids using a Restored Organ in a Nuclear Reactor

      Leif Handberg, KTH Royal Institute of Technology
      Ludvig Elblaus, KTH Royal Institute of Technology
      Chris Chafe, Stanford University

      In this paper, an installation is presented that connects Stanford and Stockholm through a one-of-a-kind combination of instrument and venue: the Skandia Wurlitzer theatre organ (Wurlitzer serial no.1254) situated in the KTH R1 Experimental Performance Space, a disused nuclear reactor. A continuous stream of musical data, audio, and video between the two places explored the capabilities of the digital to play with the concept of presence and embodiment, virtuality and the physical. In the installation, a series of performances presented new pieces written especially for this setting. The pieces were performed by musicians in Stanford, mediated in real-time, allowing them to play together with the theatre organ in Stockholm, temporarily fusing the two venues to create one ensemble, one audience, in one space.