Pictures
Here you will find a selection of pictures from the conference. (If you find a picture that you would prefer not to have there, please let us know.)
Timetable
Jump to: Keynote, Paper Session 1, Paper Session 2, Paper Session 3, Paper Session 4, Paper Session 5, Paper Session 6, Paper Session 7, Exhibits/Demos, Posters
Constitution of the German Working Group on Tangibles
Monday, Feb. 18, 10:00-13:00 (in German)
Founding of a working group "Be-greifbare Oberflächen in Gemischten Wirklichkeiten" (Graspable Surfaces in Mixed Realities) within the Human-Computer Interaction division of the Gesellschaft für Informatik (GI). Participation is also open to non-members of the GI.
Further information is available here: AK_BI+GeWi.pdf
Keynote
Tangible Bits: Beyond Pixels
Hiroshi Ishii (MIT Media Lab, USA)
Tangible user interfaces (TUIs) provide physical form to digital information and computation, facilitating the direct manipulation of bits. Our goal in TUI development is to empower collaboration, learning, and design by using digital technology and at the same time taking advantage of human abilities to grasp and manipulate physical objects and materials. This paper discusses a model of TUI, key properties, genres, applications, and summarizes the contributions made by the Tangible Media Group and other researchers since the publication of the first Tangible Bits paper at CHI 1997. http://tangible.media.mit.edu/
MOBILE AND TANGIBLE INTERACTION (Paper Session 1, Chair: Nicolas Villar)
AudioCubes: a Distributed Cube Tangible Interface based on Interaction Range for Sound Design
Jean Vanderdonckt (Université catholique de Louvain, BE); Bert Schiettecatte (PERCUSSA, BE)
AudioCubes is a novel tangible user interface that allows anyone interested in sound design, such as sound creators and music trainers, to intuitively explore and create dynamically changing sound. A new sound is created by manipulating distributed tangible cubes that can be coupled wirelessly by placing them within each other's interaction range on a table. At any time, a sound processing network combines operational properties of the AudioCubes, such as location on a plane or in space, movement, arrangement with other cubes, and layout. Sound algorithm parameters and the configuration of the sound processing network can be changed simultaneously, allowing a fast and convenient exploration of the sound creation space and giving rise to a new interaction technique for creating sounds.
Gesture Recognition with a Wii Controller
Thomas Schlömer (University of Oldenburg, DE); Benjamin Poppinga (University of Oldenburg, DE); Niels Henze (OFFIS Research Institute of Information Technology, DE); Susanne Boll (University of Oldenburg, DE)
In many applications today, user interaction is moving away from mouse and pen and is becoming pervasive and much more physical and tangible. New emerging interaction technologies allow developing and experimenting with new interaction methods on the long way to providing intuitive human-computer interaction. In this paper, we aim at recognizing gestures to interact with an application and present the design and evaluation of our sensor-based gesture recognition. As input device we employ the Wii controller (Wiimote), which recently gained much attention worldwide. We use the Wiimote's acceleration sensor, independent of the gaming console, for gesture recognition. The system allows users to train arbitrary gestures, which can then be recalled for interacting with systems such as photo browsing on a home TV. The developed library exploits Wii sensor data and employs a hidden Markov model for training and recognizing user-chosen gestures. Our evaluation shows that we can already recognize gestures with a small number of training samples. In addition to the gesture recognition, we also present our experiences with the Wii controller and the implementation of the gesture recognition. The system forms the basis for our ongoing work on multimodal intuitive media browsing and is available to other researchers in the field.
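The authors' gesture library itself is not reproduced here; as a rough, hypothetical illustration of the pipeline the abstract describes (vector-quantizing acceleration data and scoring the resulting symbol sequence against per-gesture hidden Markov models), the following Python sketch classifies a gesture with the forward algorithm. The codebook centroids and model parameters are placeholders assumed to come from prior training.

```python
import numpy as np

def quantize(accel, codebook):
    """Map each 3D acceleration vector to the index of its nearest codebook centroid."""
    dists = np.linalg.norm(accel[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete observation sequence."""
    alpha = start_p * emit_p[:, obs[0]]
    scale = alpha.sum()
    log_lik, alpha = np.log(scale), alpha / scale
    for symbol in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, symbol]
        scale = alpha.sum()
        log_lik, alpha = log_lik + np.log(scale), alpha / scale
    return log_lik

def recognize(accel, codebook, gesture_models):
    """Pick the user-trained gesture whose HMM best explains the acceleration data."""
    obs = quantize(np.asarray(accel, dtype=float), codebook)
    scores = {name: forward_log_likelihood(obs, *params)
              for name, params in gesture_models.items()}
    return max(scores, key=scores.get)
```

Here `gesture_models` would map each trained gesture name to its (start, transition, emission) matrices, one small HMM per gesture.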
Studying Applications for Touch-Enabled Mobile Phone Keypads
Paul Holleis (University of Duisburg-Essen, DE); Jonna Häkkilä (Nokia Group, FI); Jussi Huhtala (Nokia Research, FI)
We present a platform to evaluate mobile phone applications that make use of an additional dimension for key presses. Using capacitive sensors on each key, both the mere touching of buttons and the force of a press can be measured. A set of applications well known from current mobile phones has been extended with functionality exploiting these new possibilities. The results of a study undertaken with this prototype are presented and conclusions are drawn for the design and implementation of such applications.
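The abstract does not spell out how touch and press are distinguished; as a purely hypothetical sketch of one way such a platform could expose the extra input dimension to applications, the thresholds and event names below are invented for illustration.

```python
# Hypothetical sketch (not the authors' platform): turn raw per-key
# capacitance and press-force readings into "touch" vs. "press" events.
from dataclasses import dataclass

TOUCH_THRESHOLD = 0.3   # assumed normalized capacitance for "finger resting on key"
PRESS_THRESHOLD = 0.7   # assumed normalized force for "key actually pressed"

@dataclass
class KeyEvent:
    key: str
    kind: str    # "touch" or "press"
    force: float

def classify(key, capacitance, force):
    """Return a KeyEvent for one sensor sample, or None if the key is idle."""
    if force >= PRESS_THRESHOLD:
        return KeyEvent(key, "press", force)
    if capacitance >= TOUCH_THRESHOLD:
        return KeyEvent(key, "touch", force)
    return None

# Example: a finger resting on '5' could preview its function; a firm press activates it.
print(classify("5", capacitance=0.5, force=0.1))   # touch event
print(classify("5", capacitance=0.9, force=0.8))   # press event
```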
Using Actuated Devices in Location-Aware Systems
Mike Fraser (University of Bristol, UK); Kirsten Cater (University of Bristol, UK); Paul Duff (University of Bristol, UK)
Location-aware systems have traditionally left mobility to
the user through carrying, supporting and manipulating the
device itself. This design choice has limited the scale and
style of device to corresponding weight and form
constraints. This paper presents a project introducing school
children to location aware systems. We observed that it is
hard to notice, physically grasp and simultaneously share
these small personal devices in groups. These behaviours
are partly grounded in the physical device design, but also
in the location awareness model itself, which provides
information ‘right here’ while the children are looking
around and about them. These observations led us to
suggest the alternative model of pointing at locations so that
they can be noticed and experienced by groups in public
places. We further build this location model into the device
itself by introducing actuated components from robotics to
make a location-aware device called ‘Limbot’ that can be
physically pointed. A preliminary study of the Limbot with
the school children indicates rich sharing behaviours, but
that user control of actuation at all points is critical to the
ultimate success of our approach, and further exploration of
our location model is required.
MAKING TANGIBLE INTERACTION WORK (Paper Session 2, Chair: Ali Mazalek)
Posey: Instrumenting a Poseable Hub and Strut Construction Toy
Michael Weller (Carnegie Mellon University, US); Ellen Yi-Luen Do (Georgia Institute of Technology, US); Mark Gross (Carnegie Mellon University, US)
We describe Posey, a computationally-enhanced hub-and-strut construction
kit for learning and play. Posey employs a ball and socket
connection that allows users to move the parts of an assembled model.
Hubs and struts are optocoupled through the ball and socket
joints using infrared LEDs and photosensors. Wireless transmitters
in the hubs send connection and geometry information to a host
computer. The host computer assembles a representation of the
physical model as the user creates and configures it. Application
programs can then use this representation to control computational
models in particular domains.
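Posey's own software is not shown here; as a hypothetical sketch of the kind of host-side representation the abstract mentions, the class below keeps a dictionary of hub-socket connections, updated from assumed wireless connect, disconnect, and pose-change reports.

```python
# Hypothetical sketch (not Posey's software): a host-side graph of the
# assembled model, updated from wireless connect/disconnect/pose reports.
class PoseyModel:
    def __init__(self):
        # (hub_id, socket_index) -> {"strut": strut_id, "orientation": (yaw, pitch)}
        self.connections = {}

    def on_connect(self, hub_id, socket, strut_id, orientation):
        """A strut's ball was seated in one of a hub's sockets."""
        self.connections[(hub_id, socket)] = {"strut": strut_id,
                                              "orientation": orientation}

    def on_disconnect(self, hub_id, socket):
        """The strut was pulled out of the socket."""
        self.connections.pop((hub_id, socket), None)

    def on_pose_change(self, hub_id, socket, orientation):
        """The joint was rotated; update the stored geometry."""
        if (hub_id, socket) in self.connections:
            self.connections[(hub_id, socket)]["orientation"] = orientation

    def struts_attached_to(self, hub_id):
        return [c["strut"] for (h, _), c in self.connections.items() if h == hub_id]

# An application (e.g. a puppet or molecule viewer) would read this model
# after every report to update its own domain representation.
model = PoseyModel()
model.on_connect(hub_id=1, socket=0, strut_id="A", orientation=(0.0, 45.0))
print(model.struts_attached_to(1))   # ['A']
```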
Strengths and Weaknesses of Software Architectures for the Rapid Creation of Tangible and Multimodal Interfaces
Bruno Dumas (University of Fribourg, CH); Denis Lalanne (University of Fribourg, CH); Dominique Guinard (SAP Research / ETH Zurich, CH); Reto Koenig (University of Fribourg, CH); Rolf Ingold (University of Fribourg, CH)
This paper reviews the challenges associated with the development of tangible and multimodal interfaces and presents our experiences with the development of three different software architectures for rapidly prototyping such interfaces. The article first reviews the state of the art and then compares existing systems with our approaches. Finally, the article stresses the major issues associated with the development of toolkits for creating multimodal and tangible interfaces, and presents our future objectives.
VoodooSketch - Extending Interactive Surfaces with Adaptable Interface Palettes
Florian Block (Lancaster University, UK); Hans Gellersen (Lancaster University, UK); Michael Haller (FH Hagenberg, AT)
VoodooSketch is a system that extends interactive surfaces with physical interface palettes on which users can dynamically deploy controls as shortcuts to application functionality. The system provides physical ‘plug and play’
controls as well as support for sketching of controls, and
allows controls to be associated with application functions
via handwritten labels. The system uses a special digital
pen, which writes ‘real’ ink on the palettes while
functioning as a digital input device on the interactive
surfaces. The palettes can be seamlessly integrated into
existing applications, be appropriated by the user to suit
different input requirements, and support new interaction
styles across multiple surfaces, palettes and users.
DrawSound: A Drawing Instrument for Sound Performance
Kazuhiro Jo (University of Tokyo, JP)
DrawSound is a drawing instrument for sound
performances that combines multi-touch input technology
with the expressive character of drawing. The instrument is
intended to be used in two different art projects, The SINE
WAVE QUARTET, and aeo. In this paper, we describe the
implementation of DrawSound with three different pens
and show how we designed the two sound performances. We also explain our preliminary observations about the unique character of DrawSound.
HYUI - A Visual Framework for Prototyping Hybrid User Interfaces
Christian Geiger (University of Applied Science Düsseldorf, DE); Robin Fritze (University of Applied Science Duesseldorf, DE); Anke Lehmann (University of Applied Science Düsseldorf, DE); Joerg Stoecklein (University of Paderborn, DE)
This paper describes a pragmatic approach for the design of
hybrid user interfaces based on a number of extensions of
an existing 3D authoring system. We present the design and
realization of a visual framework dedicated to the
prototyping of hybrid user interfaces. The rapid
development environment was applied in a teaching context
during lectures on advanced user interface design. The
results showed that our framework provides a suitable tool
to quickly design and test hybrid user interfaces.
SUPPORTING REAL-LIFE APPLICATIONS (Paper Session 3, Chair: Michael Rohs)
Integration of Virtual and Real Document Organization
Thomas Seifried (Upper Austria University of Applied Sciences, AT); Matt Jervis (University of Waikato, NZ); Michael Haller (FH Hagenberg, AT); Masood Massodian (University of Waikato, NZ); Nicolas Villar (Lancaster University, UK)
In most work environments people archive both the real and
digital versions of their documents. But unlike the digital
world, in the physical world locating a document can become
a very time consuming task. The reason for this is the
lack of a direct connection between the physical and digital
versions of documents.
The Smart Filing System combines the benefits of the digital and the real world by providing an augmented filing cabinet that links the physical world with the digital desktop world. In our setup, we implemented an add-in for MS OneNote™, and real folders and cabinets are extended with devices for direct input and output. This allows us to search and browse through digital documents using MS OneNote™ while the system simultaneously provides feedback in the physical world by highlighting the corresponding folder in the filing cabinet. In this paper we describe the hardware and software implementation of our prototype system and present the results of a preliminary pilot study of its use.
GeoTUI: A Tangible User Interface for Geoscience
Nadine Couture (ESTIA, FR); Guillaume Riviere (ESTIA and LaBRI, FR); Patrick Reuter (LaBRI, FR)
GeoTUI is a system designed for geophysicists that
provides props as a tangible user interface on a tabletop
vision-projection system for the selection of cutting planes
on a geographical map of a subsoil model. Our GeoTUI
system allows the geophysicists to manipulate in the same
action and perception space since the movement of the
physical artifacts is done on the tabletop and thus
constrained to two dimensions. Consequently, it combines
the advantages of the spontaneous conditions of user
interaction that the geophysicists are commonly used to in
their classical paper/pen/ruler environment with the
advantages of the use of powerful geological simulation
software. We conducted an extensive user study in the
workplace of the geophysicists that clearly revealed that
tangible interaction performs better than the
standard mouse/keyboard GUI for the cutting line selection
task on a geographical subsoil map. Consequently, it
increases the efficiency for the real-world trade task of
hypothesis validation on a subsoil model. Moreover, this
geological use case is complex enough to confirm the
hypothesis that in space-multiplex conditions, specialized
devices perform better than generic ones.
The ColorTable - A Design Story
Valerie Maquil (Vienna University of Technology, AT); Thomas Psik (Vienna University of Technology, AT); Ina Wagner (Vienna University of Technology, AT)
The paper describes the design story of the ColorTable, a
tangible user interface in support of urban planners and
diverse stakeholders collaboratively envisioning urban
change, which was developed in an iterative process of
design-evaluation-feedback-redesign in a series of
workshops with users in the context of real urban planning
projects. It seeks to clarify a number of more general design
issues related to tangible user interfaces – how to make use
of material and spatial properties in designing both the physical interface and multiple, simultaneous
interactions; how to handle the complexity of urban projects
while keeping interfaces and interactions simple and
transparent.
Squeeze, Rock, and Roll; Can Tangible Interaction with Affective Products Support Stress Reduction?
Miguel Bruns Alonso (TU Delft, NL); David Keyson (ID-StudioLab, TU Delft, NL); Caroline Hummels (Technische Universiteit Eindhoven, NL)
Turning a Page on the Digital Annotation of Physical Books
Chih-Sung (Andy) Wu (Georgia Institute of Technology, US); Susan Robinson (Georgia Institute of Technology, US); Ali Mazalek (Georgia Institute of Technology, US)
The Graphical User Interface (GUI) has created an efficient
work environment for many applications. However, when
users are confined by keyboards and mice, they lose the
ability to interact with the virtual world using habits from
the real world. Our research examines how emerging modes
of authorship, such as wikis, can be used to generate new
possibilities for bringing atoms and bits together for digital
annotation. Our goal is to combine the everyday habits in
reading books with emerging digital possibilities.
In this paper, we present a prototype system called
WikiTUI, which brings digital media to physical paper
books. This system allows readers to access the digital
world through fingertip interactions on books, and enables
them to share information with other readers using wiki
technology. WikiTUI not only bridges the gap between the
digital and the physical worlds, but also facilitates multiple
contributions to a reference base spanning across these
worlds. We present user evaluations of the WikiTUI
prototype and discuss research implications.
DESIGNING THE INTERACTION (Paper Session 4, Chair: Paul Holleis)
BounceSlider: Actuated Sliders for Music Performance and Composition
Romain Gabriel (Chalmers University of Technology, SE); Johan Sandsjö (H-interaction.com, SE); Ali Shahrokni (Chalmers University of Technology, SE); Morten Fjeld (Chalmers University of Technology, SE)
The ForceFeedback Slider (FFS) is a one-dimensional
actuated slider using a motor to produce tangible interaction
with position and force as input and output parameters. To
create a new concept, we have built a mixing desk, placed
six FFSs (two implemented here) into a partially realized
SliderBox, and added an LED and two toggle buttons to each
slider for additional interactivity. We have developed a tool
called BounceSlider for improvising music. This
application for real time music performance and
composition uses a slider handle that can act as a ball. Users
can lift and release the handle to set the ball in motion and
produce a particular sound each time it bounces against the
baseline. Based on physical characteristics, the user can
create different sounds and loops by changing two settings:
gravity (speed) and bounce type (ball physical
characteristics). BounceSlider allows the user to create and
save Musical Instrument Digital Interface (MIDI) loops with up to five sounds at a time.
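The physics behind the bouncing handle can be pictured in a few lines of code; the following is an illustrative sketch under assumed units and constants, not the authors' implementation: a ball falls under an adjustable gravity, loses energy at each bounce according to a restitution ("bounce type") factor, and each impact with the baseline would trigger a MIDI note.

```python
# Illustrative sketch (not the authors' code): bouncing-ball dynamics that
# could drive the actuated slider handle; each baseline impact fires a note.
def simulate_bounces(height, gravity=9.81, restitution=0.7, dt=0.01, min_speed=0.05):
    """Yield (time, position) samples; stop once the ball has effectively come to rest."""
    t, pos, vel = 0.0, height, 0.0
    while True:
        vel -= gravity * dt
        pos += vel * dt
        if pos <= 0.0:                    # handle hits the baseline
            pos = 0.0
            vel = -vel * restitution      # "bounce type" = restitution factor
            print(f"bounce at t={t:.2f}s -> trigger MIDI note")
            if vel < min_speed:
                return
        yield t, pos
        t += dt

for _t, _p in simulate_bounces(height=1.0):
    pass   # in the real system each (t, pos) sample would be sent to the motorized slider
```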
Interact, Excite, and Feel
Parisa Eslambolchilar (FIT Lab, UK); Rod Murray-Smith (University of Glasgow, UK)
This paper presents a dynamic system approach to the design
of multimodal interactive systems. We use an example where
we support human behavior in a browsing task, by adapting
the dynamics of navigation using speed-dependent automatic
zooming (SDAZ), allowing the user to switch smoothly
among different modes of control. We show how the user’s
intention is coupled to the browsing technique via the dynamic
model, and how the SDAZ method couples the document
structure to audio samples using a model-based sonification.
We demonstrate that this approach is well suited to
mobile and wearable applications, and audio feedback provides
valuable information, supporting intermittent interaction,
i.e. allowing movement-based interaction techniques to
continue while the user is simultaneously involved with real
life tasks.
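Speed-dependent automatic zooming couples the zoom level to the scroll speed so that fast scrolling zooms the view out; as a minimal sketch of that coupling (with made-up constants, not the paper's dynamic model), one common formulation is a linear mapping clamped to a zoom range.

```python
# Minimal SDAZ sketch: faster scrolling -> lower zoom factor (zoomed out),
# keeping the on-screen visual flow roughly constant. Constants are illustrative.
def sdaz_zoom(scroll_speed, min_zoom=0.1, max_zoom=1.0, speed_for_min_zoom=2000.0):
    """Map scroll speed (document units/s) to a zoom factor in [min_zoom, max_zoom]."""
    ratio = min(abs(scroll_speed) / speed_for_min_zoom, 1.0)
    return max_zoom - ratio * (max_zoom - min_zoom)

for speed in (0, 250, 1000, 2000):
    print(speed, round(sdaz_zoom(speed), 2))   # slow = zoomed in, fast = zoomed out
```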
LEARNING THROUGH PHYSICAL INTERACTION (Paper Session 5, Chair: Alissa Antle)
A Representation Approach to Conceptualizing Tangible Learning Environments
Sara Price (Institute of Education, London, UK)
Tangibles, in the form of physical artefacts embedded with
sensor technologies, offer the opportunity to exploit and
build on our everyday interaction and experience with the
world, enabling new forms of engagement and access to
tools for supporting learning. The implications for learning
are considerable, potentially bringing about a radical
change in the way we conceptualise learning and learning
activities. However, we know little about the specific
learning benefits, and currently lack an effective structure
within which to establish them. Although several
frameworks have been proposed for conceptualizing
tangible environments, none highlight the central role that
external representations have in tangible environments.
This paper argues for the importance of placing primary
emphasis on representation, and the role that this might play
in mediating interaction and cognition in tangible
environments. The representation-tangible relationship is
outlined, together with their differential potentials for
learning. Based on this, the paper then proposes a
conceptual framework for systematically investigating how
different ways of linking digital information with physical
artefacts influence interaction and cognition, to gain a
clearer understanding of their role for learning.
Let Me Actuate You
Bart Hengeveld (University of Technology Eindhoven, NL); Caroline Hummels (Technische Universiteit Eindhoven, NL); Kees Overbeeke (Eindhoven University of Technology, NL); Riny Voort (Viataal-Research, Development & Support (RDS), NL); Hans Van Balkom (Viataal-Research, Development & Support (RDS), NL); Jan De Moor (Radboud Universiteit Nijmegen, NL)
In this paper we focus on two aspects of Tangible
Interaction that have our particular interest: 1) the added
value of tangibility when designing interfaces for toddlers
and 2) the value of actuators. Especially the latter is
something that in our opinion has been under-investigated
within the field of Tangible and Embedded Interaction. In
this paper we will address the abovementioned topics by
giving examples from the LinguaBytes project, which is
aimed at developing an intelligent interactive play and
learning environment for toddlers with multiple disabilities.
These two aspects of Tangible Interaction have our
particular interest since we see that multi-handicapped
children could benefit highly from Tangible Interaction, but
often lack the necessary bodily skills. Using actuators could
offer these children possibilities to become more
autonomous, thus enhancing their self-esteem and
motivation. We feel that our work could not only benefit
multi-handicapped toddlers in particular, but could also be
used to design interactions that are more respectful of
heterogeneous users in general.
PROTOTYPICAL EVALUATIONS (Paper Session 6, Chair: Eva Hornecker)
Making Sense of Group Interaction in an Ambient Intelligent Environment for Physical Play
Ron Wakkary; Marek Hatala; Ying Jiang; Milena Droumeva; Malahat Hosseini (all from Simon Fraser University, CA)
This paper presents the results of a study on group
interaction with a prototype known as socio-ec(h)o.
socio-ec(h)o explores the design of sensing and display, user
modeling, and interaction in an embedded interaction
system utilizing a game structure. Our study involved the
playing of our prototype system by thirty-six (36)
participants grouped into teams of four (4). Our aim was to
determine heuristics that we could use to further design the
interaction and user model approaches for group and
embodied interaction systems. We analyzed group
interaction and performance based on factors of team
cohesion and goal focus. We found that with our system,
these factors alone could not explain performance.
However, when transitions in the degrees of each factor (i.e., high, medium, or low) are considered, a clearer picture
for performance emerges. The significance of the results is
that they describe recognizable factors for positive group
interaction.
Action and Reaction for Physical Map Interfaces
David Chatting (BT Group PLC, UK)
In this paper we present experimental results measuring the
success of users manipulating a physical map interface to
navigate to specific locations. We evaluate four different
mappings between the action of the user and the reaction of
the display: two where the axes act consistently and two where they are inconsistent. Consistent mappings outperform inconsistent ones, and of these one mapping is significantly
better and quicker to learn. Measures of error, speed of
completion and reports of difficulty and task enjoyment
support this. We use this to make specific recommendations
for such interfaces and highlight new challenges.
Are Tangibles More Fun? Comparing Children's Enjoyment and Engagement Using Physical, Graphical and Tangible User Interfaces
Lesley Xie (Simon Fraser University, CA); Alissa Antle (Simon Fraser University, CA); Nima Motamedi (Simon Fraser University, CA)
This paper presents the results of an exploratory
comparative study in which we investigated the relationship
between interface style and school-aged children’s
enjoyment and engagement while doing puzzles. Pairs of
participants played with a jigsaw puzzle that was
implemented using three different interface styles: physical
(traditional), graphical and tangible. In order to investigate
interactional differences between the three interface styles,
we recorded subjective ratings of enjoyment and three related subscales, and measured times and counts of behavior-based indications of engagement. Qualitative analysis based on
observational notes and audio responses to open interview
questions helped contextualize the quantitative findings and
provided key insights into interactional differences not
apparent in the quantitative findings. We summarize our
main findings and discuss the design implications for
tangible user interfaces.
Connectibles: Tangible Social Networks
Jeevan Kalanithi (Massachusetts Institute of Technology, US); V. Michael Bove, Jr. (MIT, US)
This paper presents "Connectibles," a prototype
instantiation of a tangible social network, a new type of
social network application rooted in physical objects and
real world social behavior. This research is inspired by
theoretical work that suggests that gifts act as physical
symbols of social relationships. The Connectibles system
leverages gift-giving practices, presenting users with gift
objects ("connectibles") that they exchange with one
another. These objects automatically form always-on
communication channels between givers and receivers. As a
user collects more and more of these objects, he or she begins to acquire a dynamic, physical representation of, and interface to, his or her social network. The community of users’
interactions implicitly represent the structure of the social
network; these data can be accessed with a GUI application,
allowing users to explore and interact with their social
network. This system was implemented and subjected to
three user studies. The overarching goal of this work is to
examine how a set of devices might naturally and
harmoniously interface the physical, virtual and social
worlds.
NEW DIRECTIONS (Paper Session 7, Chair: Elise van den Hoven)
Robotany and Lichtung: a Contribution to Phenomenological Dialogue
Jill Coffin (Georgia Institute of Technology, US)
This paper discusses phenomenological structures relevant
to tangible and embedded interaction through a
phenomenological interpretation of an interactive art piece.
This discussion distinguishes between two basic traditions
in phenomenological philosophy, the Husserlian and the
Heideggerian. In addition, it illustrates the notions of
Lichtung, intentionality, Verhalten, ready-to-hand and
present-at-hand. The paper concludes with some implications
of a Heideggerian phenomenological framework.
Sprout I/O: A Texturally Rich Interface
Marcelo Coelho (MIT Media Laboratory, US); Pattie Maes (MIT Media Laboratory, US)
In this paper we describe Sprout I/O, a novel haptic
interface for tactile and visual communication. Sprout I/O
combines textiles and shape-memory alloys to create a soft
and kinetic membrane with truly co-located input and
output. We describe implementation details, the affordances
made possible by the use of smart materials in human-computer interaction, and possible applications for this
technology.
Towards a New Set of Ideals: Consequences of the Practice Turn in Tangible Interaction
Ylva Fernaeus (Stockholm University, SE); Martin Jonsson (Stockholm University, SE); Jakob Tholander (Södertörn University College, SE)
The practice-oriented turn in social sciences has implied a
series of fundamental consequences and design challenges
for HCI in general, and particularly in tangible interaction
research. This could be interpreted as a move away from
scientific ideals based on a modernist tradition, reflected in
four contemporary themes in tangible interaction research.
The first theme concerns a shift from an information centric
to an action centric perspective on interaction. The second
concerns a broadened focus from studying properties of the
system, to instead aim at supporting qualities of the activity
of using a system. The third concerns the general shift
towards supporting sharable use, rather than primarily
individual use settings. The last theme concerns the shift
towards multiple and subjective interpretation of how to
use new technological artefacts. We discuss how these
themes are grounded in theoretical as well as more concrete
technical developments in the area of tangible computing.
Murmur: Kinetic Relief Sculpture, Multi-Sensory Display, Listening Machine
Aimee Rydarowski (Georgia Institute of Technology, US); Ozge Samanci (Georgia Institute of Technology, US); Ali Mazalek (Georgia Institute of Technology, US)
In this paper we describe the concept, design, and
implementation of Murmur, an interactive kinetic display
made of one hundred computer CPU fans. Murmur
responds to sound input from its environment via embedded
microphones to produce patterns on a reactive surface. The
reactive surface consists of hinged paper pieces situated in
front of each fan. When activated by sonic elements in the
environment, including sounds intentionally generated by
an interactor, Murmur responds by turning on and off its
fans in a sequence. The wind pressure generated by the
movement of the fans stimulates the surface, forcing the
paper up and out to create a variety of dynamic patterns.
Each pattern represents characteristics of the sonic
environment. We also analyze the feedback received from
the interactors and discuss the possible ways of making the
interaction more immersive.
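As a purely illustrative sketch of one possible mapping from sound level to fan activation (Murmur's actual control software is not described in enough detail here to reproduce), the snippet below sweeps activation column by column across an assumed 10x10 fan grid in proportion to the microphone's RMS level.

```python
# Hypothetical sketch: map the level picked up by an embedded microphone to a
# column-by-column sweep across a 10x10 fan grid, so louder sounds ripple further.
import math

GRID_W, GRID_H = 10, 10   # assumed layout of the one hundred fans

def fan_pattern(rms_level, max_level=1.0):
    """Return a 10x10 matrix of 0/1 fan states for one audio frame."""
    active_cols = math.ceil(min(rms_level / max_level, 1.0) * GRID_W)
    return [[1 if col < active_cols else 0 for col in range(GRID_W)]
            for _row in range(GRID_H)]

row = fan_pattern(rms_level=0.45)[0]
print(row)   # e.g. [1, 1, 1, 1, 1, 0, 0, 0, 0, 0] -> wind lifts the first five paper flaps
```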
Cooking up Real World Business Applications Combining Physicality, Digitality, and Image Schemas
Joern Hurtienne (Technische Universität Berlin, DE); Johann Habakuk Israel (Fraunhofer-Institute for Production Systems and Design Technology IPK, DE); Katharina Weber (FHTW Berlin, DE)
Tangible interaction research has opened up new ways to
interact with computers and extended our imagination of
what is possible with digital systems. However, research on
tangible user interfaces (TUI) seems to have lost sight of
the everyday situation of the majority of people who still
work with standard computer systems. This paper investigates
a design process for applying TUIs in a GUI-dominated domain while preserving the functionality of the traditional systems. We exemplify a user-centered design
process using (1) image schemas as a meta-language for
analysis and design and (2) a systematic function allocation
of digital and physical user interface elements. We demonstrate
this process in the context of the redesign of an invoice
verification and posting system of a German beverage
company.
EXHIBITS / DEMOS
D01: A Touching Harmony: MIDAS in Artistic Practice
Kevin Muise (Simon Fraser University, CA); Ji Dong Yim (Simon Fraser University, CA)
Advances in technology provide artists with new opportunities for developing engaging works; however, these often push the limits of the artist’s technical ability. We address this issue by demonstrating that MIDAS can be applied to artistic practices with relative ease and at low cost. The authors present, as a case study, A Touching Harmony – a flexible screen-based installation that explores depth as a variable within user interaction. We discuss the use of
MIDAS within the implementation of the installation.
D02: ActionCube - A Tangible Mobile Gesture Interaction Tutorial
Jukka Linjama (Nokia, FI); Panu Korpipää (Finwe Ltd., FI); Juha Kela (Finwe Ltd., FI); Tapani Rantakokko (Finwe Ltd., FI)
This study addresses the issue of how to aid adoption of
new interaction means for mobile devices. The research
problem is how to promote and guide the use of new
movement interaction modalities to a novice user, who has
no prior knowledge of gesture control. The aim was to
create a pleasurable experience that invites users to learn
how mobile device movement control works. The main
contribution is an interaction tutorial application that
combines gesture control with a physical visual tangible
object in a mobile device, demonstrating interaction
elements that are potentially applicable in future mobile
devices.
D03: Back to the sandbox - Playful Interaction with Granules Landscapes
Steffi Beckhaus (University of Hamburg, DE); Roland Schroeder-Kroll (University of Hamburg, DE); Martin Berghoff (University of Hamburg, DE)
We present a novel, tangible interface demonstrated by
means of the artwork, GranulatSynthese, an installation for
the intuitive, tangible creation of ambient, meditative audiovisuals.
The interface uses granules distributed over a
tabletop surface and combines them with rear-projected
visuals and dynamically selected sound samples. The haptic
landscape can be explored with the hands, shaped into both
hills and open space and composed intuitively. Form,
position, and size of cleared table areas control parameters
of the computer-generated audio-visuals. GranulatSynthese is a meditative application that invites users either to play or to step back and watch the visuals and sounds evolve. The installation has proven very accessible, and many visitors find it inviting and absorbing for long stretches of time.
D04: E-scale: unity of location and time, increasing bandwidth and enhancing physical learning does matter
Caroline Hummels (Technische Universiteit Eindhoven, NL); Kees Overbeeke (Eindhoven University of Technology, NL)
In this paper we explain that we focus on tangible
interaction, because the physical world is inherently
meaningful for people, i.e. we perceive the world in terms
of what we can do with it, in terms of our skills. By
physically interacting with the world this meaning
emanates. We elucidate this principle by means of E-scale, a tangible device for entering answers to questionnaires into a computer. Meaning is created by coupling the graphical layout of the scales on the questionnaires to the layout of buttons on E-scale, and by enabling users to slide E-scale down along the scales while entering data. The results of our experiment show that unity of location and time, increased bandwidth from controlling multiple parameters simultaneously, and physical learning (and thus the development of bodily skills) increase both usability (reduced task time) and experience (overall satisfaction). We
hope that sharing the rationale behind our TEI designs
and research might contribute to the discussion about the
strengths and weaknesses of TEI.
D05: HYUI - A Visual Framework for Prototyping Hybrid User Interfaces
Christian Geiger (University of Applied Science Düsseldorf, DE); Robin Fritze (University of Applied Science Duesseldorf, DE); Anke Lehmann (University of Applied Science Düsseldorf, DE); Joerg Stoecklein (University of Paderborn, DE)
This demo accompanies the paper of the same name (see Paper Session 2 above).
D06: SpeakCup: Simplicity, BABL, and Shape Change
Jamie Zigelbaum (MIT Media Lab, US); Angela Chang (MIT, US); James Gouldstone (MIT Media Lab, US); Joshua Jen Monzen (MIT, US); Hiroshi Ishii (MIT Media Lab, US)
In this paper we present SpeakCup, a simple tangible
interface that uses shape change to convey meaning in its
interaction design. SpeakCup is a voice recorder in the
form of a soft silicone disk with embedded sensors and
actuators. Advances in sensor technology and material
science have provided new ways for users to interact with
computational devices. Rather than issuing commands to a
system via abstract, multi-purpose buttons, the door is
open for more nuanced and application-specific
approaches. Here we explore the coupling of shape and
action in an interface designed for simplicity while
discussing some questions that we have encountered along
the way.
D07: Tangible Menus and Interaction Trays: Core tangibles for common physical/digital activities
Brygg Ullmer; Rajesh Sankaran; Srikanth Jandhyala; Blake Tregre; Cornelius Toole; Karun Kallakuri; Christopher Laan; Matthew Hess; Farid Harhad; Urban Wiggins; Shining Sun (all from Louisiana State University, US)
We introduce core tangibles: physical interaction elements
which serve common roles across a variety of tangible and
embedded interfaces. We describe two such tangibles: tangible
menus and interaction trays. These may be composed
together to dynamically bind discrete and continuous interactors
to various digital behaviors. We discuss our approach,
implementation, and early usage experiences.
D08: WiiArts: Creating collaborative art experiences with WiiRemote interaction
Hyun-Jean Lee; Hyungsin Kim; Gaurav Gupta; Ali Mazalek (all from Georgia Institute of Technology, US)
WiiArts is an experimental video, audio and image processing art
project that makes use of pre-existing sensing technologies provided
by Nintendo WiiRemotes and a Sensor Bar. Currently, most
WiiRemote-based physical interactions have been designed to mimic the
gesture of body movement in sports and other action-based games. These
Wii games are generally competitive in nature, and players interact by
responding to predefined interaction rules in either a single-user or
multi-user mode. Making use of the WiiRemote as a pre-existing
tangible and embedded interface, we explore applications that can
engage participants in active and expressive art creation in a
collaborative manner. In this paper, we describe several prototype
applications based on this concept: Illumination (draWiing), Beneath
(Waldo), WiiBand, Time Ripples. In these applications, three
interactors can work together to compose both images and sounds.
D09: XENAKIS - Combining Tangible Interaction with Probability-Based Musical Composition
Markus Bischof, Bettina Conradi, Peter Lachenmaier, Kai Linde, Max Meier, Philipp Pötzl, Elisabeth André (Augsburg, DE)
In this paper we present the table-based tangible interface
application Xenakis which uses probability models in order
to compose music in a way that can be strongly influenced
by the user. Our musical sequencing application is based on
a framework for tangible interfaces with an architecture
that is strongly inspired by the model-view-controller
pattern. In addition, we developed a hardware setup for
tangible interfaces and used MatraX for tracking markers.
The sequencer is the first implementation based on this
framework. It allows users to create music simply by
moving tangibles on the table. The graphics engine
Horde3D is used to visualize the user-interaction and to
show the relationships between the tangible objects on the
table, creating an appealing audio-visual experience. An
evaluation with 37 first-time users was conducted in order
to discover the strong and the weak points of such tangible
user interfaces, especially in the context of our application.
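The core idea of probability-based sequencing can be sketched in a few lines; the weighting scheme below is hypothetical and not Xenakis's actual model: tangibles placed on the table contribute weights to a distribution over pitches, and each sequencer step samples the next note from it.

```python
# Hypothetical sketch of probability-based sequencing: tangibles bias a
# distribution over MIDI pitches; each step draws the next note from it.
import random

def next_note(tangible_weights):
    """tangible_weights: dict mapping MIDI pitch -> accumulated weight from tangibles."""
    pitches = list(tangible_weights)
    weights = [tangible_weights[p] for p in pitches]
    return random.choices(pitches, weights=weights, k=1)[0]

# Two tangibles near C4 (60) and one near G4 (67): C4 is twice as likely.
weights = {60: 2.0, 67: 1.0}
print([next_note(weights) for _ in range(8)])   # e.g. [60, 60, 67, 60, 67, 60, 60, 67]
```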
D10: Gesture Recognition with a Wii Controller
Thomas Schlömer (University of Oldenburg, DE); Benjamin Poppinga (University of Oldenburg, DE); Niels Henze (OFFIS Research Institute of Information Technology, DE); Susanne Boll (University of Oldenburg, DE)
This demo accompanies the paper of the same name (see Paper Session 1 above).
POSTERS
P01: A Malleable Physical Interface for Copying, Pasting, and Organizing Digital Clips
Florian Block (Lancaster University, UK); Nicolas Villar (Lancaster University, UK); Hans Gellersen (Lancaster University, UK)
We present a system that extends a typical workstation
environment with a malleable physical interface for
working with digital clips. It allows users to pick digital
clips, give each its own dedicated key for direct access, and
combine keys dynamically on a physical surface in a way
that inherently reflects the state of an extended clipboard.
The system affords copying and pasting of multiple clips
each directly accessible through its own key shortcut. The
keys can also be dynamically re-arranged to organize clips,
and taken from one workstation to another to transport clips,
acting simultaneously as token and as copy-paste-interface
for a digital object.
P02: A tangible interface for browsing digital photo collections
Shuo-Hsiu Hsu (Orange Labs, FR); Sylvie Jumpertz (Orange Labs, FR); Pierre Cubaud (Conservatoire National des Arts et Metiers, FR)
We present a design concept of a tangible user interface
for browsing image contents. A layer structure for image
presentation and three kinematical gestures are proposed
to facilitate navigation in the digital photo collections. We
describe how gestures support the photo browsing and
how the visual display is synchronized with gestures.
P03: Black Box: Exploring Simple Electronic Interaction
Kristina Andersen (STEIM, NL)
This paper proposes a series of simple interactive boxes
designed to investigate children’s experience and
understanding of abstract electronic interaction. The black
box project is the first step of an investigation into the breadth of potential uses of electronic sensing devices.
P04: Contact Management on the Wall: A Card-Game Metaphor for Large Displays
Tom Gross (Bauhaus-University Weimar, DE)
Tangible and embedded computing integrates digital technology into the physical environment of everyday life. In this context, families in private households are increasingly being researched and supported. In this paper we present the concept and implementation of FamilyFaces, a contact management tool supporting families in managing their contacts and information disclosure, and we report on initial user feedback.
FamilyFaces is based on a card-game metaphor on large
displays to provide wide-spread access to family members,
from teenagers to grandparents.
P05: Control of Data Flow and Configurations within Inter-appliance Using a Camera-Phone
Satoru Mitsui (Waseda University, JP)
In recent studies, user interaction models for connecting
information appliances to each other using real-world
tangible interaction are limited to specified scenarios or
applications. We developed a novel model for connecting
appliances in every scenario based on the extension of the
drag-and-drop interaction metaphor, which has proved
acceptable to users. By applying this model with a camera-phone,
we propose an interaction technique that is able to
realize the direction of the data flow between connected
appliances. One of the most important differences from
existing research is that our technique enables users not
only to use data files contained by one appliance on another
appliance, but also to associate and configure networked
appliances to each other, in a tangible manner.
P06: DrawSound: A Drawing Instrument for Sound Performance
Kazuhiro Jo (University of Tokyo, JP)
This poster accompanies the paper of the same name (see Paper Session 2 above).
P07: Inquiring Materials for Tangible Prototyping
Alissa Antle (Simon Fraser University, CA)
As TUI research moves from technical to empirical studies
which explore theoretical claims, it is important for
researchers to be able to quickly and easily build low
fidelity (lo-fi) prototypes to explore the unique features of
interaction that TUIs offer. Currently, the best practices for
choosing prototyping materials are vague at best. In this
paper, I present an analysis of the role of materials in
inquiry and propose a set of criteria for evaluating the
suitability of lo-fi prototyping materials.
P08: Marble Track Audio Manipulator: A Tangible User Interface for Audio Composition
Alex Bean; Sabina Siddiqi; Anila Chowdhury; Billy Whited; Orit Shaer; Robert Jacob (all from Tufts University, US)
We created a tangible user interface that allows children to
create musical compositions through constructive play. Our
Marble Track Audio Manipulator (MTAM) is an
augmented marble tower construction kit where marbles
represent sound clips and tracks represent different sound
effects. To create musical compositions, children
collaboratively build a marble tower and then play their
compositions by dropping marbles into the tower. As
marbles roll through the tower children can interact with
the marbles and thus improvise and alter their musical
compositions. By augmenting a popular toy, physically
representing sound clips and effects as well as allowing
improvisation, the MTAM system provides children with a
creative, playful, and engaging encounter with music.
P09: Pragmatic Haptics
Angela Chang; James Gouldstone; Jamie Zigelbaum; Hiroshi Ishii (all from MIT Media Lab, US)
This paper explores situations in which interfaces may be improved or simplified by switching feedback modalities. Due to availability of and familiarity with audio/visual technologies, many interfaces provide feedback via audio/visual pathways when a haptic pathway would best serve. The authors present a series of interface designs in which simple and inexpensive choices allow for reduction of cognitive complexity by allowing mental simplicity rather than technological familiarity to dictate design of information transmission.
P10: RENATI: Recontextualizing Narratives for Tangible Interfaces
Ayoka Chenzira (Georgia Institute of Technology, USA); Yanfeng Chen (Georgia Institute of Technology, USA); Ali Mazalek (Georgia Institute of Technology, USA)
RENATI is an acronym for recontextualizing narratives for
tangible interfaces. It serves as an umbrella term for our
art/research experiments within a hybrid environment that
uses oral narratives, and non-generative and immersive art
with sensing technologies to create tangible narratives. In
this paper we introduce our first prototype, which uses a
custom-built mannequin to allow viewers to engage with a
multi-viewpoint story titled Flying Over Purgatory.
P11: Tangible Design Support System Using RFID Technology
Takuma Hosokawa (NTT Comware, US); Yasuhiko Takeda; Norio Shioiri; Mitsunori Hirano; Kazuhiko Tanaka (NTT Comware, Japan)
We introduce a tangible design support system using RFID
technology. This system allows users to design their houses
on their own through tangible objects. Building a new
house is a big project for many people, and everyone
dreams about freely designing their own house. However,
in general, it is difficult to realize this without architectural
knowledge and a high level of computing skill. Our system employs tangible user interfaces so that a user with limited or no such knowledge can design a new house and investigate the design. For example, the user can design a room layout by paving the design table with special tiles and customize the room’s properties, such as walls, floor, or even furniture, by placing miniatures on the table. Through this design process, the user can identify crucial design elements based on their preferences and make the right decisions, as well as allowing an architect to understand the user’s preferences.
P12: The Robot is the Program: Interacting with roBlocks
Eric Schweikardt (Carnegie Mellon University, USA); Mark Gross (Carnegie Mellon University, USA)
The roBlocks construction kit is a tangible concurrent
programming environment that encapsulates sensory,
kinetic, and computational behavior in modular
building block units that snap together to construct
robots. The choice of a protocol for propagating values
through the constructed robot affects its behavior.
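The abstract's point about propagation protocols can be illustrated with a toy, hypothetical sketch (not the roBlocks firmware): each block holds a value in 0..255, and one synchronous propagation step recomputes every block's value from its upstream neighbours, so the chosen update rule directly shapes the robot's behaviour.

```python
# Hypothetical sketch: one synchronous step of value propagation through a
# block graph. Sensor blocks emit values, operator blocks combine inputs,
# actuator blocks act on whatever reaches them.
def propagate_step(blocks, links, values):
    """blocks: id -> ('sensor', reading) | ('max',) | ('inverse',) | ('actuator',)
    links: id -> list of upstream block ids; values: id -> value from the last step."""
    new_values = {}
    for bid, spec in blocks.items():
        inputs = [values.get(up, 0) for up in links.get(bid, [])]
        kind = spec[0]
        if kind == "sensor":
            new_values[bid] = spec[1]                  # e.g. current light reading
        elif kind == "max":
            new_values[bid] = max(inputs, default=0)   # pass on the strongest signal
        elif kind == "inverse":
            new_values[bid] = 255 - (inputs[0] if inputs else 0)
        else:                                          # actuator: e.g. motor speed = input
            new_values[bid] = inputs[0] if inputs else 0
    return new_values

blocks = {"light": ("sensor", 200), "inv": ("inverse",), "motor": ("actuator",)}
links = {"inv": ["light"], "motor": ["inv"]}
values = {}
for _ in range(3):
    values = propagate_step(blocks, links, values)
print(values)   # {'light': 200, 'inv': 55, 'motor': 55} once values have propagated
```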
Pre-Conference Activities
Join us on Sunday afternoon, Feb. 17, for a visit to the "Kunst Museum Bonn", the "Haus der Geschichte", and/or the "Deutsches Museum Bonn".
Haus der Geschichte
Our guided tour starts at 14:00, please meet us at about 13:45 in the entrance hall of the museum.
Take tram/underground lines 16, 63, or 66 from the central station to the stop Heussallee.
For more travel information, see Google Maps and VRS (public transport Bonn).
Arithmeum Bonn
Our guided tour starts at 14:00, please meet us at about 13:45 in the entrance hall of the museum.
It is a five-minute walk from the central station to the museum, or take the tram/underground train to the stop Universität/Markt.
For more travel information, see Google Maps and VRS (public transport Bonn).
Kunst Museum Bonn
Our guided tour starts at 16:00, please meet us at about 15:45 in the entrance hall of the museum.
Take tram/underground lines 16, 63, or 66 from the central station to the stop Heussallee.
For more travel information, see Google Maps and VRS (public transport Bonn).
Deutsches Museum Bonn - Research and Technology in Germany after 1945
Please meet us at 16:00 in the entrance hall.
Take bus line 610 from the central station to the stop Kennedyallee; the address is Ahrstr. 45.
For more travel information, see Google Maps and VRS (public transport Bonn).
All of these events are free of charge!
If you plan to attend one of these tours, please send a note to Dagmar Kern (it is not binding, but it helps us with planning).