As image processing and computer vision (CV) techniques become feasible in real time, augmented reality (AR) lets us interact more richly with the real world through augmented data. Under this paradigm, everyday objects can be made interactive through overlaid data such as facial expressions. This work describes the use of CV and AR to interact with everyday objects by making them appear animated. For smart-device applications that interact with everyday objects, we developed a touch-event-based object tracking algorithm using a hybrid of CamShift, GrabCut, and a particle filter. Poisson image editing was used to blend facial expressions so that the object looks as if it has a face of its own. To make the system fun to use, we adopted the Tamagotchi storytelling concept, a virtual-pet simulation. For multi-platform game development and computer vision programming, the AR programming environment was built with the cocos2d-x game engine and OpenCV. In experiments, users felt that AR made the everyday object seem animated, emotional, and able to interact with them.