
YouTouch!

While large display walls support multiple users interacting at the same time, most systems cannot recognize which touch belongs to which user. YouTouch! tracks users interacting in front of the wall and can distinguish the touches of each individual user. It thereby enables applications to offer personalized touch support. To demonstrate this principle, we developed a multi-user paint application where each user has their own individual color palette. YouTouch! was developed at the Interactive Media Lab Dresden and resulted in a publication at ACM AVI 2016.

General

YouTouch! was developed by a team of three, including me.

The system consists of several applications. Users are tracked with a Microsoft Kinect. Because the Kinect loses a person's ID whenever they leave its field of view, further processing is done by an application running on a dedicated tracking PC. This application is written in C++ and re-identifies persons by comparing color histograms of body parts as well as skeleton-based biometric measurements; it also handles short-term occlusion of users.

The result is sent to another component running on the display wall, also written in C++, which uses a combination of skeleton and image data to associate touches with specific users. The resulting personalized touches are then sent to the client applications using the TUIO protocol, which we extended to carry the additional user information. Client applications can be written in any framework; our multi-user paint application is written in Python using libavg. YouTouch! requires no user instrumentation or custom hardware, and there is no user registration or learning phase.
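To illustrate the re-identification idea, the following is a minimal sketch (not the actual C++ implementation) of matching a person against known users by comparing normalized color histograms via histogram intersection. The bin count, similarity threshold, and function names are illustrative assumptions:

```python
import numpy as np

def color_histogram(pixels, bins=16):
    """Normalized per-channel color histogram of an (N, 3) RGB pixel array."""
    hist = np.concatenate([
        np.histogram(pixels[:, c], bins=bins, range=(0, 256))[0]
        for c in range(3)
    ]).astype(float)
    return hist / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity of two normalized histograms in [0, 1]; 1 means identical."""
    return float(np.minimum(h1, h2).sum())

def reidentify(candidate_hist, known_users, threshold=0.6):
    """Return the ID of the best-matching known user, or None if no match
    exceeds the threshold. known_users maps user ID -> stored histogram."""
    best_id, best_score = None, threshold
    for user_id, hist in known_users.items():
        score = histogram_intersection(candidate_hist, hist)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id
```

The actual system additionally compares skeleton-based biometric measurements, which makes the matching more robust than color alone (e.g. when two users wear similar clothing).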
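The touch-association step can likewise be sketched in simplified form: given a touch point and the tracked hand positions of all users, assign the touch to the nearest hand within a plausible distance. The distance cutoff and data layout here are assumptions for illustration; the real component combines skeleton and image data:

```python
import math

def associate_touch(touch_xy, user_hands, max_dist=300.0):
    """Assign a touch to the user whose tracked hand is closest.

    touch_xy:   (x, y) touch position in screen coordinates.
    user_hands: dict mapping user ID -> (x, y) hand position.
    max_dist:   assumed maximum plausible hand-to-touch distance (pixels).
    Returns the best-matching user ID, or None if no hand is close enough.
    """
    best_id, best_dist = None, max_dist
    for user_id, (hx, hy) in user_hands.items():
        d = math.hypot(touch_xy[0] - hx, touch_xy[1] - hy)
        if d < best_dist:
            best_id, best_dist = user_id, d
    return best_id
```

Each personalized touch is then forwarded to the clients over the extended TUIO protocol together with its user ID.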