Over the last five years, we have successfully developed new concepts for creating photo-realistic hair and for real-time hair simulation. We have worked extensively on the fundamental problems of visual hair simulation, with remarkable success. We are now interested in innovative interaction modalities and their synchronization with visualization.
We propose to model and “feel” hair with haptic devices, focusing on adaptive visuo-haptic simulation and easy, interactive hair modeling. The underlying idea is to explore ways of integrating visual hair simulation and haptics into a single multirate, multilayer, multithreaded application that allows intuitive interactive hair modeling. The results of the proposed work will be validated through a demonstrator that lets the user interact with the simulated hair of a virtual human’s head through a haptic interface. By adding the sense of touch to a VR simulation, we enter the domain of multimodal perception. The proposed system will stimulate both the user’s vision and touch: the user will see a realistic hair simulation running at interactive rates and will easily use virtual tools to model the hairstyle. The proposed research tackles significant challenges in the domains of multimodal simulation, collision detection, hair simulation, and haptic rendering. We take hair simulation as an application example because of our past experience in the domain and because of its suitability for interactive modeling. Moreover, the haptic simulation of hair represents a particular challenge because of hair’s strongly anisotropic dynamic properties and its high demands in terms of computational power.
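To illustrate the multirate, multithreaded coupling mentioned above, the following is a minimal sketch (not the proposed implementation): a high-rate haptic loop publishes periodic state snapshots to a lower-rate visual loop running in another thread. The step counts, the `SUBSTEPS_PER_FRAME` ratio (roughly 1 kHz haptics vs. 60 Hz rendering), and the placeholder force update are assumptions chosen purely for illustration.

```python
import threading
import queue

HAPTIC_STEPS = 960        # total high-rate simulation steps (stand-in for a ~1 kHz loop)
SUBSTEPS_PER_FRAME = 16   # haptic substeps per visual frame (~1000 Hz / 60 Hz)

snapshots = queue.Queue()  # channel from the haptic thread to the visual thread

def haptic_loop(state):
    """High-rate loop: advance the (placeholder) force state every step,
    publishing a snapshot to the renderer once per SUBSTEPS_PER_FRAME steps."""
    for step in range(HAPTIC_STEPS):
        state["force"] = 0.1 * step            # placeholder force update
        if (step + 1) % SUBSTEPS_PER_FRAME == 0:
            snapshots.put(dict(state))         # publish a copy for rendering
    snapshots.put(None)                        # sentinel: simulation finished

def visual_loop():
    """Low-rate loop: 'render' one frame per published snapshot."""
    frames = 0
    while snapshots.get() is not None:
        frames += 1
    return frames

state = {"force": 0.0}
t = threading.Thread(target=haptic_loop, args=(state,))
t.start()
frames = visual_loop()
t.join()
print(frames)  # 960 haptic steps / 16 substeps per frame = 60 visual frames
```

The sentinel-over-queue pattern keeps the two rates decoupled: the haptic thread never waits on rendering, while the visual thread consumes snapshots at its own pace.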
MIRALab, University of Geneva