@artjng
Happy to share that my PhD paper "Scene2Hap: Generating Scene-Wide Haptics for VR from Scene Context with Multimodal LLMs" has received a Best Paper Award (top <1%) out of 6,730 submissions at ACM CHI (@acm_chi), the most prestigious conference in human-computer interaction 🙌

Scene2Hap is an LLM-centered system that automatically designs object-level vibrotactile feedback for entire VR scenes based on objects' semantic attributes (e.g., whether and how an object vibrates) and physical context (e.g., an object's density and spatial relationships). It then renders real-time haptic feedback across the scene, calculating vibration propagation from LLM-inferred material properties.

To the best of our knowledge, this is the first paper to address this problem: designing the haptic characteristics of an entire VR scene with one click.

Many thanks to my co-first author @EasaAliAbbasi, Sara Safaee, @FKeeL1, and my advisor Jürgen Steimle!