Advancements in Extended Reality Interaction and Large Language Models

The field of extended reality (XR) is evolving rapidly, with a growing focus on novel interaction techniques and on integrating large language models (LLMs) to enhance the user experience. Recent research has explored combining traditional input devices with hand-tracking and gestural input to create more expressive and intuitive interactions. In parallel, the integration of LLMs into XR is changing how users interact with immersive environments, enabling more natural and engaging experiences. Noteworthy papers in this area include HandOver, which presents a novel interaction technique for precise selection and manipulation of 3D objects, and How LLMs are Shaping the Future of Virtual Reality, which provides a comprehensive review of the intersection of LLMs and VR. Other notable papers, such as Navigation Pixie and Guided Reality, demonstrate how LLMs can power on-demand navigation agents and generate visually enriched AR task guidance. Together, these advances are pushing the boundaries of XR and paving the way for more immersive, interactive, and intelligent digital experiences.
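The papers above describe full systems rather than a shared API, but the recurring pattern of prompting an LLM for structured, step-by-step guidance and rendering the result in the scene can be sketched. The snippet below is a minimal illustration of that pattern only, not any paper's actual implementation; `llm_complete` is a hypothetical stand-in for whatever LLM client an application uses, and the JSON schema is an assumption made for the example.

```python
import json

def generate_task_steps(llm_complete, task_description):
    """Ask an LLM for ordered task steps suitable for AR overlay.

    `llm_complete` is a hypothetical callable (prompt -> completion text)
    standing in for the application's real LLM client.
    """
    prompt = (
        "Break the following task into short, ordered steps. "
        "Return a JSON list of objects with 'step' (instruction text) and "
        "'anchor' (the physical object the instruction refers to).\n\n"
        f"Task: {task_description}"
    )
    raw = llm_complete(prompt)
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        # Fall back to one instruction per line if the model ignores JSON.
        return [{"step": line.strip(), "anchor": None}
                for line in raw.splitlines() if line.strip()]

if __name__ == "__main__":
    # Stub LLM for demonstration; a real system would call its model here.
    def fake_llm(prompt):
        return json.dumps([
            {"step": "Pick up the hex key", "anchor": "hex key"},
            {"step": "Tighten the left bolt", "anchor": "left bolt"},
        ])

    for i, s in enumerate(generate_task_steps(fake_llm, "Assemble the chair leg"), 1):
        print(f"{i}. {s['step']} (anchor: {s['anchor']})")
```

In an XR application, each returned step could then be rendered as a world-anchored label attached to the object named in its 'anchor' field.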
Sources
Navigation Pixie: Implementation and Empirical Study Toward On-demand Navigation Agents in Commercial Metaverse
Text2VR: Automated Instruction Generation in Virtual Reality Using Large Language Models for Assembly Task