The field of brain-computer interfaces (BCIs) and assistive technologies is evolving rapidly, with a focus on developing more practical, wearable, and user-friendly solutions. Recent work has explored in-ear electrodes for SSVEP-based BCIs, showing promising feasibility and improved practicality, and hybrid systems that integrate the SSVEP and P300 paradigms have demonstrated higher classification accuracy and information transfer rates (a minimal sketch of the SSVEP decoding step appears after the paper list below).

In assistive technologies, researchers have been investigating how embodiment affects the effectiveness of devices such as head-mounted and hand-held systems for individuals with blindness or low vision. These studies highlight the importance of considering biomechanical measures and user experience when designing assistive technologies. Other notable advances include design spaces for on-body feedback, such as FlexGuard, which aims to support injury prevention in strength training, and multimodal assistive mobile applications, like NaviSense, which combines conversational AI, vision-language models, and LiDAR to support object retrieval for persons with visual impairments. Noteworthy papers include:
- The Dual-Mode Visual System for Brain-Computer Interfaces, which presents a novel LED-based dual stimulation apparatus that integrates SSVEP and P300 paradigms, achieving a mean classification accuracy of 86.25%.
- NaviSense, a mobile assistive system that combines conversational AI, vision-language models, augmented reality, and LiDAR to support open-world object detection with real-time audio-haptic guidance, significantly reducing object retrieval time for blind and low-vision participants.
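To make the SSVEP decoding step mentioned above concrete, the sketch below shows canonical correlation analysis (CCA), a standard baseline for SSVEP frequency detection: the stimulation frequency whose sinusoidal reference signals correlate most strongly with a windowed EEG segment is taken as the detected target. This is a generic, minimal illustration, not code from any of the cited systems; the sampling rate, channel count, candidate frequencies, and function names are all illustrative assumptions.

```python
import numpy as np
from numpy.linalg import qr, svd

def reference_signals(freq, n_samples, fs, n_harmonics=2):
    """Sin/cos references at a stimulation frequency and its harmonics."""
    t = np.arange(n_samples) / fs
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)  # shape: (n_samples, 2 * n_harmonics)

def max_canonical_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Qx, _ = qr(X - X.mean(axis=0), mode='reduced')
    Qy, _ = qr(Y - Y.mean(axis=0), mode='reduced')
    return svd(Qx.T @ Qy, compute_uv=False)[0]

def classify_ssvep(eeg, fs, candidate_freqs):
    """Pick the candidate frequency whose references best match the EEG segment.

    eeg: array of shape (n_samples, n_channels), one windowed segment.
    """
    scores = {f: max_canonical_corr(eeg, reference_signals(f, eeg.shape[0], fs))
              for f in candidate_freqs}
    return max(scores, key=scores.get), scores

# Illustrative example: a 2 s, 8-channel segment at 250 Hz with a simulated 10 Hz response.
fs, n_samples = 250, 500
t = np.arange(n_samples) / fs
rng = np.random.default_rng(0)
eeg = 0.5 * np.sin(2 * np.pi * 10 * t)[:, None] + rng.normal(size=(n_samples, 8))
freq, scores = classify_ssvep(eeg, fs, candidate_freqs=[8.0, 10.0, 12.0, 15.0])
print(freq, {f: round(s, 3) for f, s in scores.items()})
```

Hybrid systems such as the dual-mode SSVEP/P300 apparatus above presumably combine a frequency-domain score of this kind with a separate classifier for P300 epochs; the fusion strategy is specific to each paper and is not reproduced here.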