Real-Time Haptic Rendering: Perception, Optimization, and Multi-Modal Integration
Abstract
This review explores recent advancements, challenges, and future directions in real-time haptic rendering, a technology that enables users to experience touch sensations in virtual environments through precise tactile feedback. Real-time haptic rendering plays a critical role in enhancing user interaction in applications such as Virtual Reality (VR), Augmented Reality (AR), and teleoperation by providing a realistic sense of touch. The paper discusses fundamental haptic rendering techniques, including force computation methods, perception-based models, and optimization strategies such as genetic algorithms and adaptive control. It highlights the role of multi-modal feedback systems in delivering immersive experiences by integrating force, vibration, and tactile feedback. Key challenges, including latency, computational complexity, and stability, are examined alongside potential solutions such as AI-driven adaptation, cloud-based rendering, and hardware innovations. The review also presents a detailed analysis of emerging applications in fields such as medical training, gaming, and remote collaboration. Looking forward, the integration of advanced AI models, hybrid rendering techniques, and new feedback modalities promises to further enhance the realism and scalability of haptic systems, paving the way for more accessible and interactive virtual experiences.
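To make the force-computation methods surveyed in this review concrete, the sketch below illustrates the classic penalty-based (spring) approach, in which the rendered force grows linearly with the probe's penetration depth into a virtual surface. It is a minimal illustration only: the function name, the stiffness value, and the plane-contact scenario are assumptions chosen for exposition, not taken from any particular system discussed here.

import numpy as np

def penalty_force(device_pos, plane_point, plane_normal, stiffness=800.0):
    # Penalty-based force for a point probe against an infinite plane:
    # F = k * d * n, where d is the penetration depth and n the surface
    # normal. The force is zero when the probe is outside the surface.
    # A stiffness of 800 N/m is an illustrative value in the range used
    # with consumer haptic devices, not a figure from this review.
    n = plane_normal / np.linalg.norm(plane_normal)
    depth = np.dot(plane_point - device_pos, n)  # > 0 when penetrating
    if depth <= 0.0:
        return np.zeros(3)
    return stiffness * depth * n

# Example: a probe 2 mm below a horizontal surface at z = 0.
force = penalty_force(np.array([0.0, 0.0, -0.002]),
                      np.array([0.0, 0.0, 0.0]),
                      np.array([0.0, 0.0, 1.0]))
print(force)  # -> [0. 0. 1.6], i.e. 1.6 N pushing the probe back out

In a real device loop this computation would run at roughly 1 kHz, which is why the latency and stability challenges highlighted in the abstract dominate haptic system design.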