MUMBAI, India, Feb. 13 -- Intellectual Property India has published a patent application (202541124586 A) filed by Jerusalem College Of Engineering, Chennai, Tamil Nadu, on Dec. 10, 2025, for 'autonomous multi-mode wheelchair with hybrid vision ultrasonic threat anticipation engine.'
Inventor(s) include Dr. J. Samuel Manoharan; Dr. R. Anitha; V. Subha Ramya; S. Purnima; S. Lavanya; B. Nivetha; Kifa Khairunnisa; Priscilla Sharlet Asha; and R. Jaswanth.
The patent application was published on Feb. 13 under issue no. 07/2026.
According to the abstract released by Intellectual Property India: "The present invention discloses an Autonomous Multi-Mode Wheelchair with a Hybrid Vision-Ultrasonic Threat Anticipation Engine that improves the mobility, safety, and independence of physically impaired users. The system uses a dual-sensor arrangement comprising stereo vision cameras and a multi-directional ultrasonic sensor array, allowing the acquisition of both long-range and short-range environmental information. These inputs are fed to a Spatial Fusion Processor, which performs synchronized multi-modal fusion to create a single three-dimensional environmental map that effectively represents obstacles, terrain differences, changes in elevation, and moving objects. Another key component of the invention is the Threat Anticipation Engine, which uses a probabilistic risk-scoring model to analyze motion vectors, predicted trajectories, and hazard probabilities. The system anticipates possible collisions or unsafe interactions and prevents them by assigning a Threat Probability Index (TPI) to each identified object. To ensure that the wheelchair navigates indoor and outdoor spaces safely, these predictions are interpreted by an Autonomous Navigation Controller, which modulates the wheelchair's movement through differential motor control, controlled deceleration, or path deviation. The wheelchair offers three operating modes: Safe-Assist, Semi-Autonomous, and Fully Autonomous, letting the user choose a control level suited to their physical capabilities and the situation at hand. A User Interface Module allows easy interaction and manual override via joystick, voice input, gestures, head-tracking, or touchscreen.
The system also includes redundant braking, robust power management, and fail-safe operations to ensure reliability in all situations. Through combined predictive hazard mitigation, hybrid sensing, and intelligent navigation, the invention substantially reduces the user's workload and provides a high-quality assistive mobility platform relative to traditional reactive wheelchair designs."
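The filing's abstract names the inputs to the risk model (motion vectors, predicted trajectories, hazard probabilities) but does not disclose the scoring formula itself. Purely as an illustration of the general idea, the Python sketch below combines proximity, closing speed, and predicted path overlap into a normalized index and maps it to the controller responses the abstract mentions; the weights, thresholds, field names, and the `compute_tpi`/`select_action` functions are hypothetical assumptions of this article, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """One object from the fused stereo-vision / ultrasonic map (illustrative)."""
    distance_m: float         # current range to the object, meters
    closing_speed_mps: float  # positive when the object is approaching
    path_overlap: float       # 0..1, overlap of its predicted trajectory with the planned path

def compute_tpi(obj: TrackedObject,
                max_range_m: float = 5.0,
                max_speed_mps: float = 2.0) -> float:
    """Illustrative Threat Probability Index in [0, 1]; weights are assumptions."""
    proximity = max(0.0, 1.0 - obj.distance_m / max_range_m)
    approach = min(max(obj.closing_speed_mps, 0.0) / max_speed_mps, 1.0)
    tpi = 0.4 * proximity + 0.3 * approach + 0.3 * obj.path_overlap
    return min(max(tpi, 0.0), 1.0)

def select_action(tpi: float) -> str:
    """Map the index to the responses named in the abstract (thresholds assumed)."""
    if tpi >= 0.8:
        return "stop"            # controlled deceleration to a halt
    if tpi >= 0.5:
        return "path_deviation"  # steer around via differential motor control
    if tpi >= 0.3:
        return "decelerate"
    return "proceed"
```

For example, a pedestrian 1 m away, closing at 1 m/s and squarely in the planned path, would score a high index and trigger a path deviation, while a distant static object would score near zero and let the chair proceed.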
Disclaimer: Curated by HT Syndication.