MUMBAI, India, Oct. 24 -- Intellectual Property India has published a patent application (202441031032 A) filed by Vel Tech Multi Tech Dr. Rangarajan Dr. Sakunthala Engineering College; Dr. M. Rajesh Khanna; Swetha. M; Logitha. B; and Kaviya. S, Chennai, Tamil Nadu, on April 18, 2024, for 'an assistive vision for the visually impaired person using attention net and efficient b3 algorithm.'

Inventor(s) include Dr. M. Rajesh Khanna; Swetha. M; Logitha. B; and Kaviya. S.

The patent application was published on Oct. 24 under issue no. 43/2025.

According to the abstract released by Intellectual Property India: "Worldwide, 1 billion people have a preventable or untreated vision impairment. This includes people with nearsightedness, blindness, or other visual impairment, and people with vision loss due to untreated presbyopia (826 million). With regard to regional differences, vision impairment in low- and middle-income regions is estimated to be higher than in high-income regions; rates of unaddressed near-vision impairment are estimated at around 80% in western, eastern, and central Africa, while comparable rates in high-income regions such as Western Europe and Asia-Pacific are reported to be below 10%. Population growth and aging are increasing the risk of widespread vision loss. To assist blind users, we use a deep learning algorithm, EfficientNet B3, to decode the captured image so that the user can learn the identity, distance, and location of objects in the scene. Using the EfficientNet B3 algorithm and tokenization techniques, a detailed image-captioning system is built in which the model learns scenes with different captions, and the processor recognizes and predicts a caption each time the camera captures an image. Objects of interest are also detected, and their distances from the camera are estimated. Predicting the intent of a person in front of the user based on their actions and expressions requires a comprehensive and sophisticated approach; this key feature, distinguishing between someone holding a knife for a benign purpose such as cutting an apple and someone posing potential harm, involves the integration of various technologies and methodologies. After detection, the result is transmitted to an Alexa device, which provides the user with audio that helps them determine the distance and location of the object. So, with this project we are introducing an assistive device for the blind, which can help them feel confident when traveling alone."
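The abstract does not disclose implementation details, but the pipeline it outlines (an EfficientNet B3 image encoder whose output feeds a caption predictor and an audio step) can be illustrated in outline. The following Python snippet is a hypothetical sketch only, built on the torchvision implementation of EfficientNet-B3; the class name, embedding size, and preprocessing are illustrative assumptions and are not taken from the filing.

# Hypothetical sketch (not from the patent): EfficientNet-B3 as an image encoder
# whose feature vector could feed a tokenized caption decoder and, from there,
# an audio description step.
import torch
import torch.nn as nn
from torchvision import models, transforms

class CaptionEncoder(nn.Module):
    """Encodes a camera frame into a feature vector with EfficientNet-B3."""

    def __init__(self, embed_dim: int = 512):
        super().__init__()
        backbone = models.efficientnet_b3(
            weights=models.EfficientNet_B3_Weights.DEFAULT
        )
        backbone.classifier = nn.Identity()        # drop the ImageNet classifier head
        self.backbone = backbone                   # now returns 1536-dim pooled features
        self.project = nn.Linear(1536, embed_dim)  # project to the decoder's dimension

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.project(self.backbone(image))

# Standard EfficientNet-B3 preprocessing: 300x300 crops, ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize(320),
    transforms.CenterCrop(300),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

if __name__ == "__main__":
    encoder = CaptionEncoder().eval()
    frame = torch.rand(1, 3, 300, 300)             # stand-in for a camera capture
    with torch.no_grad():
        features = encoder(frame)
    print(features.shape)                          # torch.Size([1, 512])

In a complete system along the lines the abstract describes, this feature vector would be passed to a caption decoder trained on tokenized scene descriptions, and the resulting text, together with the estimated object distance, would be spoken to the user through the audio device.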

Disclaimer: Curated by HT Syndication.