
Research on the design and realization of interactive wearable blindness guidance system based on computer vision


DOI: 10.23977/acss.2025.090114

Author(s)

Hongyan Hou 1, Yaqi Zhang 1, Danni Man 1

Affiliation(s)

1 School of Medical Information Engineering, Shandong University of Traditional Chinese Medicine, Jinan, 250021, China

Corresponding Author

Hongyan Hou

ABSTRACT

Existing travel aids for visually impaired people often serve a single function and work only in limited scenarios. To address this, this paper proposes an interactive wearable guide system based on computer vision. The system integrates multiple sensors, including an infrared rangefinder, ultrasonic sensors, and a high-definition camera, and uses an STM32 microcontroller for efficient data processing, enabling real-time perception of the surrounding environment and accurate identification of obstacles. The YOLOv7 algorithm intelligently analyzes road conditions, and accurate, real-time navigation information is delivered to visually impaired users through voice and vibration feedback. A gyroscope and microphone monitor the user's movement state and receive voice commands, enabling natural human-computer interaction. A companion smartphone app connects to the system wirelessly via Bluetooth and issues voice and vibration alerts through a headset and built-in motor, providing a convenient user interface. The app integrates GPS positioning and the Baidu Maps service: it records the user's walking route in real time, intelligently plans travel routes, and provides voice navigation. By combining advanced technology with humanized interaction, the system aims to offer an innovative, reliable, and easy-to-use guide solution for the visually impaired.
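The feedback logic the abstract describes, where a vision detection is fused with a range reading and mapped to voice or vibration output, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class names, distance thresholds, `Feedback` enum, and `choose_feedback` function are all assumptions introduced here for clarity.

```python
# Hypothetical decision layer: combine an obstacle class (as a YOLOv7-style
# detector might report it) with an ultrasonic distance reading, then pick a
# feedback mode. Thresholds and labels are illustrative assumptions only.
from dataclasses import dataclass
from enum import Enum


class Feedback(Enum):
    NONE = "none"
    VOICE = "voice"      # spoken description of the obstacle
    VIBRATE = "vibrate"  # urgent haptic warning for close obstacles


@dataclass
class Detection:
    label: str         # object class predicted by the vision model
    confidence: float  # detector confidence in [0, 1]


def choose_feedback(det, distance_cm, conf_threshold=0.5):
    """Map a detection plus a range reading to a feedback action."""
    if det is None or det.confidence < conf_threshold:
        return Feedback.NONE, ""
    if distance_cm < 100:  # close obstacle: immediate haptic alert
        return Feedback.VIBRATE, f"{det.label} ahead, {distance_cm} cm"
    if distance_cm < 300:  # farther obstacle: calmer voice prompt
        return Feedback.VOICE, f"{det.label} in {distance_cm // 100} meters"
    return Feedback.NONE, ""


mode, message = choose_feedback(Detection("stairs", 0.91), 80)
print(mode.value, "-", message)  # vibrate - stairs ahead, 80 cm
```

Splitting the urgency into haptic versus voice channels mirrors the paper's dual feedback design: vibration cuts through ambient noise for imminent hazards, while speech carries richer information when there is time to convey it.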

KEYWORDS

YOLOv7 algorithm, multi-sensor information fusion, human-computer interaction, cloud service platform

CITE THIS PAPER

Hongyan Hou, Yaqi Zhang, Danni Man, Research on the design and realization of interactive wearable blindness guidance system based on computer vision. Advances in Computer, Signals and Systems (2025) Vol. 9: 100-109. DOI: http://dx.doi.org/10.23977/acss.2025.090114.

REFERENCES

[1] Khan M A, Paul P, Rashid M, et al. An AI-based visual aid with integrated reading assistant for the completely blind [J]. IEEE Transactions on Human-Machine Systems, 2020, 50(6): 507-517.
[2] Dang T V, Bui N T. Obstacle avoidance strategy for mobile robot based on monocular camera[J]. Electronics, 2023, 12(8): 1932.
[3] Wang Q, Meng Z, Liu H. Review on Application of Binocular Vision Technology in Field Obstacle Detection[C]//IOP Conference Series: Materials Science and Engineering. IOP Publishing, 2020, 806(1): 012025.
[4] Afif M, Ayachi R, Said Y, et al. Deep learning based application for indoor scene recognition[J]. Neural Processing Letters, 2020, 51: 2827-2837. 
[5] Lai Y, Ma R, Chen Y, et al. A pineapple target detection method in a field environment based on improved YOLOv7 [J]. Applied Sciences, 2023, 13(4): 2691.
[6] Dourado A M B, Pedrino E C. Towards interactive customization of multimodal embedded navigation systems for visually impaired people [J]. International Journal of Human-Computer Studies, 2023, 176: 103046.
[7] Ye M, Yan X, Jiang D, et al. MIFDELN: A multi-sensor information fusion deep ensemble learning network for diagnosing bearing faults in noisy scenarios [J]. Knowledge-Based Systems, 2024, 284: 111294.



All published work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2016 - 2031 Clausius Scientific Press Inc. All Rights Reserved.