In a groundbreaking development poised to reshape assistive technology, researchers have introduced a smart walking stick designed to enhance the independence of visually impaired individuals who speak Yorùbá. The innovation, spearheaded by Abisola Olayiwola of the Department of Computer Engineering at Olabisi Onabanjo University, integrates advanced machine learning and cloud computing to create a device that detects obstacles and communicates in the user’s native language.
The smart walking stick addresses a critical gap in current assistive technologies, which often overlook the linguistic and cultural needs of specific communities. “Existing solutions do not address the specific needs of individuals who speak Yorùbá,” Olayiwola explained. “Therefore, there is a need to develop a smart walking stick that can detect obstacles and communicate in Yorùbá.”
The development process began with the creation of an object detection dataset of annotated images of common obstacles, each labeled in Yorùbá to ensure the device’s cultural and linguistic relevance. A Convolutional Neural Network (CNN) was then trained on this dataset to detect and classify obstacles. The trained model was deployed to Render’s cloud platform, offloading the computationally heavy inference from the walking stick itself. The final stage integrated the cloud-hosted model with the ESP32, a low-cost, low-power system on a chip with built-in Wi-Fi and dual-mode Bluetooth.
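The cloud-side step of this pipeline can be sketched in outline: the server takes the CNN’s prediction, maps it to a Yorùbá obstacle label, and builds the spoken announcement returned to the ESP32. The class names, Yorùbá vocabulary, response format, and confidence threshold below are illustrative assumptions, not details taken from the paper.

```python
# Sketch of the cloud-side response step: map a CNN prediction to a
# Yorùbá obstacle label and build the payload sent back to the stick.
# Class list, Yorùbá labels, and the 0.6 threshold are assumptions.

# Hypothetical obstacle classes with Yorùbá labels
YORUBA_LABELS = {
    "person": "ènìyàn",
    "car": "ọkọ̀ ayọ́kẹ́lẹ́",
    "tree": "igi",
    "wall": "ògiri",
}

CONFIDENCE_THRESHOLD = 0.6  # below this, report no obstacle


def build_response(class_name: str, confidence: float) -> dict:
    """Turn a model prediction into the payload returned to the device."""
    if confidence < CONFIDENCE_THRESHOLD or class_name not in YORUBA_LABELS:
        return {"obstacle": None, "announcement": ""}
    label = YORUBA_LABELS[class_name]
    # e.g. "Ìkìlọ̀: igi wà níwájú" — roughly "Warning: a tree is ahead"
    return {
        "obstacle": label,
        "announcement": f"Ìkìlọ̀: {label} wà níwájú",
    }
```

On the device side, the ESP32 would only need to send a captured frame to this endpoint and play back the announcement, which is what makes such a low-power chip viable for the task.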
The smart walking stick demonstrated impressive performance metrics, achieving an accuracy of 0.8969, a precision of 0.9110, a recall of 0.9915, and an F1-score of 0.8969 in obstacle recognition. The device offers approximately 6.23 hours of continuous use on a full battery charge, making it practical for daily use.
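The runtime figure follows from straightforward battery arithmetic: hours of use ≈ battery capacity (mAh) divided by average current draw (mA). The paper does not state the stick’s capacity or draw, so the numbers below are purely hypothetical values chosen to illustrate how such an estimate is derived.

```python
# Back-of-envelope battery-life estimate: runtime_h = capacity_mAh / draw_mA.
# The 2200 mAh pack and 353 mA average draw are hypothetical, not figures
# from the study; they only demonstrate the arithmetic behind the estimate.

def runtime_hours(capacity_mah: float, avg_draw_ma: float) -> float:
    """Estimate continuous runtime in hours from capacity and average current."""
    return capacity_mah / avg_draw_ma

hours = runtime_hours(2200, 353)  # hypothetical pack and draw
print(f"{hours:.2f} h")           # → 6.23 h
```

In practice the average draw of an ESP32-based device varies widely with Wi-Fi duty cycle, so a real estimate would be measured rather than computed from nameplate figures.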
This research, published in the Journal of Electrical Systems and Information Technology, highlights the potential of integrating cloud-based machine learning into assistive devices. The study not only aims to significantly impact the lives of visually impaired individuals but also contributes to the advancement of assistive technology and promotes cultural inclusivity. It opens up opportunities for language learning and engagement, fostering a more inclusive society.
The implications of this research extend beyond assistive technology. The integration of cloud-based machine learning and low-power devices like the ESP32 could inspire similar innovations in other sectors, including energy management and smart infrastructure. As Olayiwola noted, “This work demonstrates the viability of integrating cloud-based machine learning into assistive devices for visually impaired users.”
The smart walking stick represents a significant step forward in creating more inclusive and accessible technologies. By addressing the specific needs of Yorùbá-speaking individuals, this research sets a precedent for future developments in assistive technology, ensuring that technological advancements are accessible to all, regardless of language or cultural background. As assistive technology continues to evolve, the principles demonstrated in this research could pave the way for more efficient and culturally sensitive solutions, ultimately benefiting a broader range of users.