FoodieVision : a YOLO-based portable device for grocery food identification and audio feedback for the visually impaired

By: Drice, Andrei James Capili; Perez, Kelvin Edelson Dimabuyu; Rubas, Angelo Chico; Velasco, Alexander Ryan Siao; Zafe, Melvin John Drilon
Content type: text | Media type: unmediated | Carrier type: volume
Summary: ABSTRACT:

STATEMENT OF THE PROBLEM: The project FoodieVision, a portable device, aims to empower visually impaired individuals to identify various grocery food items independently. The device recognizes and labels food items and provides clear audio feedback to support informed dietary choices. Visually impaired individuals currently struggle to identify foods on their own, leading to reliance on others for meal information and potentially compromising their health. To address these challenges, the researchers set out to develop the software and hardware for a YOLO-based food recognition system. The research answers key questions about the development process, dataset preparation, implementation of audio feedback, accuracy metrics, and user testing approaches for FoodieVision.

RESEARCH METHODOLOGY: FoodieVision's development follows a quantitative descriptive research design. Participants from the Center for Advocacy, Learning, and Livelihood Foundation of the Blind provided insights. Hardware components include a Raspberry Pi 5 and a mini speaker module, and the System Usability Scale (SUS) evaluates usability. FoodieVision's system architecture comprises real-time image capture, YOLO-based object detection, and audio feedback generation. Data gathering involves image preprocessing, YOLO model training, and quantitative analyses; Ultralytics software aids in metric gathering, while SUS scores assess user satisfaction. Overall, the methodology aims to enhance FoodieVision's usability for visually impaired individuals.
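The capture-detect-speak loop described in the system architecture can be illustrated with a short Python sketch. This is a minimal example under stated assumptions: it uses the Ultralytics YOLOv8 Python API, OpenCV for camera capture, and the offline pyttsx3 text-to-speech engine, and the weights file foodievision.pt, the camera index, and the confidence threshold are hypothetical placeholders rather than the thesis's actual code.

# Minimal sketch of the FoodieVision pipeline: capture a frame, run a
# YOLOv8 detector, and speak the names of any detected grocery items.
# Assumptions: a trained weights file "foodievision.pt" (hypothetical),
# a camera visible to OpenCV, and the pyttsx3 offline TTS engine.
import cv2
import pyttsx3
from ultralytics import YOLO

model = YOLO("foodievision.pt")   # hypothetical trained weights
tts = pyttsx3.init()              # offline text-to-speech engine


def capture_frame(device_index: int = 0):
    """Grab a single frame from the camera."""
    cap = cv2.VideoCapture(device_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("Camera capture failed")
    return frame


def identify_and_speak(frame, conf: float = 0.5) -> None:
    """Run detection on one frame and read the item names aloud."""
    result = model.predict(frame, conf=conf, verbose=False)[0]
    labels = {result.names[int(box.cls)] for box in result.boxes}
    message = ", ".join(sorted(labels)) if labels else "No grocery item detected"
    tts.say(message)
    tts.runAndWait()


if __name__ == "__main__":
    identify_and_speak(capture_frame())

On the Raspberry Pi 5 described in the methodology, the same loop would typically be triggered by a tactile push button, and the exported ONNX weights could be loaded in place of the PyTorch file.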
SUMMARY OF FINDINGS: The researchers developed a portable FoodieVision prototype, using a 3D-rendered model as the basis. Challenges included managing space and heat within the device, which required strategic component placement: the speaker sits at the top for directed audio feedback, the camera sits on the cover for a wide field of view, and the buttons are easily accessible and tactile for visually impaired users. The compact form factor keeps the device portable, and the audio feedback aids product recognition. Dataset analysis reveals the common grocery items and their sizes, which are crucial for detection. Training results show that YOLOv8s achieves better accuracy while YOLOv8n is more stable, and model validation on the Raspberry Pi 5 indicates that YOLOv8n excels in the PyTorch and ONNX formats while YOLOv8s remains slightly more accurate (a hedged sketch of this training-and-export workflow follows the recommendations below). The System Usability Scale score of 81.32 indicates that respondents found the system highly usable and effective in aiding visually impaired individuals.

CONCLUSION: The research team aims to foster independence for visually impaired individuals and contribute to a more inclusive society. The team achieved its objectives by developing FoodieVision, a portable device for grocery food identification that integrates a precisely trained YOLO model, evaluating system performance through various metrics, and conducting user testing to enhance effectiveness and usability.

RECOMMENDATION: To enhance the FoodieVision prototype, the following improvements are recommended based on user feedback:
- Consult Braille experts to improve signage readability.
- Reduce the device's size and weight for better portability.
- Expand the grocery item dataset, considering various environmental conditions during image collection.
- Use advanced speech synthesis, such as Google Cloud TTS, for clearer audio feedback.
- Integrate battery monitoring for real-time status alerts.
- Add a barcode reader and collaborate with grocery retailers to provide real-time pricing and product details.
- Focus on identifying non-perishable items for consistent reliability.
- Include memory capacity alerts to maintain efficiency.
- Provide audio alerts for food expiration dates to ensure safety.
- Consider adding features such as a money reader and detailed product information.
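The training-and-export comparison referenced in the summary of findings can be sketched with the Ultralytics Python API. The dataset file grocery.yaml, the epoch count, and the image size below are illustrative assumptions, not the thesis's actual training configuration; the sketch only shows the general shape of training both YOLOv8 variants, gathering validation metrics, and exporting ONNX weights for on-device testing.

# Hedged sketch: train and compare YOLOv8n and YOLOv8s, then export ONNX
# weights for validation on the Raspberry Pi 5. The dataset path, epochs,
# and image size are assumptions for illustration only.
from ultralytics import YOLO

for variant in ("yolov8n.pt", "yolov8s.pt"):
    model = YOLO(variant)                       # start from pretrained weights
    model.train(data="grocery.yaml",            # hypothetical grocery dataset
                epochs=100, imgsz=640)
    metrics = model.val()                       # mAP and related metrics
    print(variant, "mAP@0.5 =", metrics.box.map50)
    model.export(format="onnx")                 # ONNX copy for the Raspberry Pi 5

Running the same val() call on the Raspberry Pi 5 against both the PyTorch (.pt) and exported ONNX weights reproduces the per-format comparison summarized above.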
Item type: Book | Current location: PLM | Home library: PLM | Collection: Filipiniana Section | Call number: Filipiniana-Thesis T TK7558 .d75 2024 | Status: Available | Barcode: FT7926
Total holds: 0

Undergraduate Thesis : (Bachelor of Science in Computer Engineering) - Pamantasan ng Lungsod ng Maynila, 2024


