https://youtu.be/kUn3_E1ZmDY
The Visioneer project is attempting to solve the last mile problem for the visually impaired. Their glasses are 3D printed and built around the Raspberry Pi Zero. The project uses a number of Adafruit products to help detect, identify, and alert a user to their surroundings. The combination of two cameras, lidar, bone conduction audio, and machine learning predictions is powerful, as shown in a number of prototype videos.
The Visioneer project is particularly exciting because it is an actively developing project based on the small form factor Pi Zero. A similar project, After-Sight-Model-1, used a Raspberry Pi version of "The vOICe" along with Teradeep as its deep learning network. Sadly, After-Sight appears to have stalled two years ago when its application for charitable status was denied by the Canadian government.
Via HackaDay.io:
Schematic (First Draft)
Based on the usage flow diagram, we decided to use an accelerometer to determine if the user is walking or stationary. We use OpenCV to perform obstacle avoidance. To determine if the user wants to identify something at a close distance, we use lidar. If the user is stationary and isn’t close to any objects, OpenCV and a local neural net will identify surroundings to determine if the user is looking at traffic or other objects. Everything will operate on a Raspberry Pi Zero.
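The quoted flow above suggests a simple sensor-driven decision loop. Below is a minimal sketch of that logic in Python; it is not the project's actual code, and the helper functions (user_is_walking, lidar_distance_cm, detect_obstacles, classify_scene, announce) and the distance threshold are hypothetical stand-ins for the accelerometer, lidar, OpenCV, neural net, and bone conduction audio pieces described by the team.

```python
# Hypothetical sketch of the decision flow quoted above. Sensor reads and
# model calls are stubbed out; the real project would wire these to an
# accelerometer, a lidar module, OpenCV, and a local neural net on the Pi Zero.
import random
import time

LIDAR_NEAR_THRESHOLD_CM = 50  # assumed cutoff for "close to an object"


def user_is_walking():
    """Stub: read the accelerometer and decide walking vs. stationary."""
    return random.choice([True, False])


def lidar_distance_cm():
    """Stub: return the distance to the nearest object from the lidar."""
    return random.uniform(10, 300)


def detect_obstacles():
    """Stub: OpenCV-based obstacle detection on the current camera frame."""
    return []


def classify_scene():
    """Stub: local neural net labelling the surroundings (e.g. 'traffic')."""
    return "traffic"


def announce(message):
    """Stub: send audio feedback to the bone conduction speakers."""
    print(message)


def main_loop():
    while True:
        if user_is_walking():
            # Walking: warn about obstacles in the path.
            if detect_obstacles():
                announce("Obstacle ahead")
        elif lidar_distance_cm() < LIDAR_NEAR_THRESHOLD_CM:
            # Stationary and close to something: identify the nearby object.
            announce("Object close by")
        else:
            # Stationary with nothing nearby: describe the wider scene.
            announce(f"You appear to be looking at {classify_scene()}")
        time.sleep(1)


if __name__ == "__main__":
    main_loop()
```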