STEM II

STEM II is the main focus of February through May in STEM class at Mass Academy. The STEM II project is similar to STEM I in its duration; however, it focuses specifically on the field of Assistive Technology. Rather than working individually, we collaborate in groups of 3 or 4 (or even 7). Throughout the process, you gain experience communicating with a client, working in a group on a long-term project, and learning new technical skills!

This year, my initial group was allowed to pair up with another group to form the Super Group. Both of our groups had chosen the same initial project concept, and we found that our proposed solutions to the problem were almost identical. By joining forces, we were able to take on a larger amount of work and devote more focus and attention to our prototype. My group was composed of David, Bella, Aaron, Diego, Garyth, and Charlotte.

Problem Statement: A visually-impaired manual wheelchair user currently struggles with navigation in novel environments.

Engineering Goal: The goal of this project is to design an attachable device for a wheelchair that allows a client with a visual impairment to navigate novel environments with increased safety and ease.

Initially, we had considered using sonar sensors, LiDAR sensors, or a camera, and we planned to select one detection method and work from there. However, for our final project design, we opted to use a combination of sonar and LiDAR sensors to effectively detect obstacles in the path of the wheelchair. We positioned the sonar sensors on each footrest of the wheelchair, as we deduced that this would be the optimal placement for detecting ground-level obstacles. Additionally, we implemented a LiDAR (Light Detection and Ranging) sensor to detect altitude drops and to assist with 3D mapping of a given environment. The LiDAR sensor is mounted on a bar that connects the two handles; a PVC pipe elevates the LiDAR so it can scan a room without interference from the user’s head.
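A sonar sensor measures distance by timing how long an ultrasonic pulse takes to bounce off an obstacle and return. As a minimal sketch (the exact sensor model and units here are assumptions, not details from our build), the round-trip echo time can be converted to a distance like this:

```python
# Speed of sound in air at room temperature, in cm per microsecond
# (an assumed constant; real firmware might compensate for temperature).
SPEED_OF_SOUND_CM_PER_US = 0.0343

def echo_to_distance_cm(echo_duration_us: float) -> float:
    """Convert a sonar echo round-trip time (microseconds) to distance (cm).

    The pulse travels to the obstacle and back, so the one-way
    distance is half of the total distance traveled.
    """
    return (echo_duration_us * SPEED_OF_SOUND_CM_PER_US) / 2

# Example: a 1000 us echo corresponds to an obstacle about 17 cm away.
print(echo_to_distance_cm(1000))
```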

Our final prototype

Feedback is provided to the user through two vibrating wristbands. Each band corresponds to a side (left or right) and is controlled based on data collected from one of the two sonar sensors and the LiDAR sensor. When a sensor detects an obstacle, it triggers current to flow to the corresponding wristband. The user feels the vibration on a specific side and can steer away from the obstacle.
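The side-specific feedback described above can be sketched as a simple decision rule. This is a hypothetical illustration, not our actual firmware: the 50 cm trigger distance and the choice to buzz both bands on a detected drop are assumptions made for the example.

```python
TRIGGER_DISTANCE_CM = 50  # hypothetical threshold for "obstacle close enough to warn"

def wristband_signals(left_sonar_cm: float,
                      right_sonar_cm: float,
                      lidar_drop_detected: bool) -> tuple[bool, bool]:
    """Return (vibrate_left, vibrate_right) flags from the sensor readings.

    Each sonar sensor drives the wristband on its own side; an
    altitude drop reported by the LiDAR alerts both sides, since a
    drop ahead is a hazard regardless of steering direction.
    """
    vibrate_left = left_sonar_cm < TRIGGER_DISTANCE_CM
    vibrate_right = right_sonar_cm < TRIGGER_DISTANCE_CM
    if lidar_drop_detected:
        vibrate_left = vibrate_right = True
    return vibrate_left, vibrate_right

# Obstacle 30 cm away on the left only: the left band vibrates.
print(wristband_signals(30, 120, False))
```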