During D Term CS, I worked with Anne Tie and Nihitha Reddy to develop an app titled LaunchGuide. We worked with Jack Peacock, a partner from the WalkFit program, which is designed to help increase awareness surrounding visually impaired people (VIPs). We specifically identified an issue with crosswalk signals.
Individuals who are blind or have low vision (hereinafter referred to as VIPs) are often fearful of venturing out on their own due to the environmental obstacles present outdoors. When VIPs do venture out, they encounter many obstacles, especially when crossing the street. At intersections, crosswalk signals typically display a white walking figure or a red hand to indicate whether it is safe to cross. However, these signals create challenges for VIPs because they often lack audible information about when it is safe to cross. Since it is imperative to improve accessibility, our project aimed to create an app that assists VIPs in crossing the road by providing auditory signals.
Globally, at least 2.2 billion people have a near or distant vision
impairment (Why Accessibility Is Important | National Center on
Deaf-Blindness, n.d.). Those who are visually impaired experience
partial or complete loss of vision that causes problems not fixable
by conventional devices, such as glasses (Blind vs. Visually
Impaired, 2019). Today, as intersections become more and more
congested, signaling schemes often follow suit and become more
complex. In the past, the standard design parameters were based on
an able-bodied person, or one with good vision, hearing, and
mobility (Accessible Pedestrian Signals: Understanding How Blind
Pedestrians Cross at Signalized Intersections, n.d.). Unfortunately,
these systems often fail to provide sufficient non-visual
information for crossing decisions by VIPs. This is due to the
information barriers that restrict an individual’s ability to
recognize and receive information from their surroundings. More
importantly, these barriers prevent them from using this information
to decide on a course of action. As 26 percent of all Americans have
some type of disability, and that percentage continues to increase,
it is imperative that design parameters meet the needs of all
pedestrians (CDC, 2023).
VIPs can travel and cross streets with human
guides, white canes, guide dogs, and many other methods. Regardless
of the aid that is used, street crossing often includes many steps.
First, the pedestrian must determine if they have reached a street.
Often, they use a combination of cues, such as the curb or slope of
the ramp, traffic sounds, and detectable warnings. Next, they must
recognize the exact street they have arrived at. This information is
not typically provided in an accessible format, so VIPs must develop
a mental map or seek assistance from other pedestrians. If the VIP
has identified that they have come to an intersection, they must
obtain critical information about intersection geometry, which
includes the location of the crosswalk, the direction of the
opposite corner, the number of intersecting streets, the width of
the street to be crossed, and whether there are any islands or
medians in the crosswalk. In order to determine these details, they
listen to vehicular sounds and traffic patterns, and search the
sidewalk area for poles with pushbuttons (Can I Cross the Street?
Considerations for a Blind Pedestrian | NADTC, n.d.). Unfortunately,
it can be difficult to determine the type of traffic control at an
intersection, which can result in VIPs failing to locate the
pedestrian push button and crossing at times other than the
pedestrian phase. After determining
the layout of the intersection, aligning to face toward the
destination curb, determining that the intersection is signalized,
and having pushed a button (if available), VIPs must recognize the
duration of the walk interval. Ultimately, VIPs have many problems
at these intersections. Since many only provide limited auditory
signals, it can be difficult for a VIP to determine whether it is
safe to cross, how much time they have to cross, or where the other
side of the road even is.
It is imperative for ubiquitous
infrastructure, such as pedestrian crosswalks, to be accessible.
Accessibility is the concept of whether a product or service can be
used by everyone, however they encounter it (Why Accessibility Is
Important | National Center on Deaf-Blindness, n.d.). Not only is
accessibility a vital part of user experience design, it also
often benefits all users. These principles are also reflected in
the Americans with Disabilities Act of 1990.
Specifically, the law outlines that the WALK/DON’T WALK cues from
the visual pedestrian signal heads should be conveyed to pedestrians
who cannot see them (Barlow, 2009).
With
this in mind, we developed an app that would be able to aid VIPs
when crossing intersections that have visual signals.
Working with Mr. Peacock, we quickly confirmed that when
VIPs try to cross the street, they often lack enough information to feel
confident crossing alone. Because of this, we developed an app that
scans the user’s surroundings in real time. It uses a lightweight
on-device binary image classification model that can identify
crosswalk signals such as the white walking figure and the red hand.
If nothing is detected, the app doesn’t return anything. However, if a
signal is detected, the app will alert the user. If a walk signal is
detected, it will return a “ding” sound, letting the user know it is
safe to cross. However, if a don’t walk signal is detected, the app
will return a “wait” sound to let the user know they should wait.
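As a rough illustration of this behavior, the snippet below sketches how a detection label might be mapped to the two sounds. The class name, label strings, and sound resources (R.raw.ding, R.raw.wait_prompt) are hypothetical stand-ins, not the app's actual code.

```java
// Illustrative sketch only: class, labels, and sound resources are hypothetical.
import android.content.Context;
import android.media.MediaPlayer;

public class SignalAudioFeedback {
    private final Context context;

    public SignalAudioFeedback(Context context) {
        this.context = context;
    }

    /** Plays a sound depending on which crosswalk signal (if any) was detected. */
    public void onSignalDetected(String label) {
        if ("walk".equals(label)) {
            // Walk signal detected: play a "ding" so the user knows it is safe to cross.
            MediaPlayer.create(context, R.raw.ding).start();
        } else if ("dont_walk".equals(label)) {
            // Don't-walk signal detected: play the "wait" prompt.
            MediaPlayer.create(context, R.raw.wait_prompt).start();
        }
        // If nothing is detected, the app stays silent.
    }
}
```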
Using Android Studio, we first implemented the camera function using
the CameraX API. This API accesses the user’s camera to provide the
app with a stream of images to be analyzed; specifically, we used the
CameraX ImageAnalysis analyzer. From there, each image is passed into our
machine learning model, which was built in Python with scikit-learn
and Keras and then converted into a TensorFlow Lite model for use in
the Android app. If the model returns a confidence of 80% or above,
the app alerts the user. If a walk signal is detected, the main
screen prompts the user to walk and provides an auditory alert; the
same is done when a don’t walk signal is detected.
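The sketch below shows what this pipeline might look like in code, reusing the hypothetical SignalAudioFeedback helper from the earlier sketch. It assumes a converted TensorFlow Lite model with a two-value output of walk / don't-walk probabilities; the preprocessing step, class names, and output shape are assumptions rather than the project's exact implementation.

```java
import androidx.camera.core.ImageAnalysis;
import androidx.camera.core.ImageProxy;
import org.tensorflow.lite.Interpreter;

import java.nio.ByteBuffer;
import java.util.concurrent.Executors;

public class CrosswalkSignalAnalyzer implements ImageAnalysis.Analyzer {
    private static final float CONFIDENCE_THRESHOLD = 0.80f;

    private final Interpreter interpreter;        // wraps the converted .tflite model
    private final SignalAudioFeedback feedback;   // hypothetical helper from the earlier sketch

    public CrosswalkSignalAnalyzer(Interpreter interpreter, SignalAudioFeedback feedback) {
        this.interpreter = interpreter;
        this.feedback = feedback;
    }

    /** Example of how the analyzer might be attached to a CameraX ImageAnalysis use case. */
    public static ImageAnalysis buildUseCase(Interpreter interpreter, SignalAudioFeedback feedback) {
        ImageAnalysis analysis = new ImageAnalysis.Builder()
                .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
                .build();
        analysis.setAnalyzer(Executors.newSingleThreadExecutor(),
                new CrosswalkSignalAnalyzer(interpreter, feedback));
        return analysis;
    }

    @Override
    public void analyze(ImageProxy image) {
        // Convert the camera frame into the model's expected input
        // (resize, normalize, pack into a ByteBuffer); details omitted here.
        ByteBuffer input = preprocess(image);

        float[][] output = new float[1][2];   // assumed [walk, don't-walk] probabilities
        interpreter.run(input, output);

        if (output[0][0] >= CONFIDENCE_THRESHOLD) {
            feedback.onSignalDetected("walk");        // 80%+ confident: walk signal
        } else if (output[0][1] >= CONFIDENCE_THRESHOLD) {
            feedback.onSignalDetected("dont_walk");   // 80%+ confident: don't-walk signal
        }
        // Below the threshold, no alert is given.

        image.close();   // release the frame so CameraX can deliver the next one
    }

    private ByteBuffer preprocess(ImageProxy image) {
        // Placeholder for the model-specific YUV-to-RGB conversion and resizing.
        throw new UnsupportedOperationException("model-specific preprocessing");
    }
}
```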
This was our minimum viable product. However, we
also implemented other pages and features for user quality of life,
since we particularly wanted to emphasize accessibility. Firstly, our
app has two themes, dark and light, that follow the user’s system
preference. We made sure to use high-contrast and
VIP-friendly color palettes to allow users with limited vision to
still navigate through the app. In addition, we implemented an
intuitive and user-friendly way to ask for permissions, ensuring that
at no point would a user solely relying on audio get lost.
Furthermore, our app is fully compatible with TalkBack, the Android
feature that reads on-screen text aloud. Upon opening the app with
TalkBack enabled, all text is read automatically, and buttons have
appropriate and intuitive descriptions. Finally, we implemented help
screens that tell the user exactly what to expect from the app.
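To make the TalkBack behavior described above concrete, here is a minimal sketch of the kind of wiring involved. The activity, view IDs, and strings are hypothetical rather than the app's actual layout; it simply shows content descriptions being attached to controls and a status view being marked as a live region so updates are spoken aloud.

```java
// Illustrative only: view IDs and strings are hypothetical, not the app's real layout.
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
import androidx.appcompat.app.AppCompatActivity;

public class MainActivity extends AppCompatActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        // Give every interactive control a description TalkBack can read aloud.
        Button helpButton = findViewById(R.id.help_button);
        helpButton.setContentDescription("Open the help screen explaining how the app works");

        // Mark the status text as a live region so TalkBack announces
        // walk / don't-walk updates without the user having to refocus it.
        View statusText = findViewById(R.id.status_text);
        statusText.setAccessibilityLiveRegion(View.ACCESSIBILITY_LIVE_REGION_POLITE);
    }
}
```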
While learning Java, we completed labs: assignments where we used our knowledge and skills to write a program that performed a specific task and demonstrated proficiency. These ranged from calculating taxes on different incomes to drawing an illusion. This particular lab was meant to show our knowledge of loop control structures and the Graphics class. For the line art lab, we were tasked with drawing straight lines inside a rectangle, connecting two adjacent sides. The lines were drawn so that their starting points and ending points were spaced equidistantly along their respective sides. Though each line was straight, the end result looked like a curve because there were so many lines. For an extra challenge, we also had the option of drawing a smaller version of the pattern inside the first rectangle. Since so many lines had to be drawn, the process had to be automated with a loop control structure, such as a for loop, which repeats a task until a certain condition is met. In this case, the task was to draw a straight line, shift over a set distance, and repeat until reaching the end of the rectangle. In addition, I added extra code that slightly changed the color of each line. Here is the result. I especially love how the colors turned out.
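A minimal sketch of the kind of loop the lab called for is shown below; the panel size, number of lines, and color step are illustrative values, not the exact ones from my submission.

```java
import java.awt.Color;
import java.awt.Graphics;
import javax.swing.JFrame;
import javax.swing.JPanel;

public class LineArtPanel extends JPanel {
    private static final int SIZE = 400;   // side length of the square drawing area
    private static final int LINES = 40;   // number of lines (and spacing steps)

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        int step = SIZE / LINES;

        for (int i = 0; i <= LINES; i++) {
            // Shift the hue slightly each iteration so the "curve" fades through colors.
            g.setColor(Color.getHSBColor(i / (float) LINES, 0.8f, 0.9f));

            // Start points march down the left side while end points march back
            // along the top: each line is straight, but together they trace a curve.
            g.drawLine(0, i * step, (LINES - i) * step, 0);
        }
    }

    public static void main(String[] args) {
        JFrame frame = new JFrame("Line Art");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.add(new LineArtPanel());
        frame.setSize(SIZE + 20, SIZE + 40);
        frame.setVisible(true);
    }
}
```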
Aside from labs, we also did many smaller exercises that applied our knowledge to various coding tasks. One such task was to write a program that draws ten stars of random sizes in random places. This was done using a combination of algebra, trigonometry, and geometry, as well as static arrays in Java. A static array is essentially a fixed-length list that holds a sequence of values; in this case, the arrays held the x and y coordinates of each vertex of the star, which was then drawn as a polyline. First, a random center point for the star was generated, and then a random radius. From there, the locations of the remaining vertices were calculated and the star was drawn. This was repeated ten times. The end result changes every time the program is run, especially because I decided to give each star a random color as well. However, this shows what running the program might look like.
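Below is a sketch of how such a star-drawing program might look; the number of points, the inner-to-outer radius ratio, and the size ranges are illustrative choices rather than the exact assignment values.

```java
import java.awt.Color;
import java.awt.Graphics;
import java.util.Random;
import javax.swing.JFrame;
import javax.swing.JPanel;

public class RandomStarsPanel extends JPanel {
    private static final int WIDTH = 600, HEIGHT = 600;
    private final Random rand = new Random();

    @Override
    protected void paintComponent(Graphics g) {
        super.paintComponent(g);
        for (int s = 0; s < 10; s++) {
            drawStar(g,
                    rand.nextInt(WIDTH),        // random center x
                    rand.nextInt(HEIGHT),       // random center y
                    20 + rand.nextInt(60));     // random outer radius
        }
    }

    private void drawStar(Graphics g, int cx, int cy, int outerRadius) {
        int points = 5;
        int innerRadius = outerRadius / 2;
        // Static arrays holding the x and y coordinate of every vertex,
        // plus one extra slot to repeat the first vertex and close the outline.
        int[] xs = new int[2 * points + 1];
        int[] ys = new int[2 * points + 1];

        for (int i = 0; i < 2 * points; i++) {
            // Alternate between the outer and inner radius as the angle sweeps
            // around the circle, tracing out the star's zig-zag outline.
            double angle = Math.PI / points * i - Math.PI / 2;
            int r = (i % 2 == 0) ? outerRadius : innerRadius;
            xs[i] = cx + (int) Math.round(r * Math.cos(angle));
            ys[i] = cy + (int) Math.round(r * Math.sin(angle));
        }
        xs[2 * points] = xs[0];
        ys[2 * points] = ys[0];

        // Each star gets a random color, so the output differs on every run.
        g.setColor(new Color(rand.nextInt(256), rand.nextInt(256), rand.nextInt(256)));
        g.drawPolyline(xs, ys, xs.length);
    }

    public static void main(String[] args) {
        JFrame frame = new JFrame("Random Stars");
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.add(new RandomStarsPanel());
        frame.setSize(WIDTH, HEIGHT);
        frame.setVisible(true);
    }
}
```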