{ Computer Science }

Computer Science is taught by Mrs. Taricco. This class has been a great way to develop our coding skills. We started by creating our very own website from scratch using HTML and CSS. Some of us had never used those two languages before, but by the end, we were pretty comfortable with them (practically professionals). Then the intermediate CS kids started learning the basics of Java, everything from data types to if-statements and recursion, and we did a lot of labs and exercises in class to develop our Java skills. Currently, we are gearing up to start Apps For Good, a project where we develop an app to address a problem in the community, as well as our first ACSL competition.

For this assignment we used Java's Applet class, drawing with its Graphics object. We were given a picture of a line design (see below) and asked to replicate it. The result is really cool, but the hardest part of the assignment was visualizing how in the world to draw it. Ultimately, it came down to the spacing between the lines along the x- and y-axes, but it took a lot of trial and error to find the right values.
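Here is a rough sketch of the idea. The real assignment was written in Java with the Applet and Graphics classes, so this TypeScript canvas version is just an illustration, and the step value is one I made up, not the one from my design:

```typescript
// A sketch of the line-design idea. The real assignment was Java
// (Applet + Graphics); this TypeScript/canvas version is illustrative.
const canvas = document.createElement("canvas");
canvas.width = 400;
canvas.height = 400;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

const size = 400;
const step = 20; // spacing between endpoints on each axis: the magic number

// Classic curve-stitching: one endpoint walks down the left edge while
// the other walks across the bottom edge, and the envelope of all the
// straight lines ends up looking like a curve.
for (let i = 0; i <= size; i += step) {
  ctx.beginPath();
  ctx.moveTo(0, i);    // endpoint on the left edge
  ctx.lineTo(i, size); // endpoint on the bottom edge
  ctx.stroke();
}
```

Getting `step` right was the whole game: too small and the design turns into a solid blob, too big and the "curve" looks like a handful of sticks.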

Picture of my line art

We also used the Applet class for this assignment, this time to draw stars. This one was much easier to visualize, but much harder to make a reality. There was a bunch of math involved in finding the coordinate points (lots of trigonometry), but the hardest part was figuring out how to make sure the stars never overlapped (this wasn't a requirement for the exercise, but it was an extra challenge a couple of other kids included). That required a couple of nested while loops, and the program's performance came down to the size of the stars: if the radii were too large, randomly placed stars were more likely to overlap, so the program spent longer hunting for valid spots. At one point, I overloaded my computer with so many stars that it crashed, and I had to restart it! Again, it was a lot of trial and error (most of computer science is), but it is always so rewarding when the project you are working on finally works.
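For the curious, here is roughly what the math looked like. Again, the actual assignment was Java Applet code; this TypeScript sketch just shows the trig and the overlap check, and every name and number in it is illustrative:

```typescript
// A sketch of the star math; all names and numbers are illustrative.
type Point = { x: number; y: number };

// A five-pointed star alternates between an outer and an inner radius,
// stepping 36 degrees (PI / 5) per vertex, starting pointed up.
function starPoints(cx: number, cy: number, outer: number): Point[] {
  const inner = outer * 0.4; // inner/outer ratio chosen by eye
  const pts: Point[] = [];
  for (let i = 0; i < 10; i++) {
    const r = i % 2 === 0 ? outer : inner;
    const angle = -Math.PI / 2 + (i * Math.PI) / 5;
    pts.push({ x: cx + r * Math.cos(angle), y: cy + r * Math.sin(angle) });
  }
  return pts;
}

// Two stars can't overlap if their bounding circles never touch.
function overlaps(a: Point, ra: number, b: Point, rb: number): boolean {
  return Math.hypot(a.x - b.x, a.y - b.y) < ra + rb;
}

// Rejection sampling: keep trying random centers until one fits. Bigger
// radii mean more rejections, which is exactly why my version slowed to
// a crawl (and once took my whole computer down with it).
function placeStars(count: number, radius: number, size: number): Point[] {
  const placed: Point[] = [];
  let attempts = 0;
  while (placed.length < count && attempts < 100_000) {
    attempts++;
    const c: Point = {
      x: radius + Math.random() * (size - 2 * radius),
      y: radius + Math.random() * (size - 2 * radius),
    };
    if (!placed.some((p) => overlaps(c, radius, p, radius))) {
      placed.push(c);
    }
  }
  return placed;
}
```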

Picture of My Stars

Apps For Good

{ MediSign }

I worked with Armaan Priyadarshan and Jared Rosen, and together we are the PAJamas! We spent 2 months building an app for our Apps For Good project, and I am proud to present to you: MediSign!

Problem

The hearing community has difficulty communicating with the deaf community, especially in the medical field. Specialized medical offices and hospitals designed for deaf and hard-of-hearing individuals exist, but they are often unavailable during emergencies, when the goal is to bring the patient to the nearest available clinic, which is usually not one designed for deaf patients. Clinics and hospitals that are not specialized for deaf individuals therefore need services and tools to aid communication between medical professionals and deaf patients. Methods and technologies to address this already exist, such as interpreters, translation apps, online sign language dictionaries, and sign language courses, but each has its limitations, leaving a significant gap in the need for easy, fast, reliable, and affordable translation for doctors in emergencies.

MediSign addresses this communication barrier by offering streamlined access to American Sign Language (ASL) signs through a mobile app. By providing a curated list of common questions, symptoms, and responses sourced from reputable healthcare institutions, retrieving the corresponding ASL signs from a Firebase database, and displaying them as sequences of videos, the app facilitates effective communication in medical settings. Built with React Native, Expo, and Firebase, MediSign runs on both Android and iOS, enhancing the healthcare experience for patients and providers alike.

Logo

MediSign Logo

Target Audience

Our target audience is doctors who work with deaf patients. Because that phrase could mean a range of things, we decided to reach out to experts in the medical field, specifically those who work in deaf programs at hospitals, to help refine the scope of our project. Jessie Wilson, a medical professional at the Worcester Recovery Center and Hospital who is himself deaf, responded and gave us valuable insight into our audience: first, this app would be helpful in general-care situations, and second, it would be useful for awkward conversations. Mr. Wilson's response gave us a real-world perspective on our project and valuable information to tailor it directly to our audience (J. Wilson, personal communication, March 19, 2024). After this conversation, we decided to focus on facilitating communication for general-care providers, making those doctors the main audience for this app. There is also a secondary audience: the deaf patients themselves. Although they may not be the ones using the app directly, they are affected by it just as much as, if not more than, the doctors. A deaf or hard-of-hearing patient could download the app and hand it to a doctor whenever they feel the need, such as during a potentially embarrassing conversation where they might prefer that an interpreter leave the room.

The goal of this project is to help the hearing community accommodate the deaf community instead of the reverse. There is a lot of sign-to-text and sign-to-speech technology already in existence, but the majority of it centers on the deaf community accommodating the hearing community. After talking to Kim Stegbuchner, an ASL teacher, we learned that this is one of the main drawbacks of ASL communication technology. She said, “I think there is too much emphasis on helping the Deaf community instead of helping us [accommodate them] and letting them be enough as they are” (K. Stegbuchner, personal communication, February 1, 2024). We structured our app around this feedback. When doctors interact with the app, they should be able to look up the signs for the word or question they need, watch the signs on the screen, and sign it back to their patient, all in under a minute. If this process takes too long, the conversation becomes slow and choppy, which is frustrating for both patient and doctor. We are hoping that this quick “look-up and sign” process will act as a first step toward bridging the communication gap between hearing general-care doctors and deaf patients.

Picture of HomeScreen

MediSign Home Screen

MVP

MediSign is a mobile app that aims to give medical professionals streamlined access to ASL signs to facilitate communication with deaf or hard-of-hearing patients during appointments. The app does this with three main features: presenting the user with a list of common questions, symptoms, and responses that might come up during check-ins; retrieving the corresponding ASL signs from a database once the user selects an option; and displaying the signs through a sequence of videos. Supported platforms include Android and iOS. We use React Native and Expo for the user interface and Firebase for database management.

First, the user is given common questions, symptoms, and responses to choose from. The list of questions and symptoms was sourced online from Horizon Healthcare, Nova Southeastern University, and the Royal College of Obstetricians and Gynaecologists (Horizon Healthcare, n.d.; Nova Southeastern University, n.d.; Royal College of Obstetricians and Gynaecologists, n.d.). The sample responses consist of yes, no, and maybe, so the user can understand patient answers. Upon opening the app, the user is greeted with a menu of these three categories (questions, symptoms, and responses); after choosing one, they are taken to the corresponding list of items in that category. This menu is organized and formatted with React Native, with navigation functionality from React Navigation.
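To give you an idea of how this fits together, here is a minimal sketch of the menu and navigation. The screen and route names ("Home", "List") are illustrative rather than our exact code, but the React Navigation pattern is the one we used:

```tsx
// Minimal sketch of the category menu and navigation; names are illustrative.
import React from "react";
import { Button, Text, View } from "react-native";
import { NavigationContainer } from "@react-navigation/native";
import {
  createNativeStackNavigator,
  NativeStackScreenProps,
} from "@react-navigation/native-stack";

type Category = "questions" | "symptoms" | "responses";
type RootStackParamList = { Home: undefined; List: { category: Category } };

const Stack = createNativeStackNavigator<RootStackParamList>();

// Home screen: one button per category; tapping one navigates to the
// list screen for that category.
function HomeScreen({
  navigation,
}: NativeStackScreenProps<RootStackParamList, "Home">) {
  const categories: Category[] = ["questions", "symptoms", "responses"];
  return (
    <View>
      {categories.map((category) => (
        <Button
          key={category}
          title={category}
          onPress={() => navigation.navigate("List", { category })}
        />
      ))}
    </View>
  );
}

// List screen: fetches and renders the items for the chosen category
// (the Firebase read is sketched in the next section).
function ListScreen({
  route,
}: NativeStackScreenProps<RootStackParamList, "List">) {
  return <Text>{route.params.category}</Text>;
}

export default function App() {
  return (
    <NavigationContainer>
      <Stack.Navigator initialRouteName="Home">
        <Stack.Screen name="Home" component={HomeScreen} />
        <Stack.Screen name="List" component={ListScreen} />
      </Stack.Navigator>
    </NavigationContainer>
  );
}
```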

Once the user selects a category, the corresponding list for that category is read from Firebase. We chose a Firebase Firestore database for persistent storage because it makes organization, scaling, and clean code easier. In Firestore, the data is separated into three collections, one per category. Each collection is a set of documents holding the questions, symptoms, or responses we collected, each with either a corresponding videoId for the ASL sequence or a reference image. After being retrieved with the Firebase Modular SDK, the list of questions, symptoms, or responses is formatted with React Native and displayed as an array of buttons for the user to choose from to obtain the ASL sequence.
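Here is roughly what that read looks like with the Modular SDK. The collection and field names ("questions", "text", "videoId") are my from-memory description of our schema, so treat them as illustrative rather than exact:

```typescript
// Sketch of the Firestore read with the Firebase Modular SDK.
import { initializeApp } from "firebase/app";
import { collection, getDocs, getFirestore } from "firebase/firestore";

const app = initializeApp({
  // our Firebase config (API key, project ID, etc.) goes here
});
const db = getFirestore(app);

type Entry = { text: string; videoId: string };

// Each category ("questions", "symptoms", "responses") is its own
// collection; each document holds the phrase plus the videoId of its
// ASL sequence (or a reference image).
async function fetchCategory(category: string): Promise<Entry[]> {
  const snapshot = await getDocs(collection(db, category));
  return snapshot.docs.map((doc) => doc.data() as Entry);
}
```

The returned array is what gets rendered as the column of buttons; pressing one opens the video screen for that entry's videoId.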

After a question, symptom, or response has been chosen and retrieved from Firebase, MediSign displays the ASL sequence to show the user how to sign what they want to say. ASL sequences take the form of public-domain videos sourced from YouTube. Each video's URL contains an ID, which can be used to embed it in React Native; these IDs are stored in the videoId property of each document in Firebase, and the react-native-youtube-iframe package uses the videoId value to display the video. While the video plays, a reference image is displayed below it for the general-care provider to show the patient in case there are communication struggles. These reference images are also sourced from the public domain and correspond to the question, symptom, or response.
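And here is a sketch of that display screen. YoutubePlayer really does take the bare YouTube videoId; the layout, prop names, and image styling are placeholders of mine:

```tsx
// Sketch of the video + reference-image screen.
import React from "react";
import { Image, View } from "react-native";
import YoutubePlayer from "react-native-youtube-iframe";

type Props = { videoId: string; referenceImageUrl: string };

function SignScreen({ videoId, referenceImageUrl }: Props) {
  return (
    <View>
      {/* Embedded YouTube video of the ASL sequence */}
      <YoutubePlayer height={220} play={true} videoId={videoId} />
      {/* Reference image shown below the video for the patient */}
      <Image
        source={{ uri: referenceImageUrl }}
        style={{ width: "100%", height: 220 }}
        resizeMode="contain"
      />
    </View>
  );
}

export default SignScreen;
```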

Process

We started by doing preliminary research; we wanted to understand the scope of the problem we were addressing. At some point, we realized that what we were learning from online sources wasn't enough; we needed the expertise of someone who worked with deaf patients in the medical field. That's when we reached out to Mr. Wilson. The feedback he provided steered us in the right direction as we made the list of common questions, symptoms, and responses we wanted our app to include. Once we had our list, we started building the app. Jared and I focused mainly on the UI while Armaan did most of the back end. We decided to use Firebase to store all of our videos and reference pictures for each question, symptom, and response. Armaan wired up Firebase while Jared and I found the videos and pictures and created the playlists (to display questions in ASL).

Everything was going well, and we had an MVP pretty early. We set up a time to test with Mr. Wilson in person, and of course, that was the day our app decided to break. We managed to get something running so Mr. Wilson could still test the app, but we spent the next three weeks troubleshooting; for some reason, our app would not run. Here is where I will give you some sage advice: Do not. Under any circumstances. Use Expo Go. You may be tempted to. DO NOT fall into the trap. You're welcome.

After a couple of painful and frustrating weeks, we finally got the app to build on an Android tablet (thank you, Armaan!), and we were able to test our app and present it at the fair. To be honest, I love our app. I think it is such a cool idea, and I hope we continue to work on and improve it, because I do think it could be an app for good ;)

Sample of Our Testing Log

MediSign Test Cases

MediSign Poster

Shout out to Jared and Armaan! I loved working with you guys. I think we make a pretty good team.

01000011 01101111 01101101 01110000 01010011 01100011 01101001 01101001 01110011 01100110 01110101 01101110

If you can translate that, I have major respect for you.