Computer Science

Mrs. Taricco

Computer Science is a class taught by Mrs. Taricco. Throughout the course of the year, we complete three major projects: creating a personal website (what you are looking at right now), creating a software product, and creating a mobile application to help out the community. To carry out these projects, we use an online coding platform to learn various programming languages. We started the year by learning HTML and CSS to structure and design web pages. As we got deeper into the course, we began learning the fundamentals of Java. We have covered many topics, including arrays, iteration, boolean statements, different data types, and testing the validity and accuracy of specific algorithms. In December, we began participating in the American Computer Science League (ACSL), a competition for high school students consisting of four contests.

Bulgarian Solitaire

The Bulgarian Solitaire exercise was part of our ArrayLists unit. The program starts with a triangular number of cards, for example 45, and plays Bulgarian Solitaire until there are piles of sizes 1, 2, 3, 4, 5, 6, 7, 8, and 9 in some order. Each round of the game, you take one card from every pile and stack those cards into a new pile. The purpose of this program is to show that if the number of cards you start with is a triangular number, you will always end up with a configuration like the one in the example above. This was one of my favorite assignments to code because I could take skills from various units and apply them to one problem.
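A simplified sketch of the game loop (the starting split of the 45 cards here is just an example, not necessarily the one my program used):

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;

    public class BulgarianSolitaire {
        public static void main(String[] args) {
            // 45 cards (a triangular number), split into arbitrary piles
            ArrayList<Integer> piles = new ArrayList<>(List.of(20, 15, 10));

            while (!isFinal(piles)) {
                int newPile = piles.size(); // one card will come from each pile
                // Take one card from every pile; drop piles that run out.
                for (int i = piles.size() - 1; i >= 0; i--) {
                    int remaining = piles.get(i) - 1;
                    if (remaining == 0) piles.remove(i);
                    else piles.set(i, remaining);
                }
                piles.add(newPile); // the removed cards form a new pile
                System.out.println(piles);
            }
        }

        // The game ends when the pile sizes are exactly 1 through 9 in some order.
        private static boolean isFinal(ArrayList<Integer> piles) {
            ArrayList<Integer> sorted = new ArrayList<>(piles);
            Collections.sort(sorted);
            if (sorted.size() != 9) return false;
            for (int i = 0; i < 9; i++) {
                if (sorted.get(i) != i + 1) return false;
            }
            return true;
        }
    }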

Stars

My favorite coding assignments this year were the ones that incorporated math. In the Stars exercise, which was part of the Static Arrays unit, we had to generate 10 stars of different sizes placed at different locations. To do so, I started by drawing the stars out on paper and soon realized that I could use mathematical concepts such as trigonometry to help. Once I understood the math behind the stars, it was a lot of fun to turn it into code!
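The core trig idea: the points of a five-pointed star alternate between an outer and an inner radius, stepping 36 degrees at a time around the center. A sketch of that idea (not my exact assignment code):

    import java.awt.geom.Point2D;

    public class StarPoints {
        // Returns the 10 vertices of a five-pointed star centered at (cx, cy).
        public static Point2D.Double[] starVertices(double cx, double cy, double outer) {
            double inner = outer * 0.382; // inner radius of a regular five-pointed star
            Point2D.Double[] pts = new Point2D.Double[10];
            for (int i = 0; i < 10; i++) {
                double r = (i % 2 == 0) ? outer : inner; // alternate outer/inner points
                double angle = Math.toRadians(-90 + i * 36); // start at the top point
                pts[i] = new Point2D.Double(cx + r * Math.cos(angle),
                                            cy + r * Math.sin(angle));
            }
            return pts;
        }
    }

Connecting the 10 vertices in order draws the star; changing cx, cy, and outer moves and scales it, which is how 10 stars of different sizes and positions can come from the same formula.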

Apps For Good: LaunchGuide

Developers: Nihitha Reddy, Anne Tie, Amy Chen

Advisors: Jack Peacock, Angela Taricco

Our Team

[Team photo]

Our team includes me (in the middle), Anne (on the left), and Amy (on the right). Although we are all in the same section, we became much closer in the process of making this app. We combined all of our strengths and had fun while doing so.

Community Need

LaunchGuide is an app that helps visually impaired people (VIPs) cross the street at crosswalks. Most crosswalks do not play a sound when the crossing signal changes, making it difficult for the visually impaired to know when it is safe to cross.

Target Audience

LaunchGuide is tailored for those who are visually impaired. Since our app has only two screens (the home screen and the detection screen), it is easy to navigate and user-friendly. We also incorporated light and dark themes with highly contrasting colors: the dark theme has a black background with a bright yellow button, and the light theme has a white background with a dark blue button. This makes the app easier to see for users who still have partial vision.
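A sketch of how the two color schemes could be applied in code (the view parameters here are placeholders, not our real layout IDs):

    import android.graphics.Color;
    import android.view.View;
    import android.widget.Button;

    public class ContrastHelper {
        // Applies one of the two high-contrast schemes described above.
        public static void applyTheme(View background, Button action, boolean dark) {
            if (dark) {
                background.setBackgroundColor(Color.BLACK); // dark: black + bright yellow
                action.setBackgroundColor(Color.YELLOW);
            } else {
                background.setBackgroundColor(Color.WHITE); // light: white + dark blue
                action.setBackgroundColor(Color.rgb(0, 0, 139));
            }
        }
    }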

The Solution: LaunchGuide

When developing an app for VIPs in Android Studio, there were key specifications to consider. First, the app was designed with accessibility in mind, supporting features such as screen readers and voice commands so that visually impaired users can navigate it more easily. The app also uses high-contrast colors and large fonts to improve visibility for users with low vision. Additionally, the user interface uses standard icons and placements for navigation buttons. By keeping these specifications in mind while developing in Android Studio, our project aims to be accessible, compatible, and user-friendly for visually impaired individuals. To detect crossing signals, the app uses an on-device image classification mechanism. For the app to be effective, images must be detected and classified in real time, so the app passively takes pictures of the user's surroundings at five-second intervals. Finally, the app outputs whether the person should walk or not, along with the confidence of that classification.
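This write-up does not pin down the model runner, but TensorFlow Lite's Interpreter is a common way to do on-device image classification on Android, so here is a sketch assuming it (the label names and output shape are placeholders):

    import org.tensorflow.lite.Interpreter;
    import java.io.File;
    import java.nio.ByteBuffer;

    public class SignalClassifier {
        private static final String[] LABELS = {"walk", "dont_walk"}; // placeholder labels
        private final Interpreter interpreter;

        public SignalClassifier(File modelFile) {
            interpreter = new Interpreter(modelFile); // load the on-device model
        }

        // input: a preprocessed ByteBuffer of pixels matching the model's input shape
        public String classify(ByteBuffer input) {
            float[][] scores = new float[1][LABELS.length];
            interpreter.run(input, scores); // run inference on the device
            int best = scores[0][0] >= scores[0][1] ? 0 : 1;
            // report the predicted label and its confidence score
            return LABELS[best] + " (" + scores[0][best] + ")";
        }
    }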

App Architecture

When you first open our home screen, you will see an introductory paragraph explaining our app's purpose. This paragraph works with Android's TalkBack feature, which reads it out loud to the user. We then take the user through the necessary permissions and redirect them to the camera screen. The camera provides a byte buffer as input, which is passed through an image classification machine learning model. All of the images that the model was trained on were annotated with bounding boxes. The machine learning code was run on Google Colab, and the app code was written in Android Studio. Lastly, the app outputs whether the user should walk across the crosswalk or not.
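A simplified sketch of how the five-second capture cycle can be scheduled on Android (the capture-and-classify step is passed in as a Runnable; the real camera code is more involved):

    import android.os.Handler;
    import android.os.Looper;

    public class CaptureScheduler {
        private static final long INTERVAL_MS = 5000; // five seconds between captures
        private final Handler handler = new Handler(Looper.getMainLooper());

        // captureAndClassify: takes a picture, classifies it, and announces the result
        public void start(Runnable captureAndClassify) {
            handler.post(new Runnable() {
                @Override
                public void run() {
                    captureAndClassify.run();
                    handler.postDelayed(this, INTERVAL_MS); // reschedule the next capture
                }
            });
        }

        public void stop() {
            handler.removeCallbacksAndMessages(null); // cancel pending captures
        }
    }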

Our MVP

The minimum viable product (MVP) of our app consists of an accessible user interface and a camera that captures images at five-second intervals and runs them through an on-device image classification algorithm.

Our Project Proposal

To learn more about our app, feel free to read through our project proposal!