
Projects Portfolio

iOS App for Rashal Energies

Developed an iOS app in Swift using the MVVM architecture to improve the gas-purchasing experience in Tanzania, where mobile money through cellular carriers is the primary payment method.

Key Features:

Mobile Wallet Integration: Enables users to add mobile money to their Rashal accounts, pay for gas, and transfer funds to other wallets seamlessly.

Sales Order System: Designed for companies to prepay for large gas orders. Generates QR codes for drivers, ensuring accurate and efficient order fulfillment.

Customer QR Codes: Allows individual customers to prepay for gas deliveries, share prepaid gas with others, or facilitate large deliveries with a unique QR code system.

Rewards Program: Automatically calculates and applies user rewards, redeemable for future purchases.
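As an illustration of the Rewards Program logic described above, here is a minimal sketch of automatic point accrual and redemption. The earn rate, redemption value, and the `Account` class are my own assumptions for the example, not the app's actual values or types (the app itself is written in Swift).

```python
# Hypothetical sketch of a rewards-program flow: points accrue on each
# purchase and can be redeemed as a credit toward future purchases.
# Rates below are assumptions, not the app's real values.
from dataclasses import dataclass

POINTS_PER_SHILLING = 0.01   # assumed earn rate: 1 point per 100 TZS spent
SHILLINGS_PER_POINT = 1.0    # assumed redemption value of one point

@dataclass
class Account:
    balance: float = 0.0   # mobile-money balance (TZS)
    points: float = 0.0    # reward points

    def pay_for_gas(self, amount: float) -> None:
        """Deduct a purchase and accrue rewards automatically."""
        if amount > self.balance:
            raise ValueError("insufficient balance")
        self.balance -= amount
        self.points += amount * POINTS_PER_SHILLING

    def redeem(self, points: float) -> float:
        """Convert points into a credit applied to the balance."""
        points = min(points, self.points)
        self.points -= points
        credit = points * SHILLINGS_PER_POINT
        self.balance += credit
        return credit

acct = Account(balance=50_000)
acct.pay_for_gas(20_000)           # earns 200 points
credit = acct.redeem(acct.points)  # applies the credit to the balance
```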

Click here to see screenshots of the app in use.


MineSweeper Agent

Unlike purely deterministic games such as chess, Minesweeper combines hidden information with probabilistic elements, requiring solvers to handle a great deal of uncertainty.

Although the optimal Minesweeper strategy is known to be always choosing the safest tile, in this project Hluemelo Notshe, Joshua Martinez, and I introduced a second objective: minimizing the number of moves. Our agent must therefore weigh the chance of winning in as few moves as possible against the risk of a misstep, demanding strategic depth and flexible decision-making.
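To make the safest-tile baseline concrete, the sketch below (not our actual agent's code) estimates per-tile mine probabilities by enumerating every mine placement consistent with the revealed numbers; the safest-tile policy then picks the minimum, while our agent additionally weighed how much each guess shortens the game.

```python
# Illustrative probability estimation for Minesweeper: enumerate all
# placements of n_mines among the hidden frontier tiles that satisfy
# every revealed-number constraint, then count how often each tile
# holds a mine. Brute force, so only suitable for small frontiers.
from itertools import combinations

def mine_probabilities(frontier, constraints, n_mines):
    """frontier: hidden tiles; constraints: (tile_set, mine_count)
    pairs derived from revealed numbers."""
    counts = {t: 0 for t in frontier}
    valid = 0
    for mines in combinations(frontier, n_mines):
        placed = set(mines)
        if all(len(placed & tiles) == k for tiles, k in constraints):
            valid += 1
            for t in placed:
                counts[t] += 1
    return {t: counts[t] / valid for t in frontier} if valid else {}

# Toy example: a revealed "1" touches tiles A and B; one mine is
# hidden somewhere among A, B, C.
probs = mine_probabilities(["A", "B", "C"], [({"A", "B"}, 1)], n_mines=1)
# C can never be the mine, so the safest-tile policy opens C.
```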

See our presentation here and our code here


Chunreal - Visualizing FFTs

Using the Chunreal plugin for Unreal Engine, I wrote ChucK code that visualizes FFTs responding in real time to whatever music or sound your computer is outputting.
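The project itself is written in ChucK, but as a rough Python analogue, this sketch computes the magnitude spectrum of one windowed audio frame, which is the data each visual frame in Unreal responds to. The block size and sample rate are illustrative assumptions.

```python
# Compute the FFT magnitude spectrum of a single audio frame, the
# per-frame data behind an audio-reactive FFT visualization.
import numpy as np

def magnitude_spectrum(frame: np.ndarray) -> np.ndarray:
    """Hann-windowed FFT magnitudes for one block of samples."""
    windowed = frame * np.hanning(len(frame))
    return np.abs(np.fft.rfft(windowed))

# Sanity check: a pure 1 kHz tone at a 44.1 kHz sample rate should
# peak near bin 1000 * N / 44100.
sr, n = 44100, 1024
t = np.arange(n) / sr
spec = magnitude_spectrum(np.sin(2 * np.pi * 1000 * t))
peak_bin = int(np.argmax(spec))
```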

Find the presentation and code here


Stanford Laptop Orchestra Performances

Leo Jacoby, Matan Abrams, and I used the ChucK programming language and Open Sound Control (OSC) connections to multiple other performers to create a live performance involving Gametrak controllers, spatialization, and granular synthesis.
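The performance code is in ChucK, but the core idea of granular synthesis can be sketched in a few lines of Python: short windowed grains are copied from a source buffer and overlapped at random positions in the output. Grain length and density here are assumptions for illustration.

```python
# Minimal granular synthesis: scatter Hann-windowed grains taken from
# random offsets in a source buffer into random offsets of the output.
import numpy as np

def granulate(source, out_len, grain_len=512, n_grains=200, seed=0):
    rng = np.random.default_rng(seed)
    window = np.hanning(grain_len)
    out = np.zeros(out_len)
    for _ in range(n_grains):
        src = rng.integers(0, len(source) - grain_len)  # where to read
        dst = rng.integers(0, out_len - grain_len)      # where to write
        out[dst:dst + grain_len] += source[src:src + grain_len] * window
    return out

source = np.sin(np.linspace(0, 200 * np.pi, 44100))  # 100 cycles of a sine
cloud = granulate(source, out_len=22050)             # half a second of grains
```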

Find the code here and performance here

 
 
 

Sign-Language Assistant (Beta)

 

American Sign Language (ASL) is, in practice, an inaccessible "accessible" language: there is a clear lack of engaging online resources that effectively teach it. To change this, Josh Martinez, Luca Wheeler, and I built a model that classifies the ASL alphabet in real time from a user's webcam, by training a small convolutional neural network that can run on a personal computer.

To make the platform more engaging, we deployed the model in a game to increase accessibility across ages and allow for a more enjoyable learning experience. 
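This is not our model, but as an illustration of its basic building block, the sketch below implements a single 2-D convolution with ReLU, the kind of layer a small CNN stacks to pick out edges and shapes in webcam frames before classifying a letter.

```python
# One convolutional layer by hand: slide a kernel over the image
# (valid mode) and apply ReLU to the result.
import numpy as np

def conv2d_relu(image, kernel):
    """Valid-mode 2-D convolution followed by ReLU."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.maximum(out, 0.0)

# A vertical-edge kernel responds only at the boundary of a
# half-black, half-white image.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
edges = conv2d_relu(img, np.array([[-1.0, 1.0]]))
```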

Find our write-up and code here, and our presentation/demo here



Face2Spotify
 

A program that sends API requests to Spotify based on the motions you perform in front of your camera. Doing nothing lets the song continue as usual; making a stop sign with your hand pauses/plays the music; pointing to the right with your right hand skips the song; pointing to the right with your left hand fast-forwards the song; and dancing plays a random song from a disco playlist. Here is the demo (code is in the description)
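The gesture-to-request mapping can be sketched as a simple dispatch table. Gesture detection itself is omitted; the endpoint paths follow the Spotify Web API player endpoints, and the gesture labels are my own naming for this example, not the program's actual identifiers.

```python
# Map a detected gesture label to the Spotify Web API request it
# should trigger; None means leave playback alone.
SPOTIFY_ACTIONS = {
    "none":        None,                             # keep playing
    "stop_sign":   ("PUT",  "/v1/me/player/pause"),  # pause/play
    "point_right": ("POST", "/v1/me/player/next"),   # skip track
    "point_left":  ("PUT",  "/v1/me/player/seek"),   # fast-forward
    "dancing":     ("PUT",  "/v1/me/player/play"),   # random disco track
}

def dispatch(gesture: str):
    """Return the (method, endpoint) for a gesture, or None."""
    return SPOTIFY_ACTIONS.get(gesture)
```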


Beatboxing a beat

 

In this project, I created a program that listens to your mic input in real time and plays back a sample of the closest-sounding percussive instrument. I did this by extracting four features (centroid, flux, RMS, and MFCC) from 15 recordings of me making beatboxing sounds (mosaic2.ck), then mapping those features to WAV files of real percussive instruments (kicks, snares, hi-hats, toms, etc.) in real time (Mosaic-mic.ck). The code and video can be found here
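The matching step above can be sketched as a nearest-neighbor lookup in feature space (the real version runs in ChucK): each mic frame's feature vector is compared against the pre-analyzed beatbox recordings, and the closest one triggers its mapped drum sample. The feature values below are made up for illustration.

```python
# Nearest-neighbor sample selection: pick the trained sound whose
# feature vector (centroid, flux, RMS, MFCC-like value here) is
# closest to the incoming frame's features.
import numpy as np

TRAINED = {
    "kick.wav":  np.array([0.10, 0.20, 0.90, 0.30]),
    "snare.wav": np.array([0.60, 0.80, 0.50, 0.70]),
    "hihat.wav": np.array([0.95, 0.40, 0.20, 0.10]),
}

def closest_sample(features: np.ndarray) -> str:
    """Return the trained sample nearest by Euclidean distance."""
    return min(TRAINED, key=lambda k: np.linalg.norm(TRAINED[k] - features))

hit = closest_sample(np.array([0.12, 0.25, 0.85, 0.28]))  # kick-like frame
```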


The Wheel

My friend Leo Jacoby and I built an instrument from scratch out of a bike wheel and an Arduino. All sounds are coded in ChucK and Python, and our three controllable parameters were the instrument's gain, distortion, and pitch. Gain is controlled by the spin rate: magnets placed along the center spire of the wheel pass a Hall sensor, which counts each pass to measure how fast the wheel is spinning. Pitch changes with the instrument's forward/backward tilt and distortion with its left/right tilt, both detected by an accelerometer. Here's the video demonstration
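The accelerometer mapping can be sketched as below. The specific ranges (half an octave of pitch travel, a 0-to-1 distortion amount, a 220 Hz base pitch) are assumptions for illustration, not the instrument's actual tuning.

```python
# Map normalized tilt readings to synthesis parameters:
# forward/back tilt bends pitch, left/right tilt sets distortion.
def tilt_to_params(pitch_tilt: float, dist_tilt: float, base_hz: float = 220.0):
    """Tilts are normalized to [-1, 1]; returns (frequency_hz, distortion)."""
    clamp = lambda x: max(-1.0, min(1.0, x))
    freq = base_hz * 2 ** (clamp(pitch_tilt) * 0.5)  # +/- half an octave
    distortion = abs(clamp(dist_tilt))               # 0 = clean, 1 = max
    return freq, distortion

freq, dist = tilt_to_params(0.0, 0.0)  # level: base pitch, no distortion
```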

 