GeoXplorer

Skills: Unity, C#, ARFoundation

Download Unity Project
View Unity Scripts on Github
Download GeoXplorer from App Store

GeoXplorer is a mobile app that enables users to “Xplore” a wide range of 3D models from Earth, the Moon, and Mars using Augmented Reality.

GeoXplorer features a catalogue of 3D models you can pick to view in AR. These models are split into four categories: Martian Models, Crystal Lattice, Digital Elevation Models, and Hand Samples. Once you select and download a model, you can point and tap anywhere in the world around you to place the 3D model in AR.

If the model is too small or not oriented correctly, GeoXplorer offers controls to adjust its size, elevation, and rotation.

GeoXplorer is currently only available for iOS. Android support coming soon!



GeoBase

Skills: JavaScript, React, React Router, Axios, FormData, three.js, Heroku

Visit GeoBase

GeoBase is a React app that planetary scientists use to browse, download, and upload 3D models. Using three.js, GeoBase loads high-fidelity 3D models directly into your browser window, allowing you to drag and rotate the models to examine them. Each 3D model is around 30MB, which made viewing models in the browser without freezing or crashing the page a considerable challenge. Because JavaScript is single-threaded, I used Web Workers to load the 3D models on a background thread so the UI doesn't freeze. Uploading large 3D models presented another challenge, as file uploads would often become corrupted. I solved this by using the Node.js file system API ('fs') to break big files into smaller packets using streams.


Backend API for Fossett Lab

Skills: JavaScript, Node.js, Express, Azure Blob Storage, Formidable, Node.js filesystem ('fs')


The Fossett Lab for Virtual Planetary Exploration has nearly one terabyte of high-fidelity 3D models of Earth and Martian terrain. I built out the entire backend infrastructure for the Lab, moving all 3D models to Azure Blob Storage. I then created a REST API that allows us to POST and GET 3D models from the Fossett Lab's different products: GeoXplorer (for iOS and HoloLens) and GeoBase. This Swiss Army knife approach allows the Lab to scale its operations without having to worry about serving models locally.

This is a Node.js API that uses Express to handle routing, azure-storage to interface with Azure's Blob Storage service, streams to move files reliably, and formidable to handle folder uploads to Azure.

3D models stored in Azure Storage are organized into different folders depending on the type of model being uploaded. We have endpoints for each category: Digital Elevation Models, Martian Terrain (HiRISE) models, hand samples, and crystal lattice structures.
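The category-to-folder mapping amounts to a small lookup before each blob operation. The category keys and folder names below are hypothetical stand-ins for the real ones:

```javascript
// Hypothetical mapping from endpoint category to its Azure blob folder.
const CATEGORY_FOLDERS = {
  'dem': 'digital-elevation-models',
  'martian-terrain': 'martian-terrain',
  'hand-sample': 'hand-samples',
  'crystal-lattice': 'crystal-lattices',
};

// Build the path a model is stored under, rejecting unknown categories
// before anything touches Azure.
function blobPathFor(category, fileName) {
  const folder = CATEGORY_FOLDERS[category];
  if (!folder) throw new Error(`Unknown category: ${category}`);
  return `${folder}/${fileName}`;
}

console.log(blobPathFor('dem', 'grand-canyon.obj'));
// digital-elevation-models/grand-canyon.obj
```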


Mad Dash

Skills: Unity, C#, NavMesh AI Agents

Play Game
Download Unity Project

Mad Dash is a cops-and-robbers style game made in Unity. The player assumes the role of a robber trying to avoid the cops while making their way to the safehouse. Along the way, you can collect powerups (such as explosive mines, bullets, an invisibility cloak, and teleportation) to aid in your escape. The police in this game are Unity NavMesh AI Agents that will intelligently start chasing you if they spot you. Mad Dash is composed of three levels that you unlock by getting to the safehouse.


Visualizing gentrification in NYC

Skills: JavaScript, d3.js, GeoJSON

See Visualization

Using the d3.js visualization library, two friends and I made a visualization tool that educates people about the effects of gentrification in NYC over time. We focused on identifying six key factors that lead to gentrification: housing supply, commute time, median household income, median rent, population, and racial diversity index. We use these factors to calculate a gentrification 'score' that determines the color shade of each subborough in the visualization: darker subboroughs are more gentrified than lighter ones.
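The score-to-shade idea can be illustrated in a few lines. The real tool uses d3.js scales; the equal weighting of the six factors and the grayscale mapping here are my simplifying assumptions, not the tool's actual formula:

```javascript
// The six factors, assumed pre-normalized to [0, 1] per subborough.
const FACTORS = ['housingSupply', 'commuteTime', 'medianIncome',
                 'medianRent', 'population', 'diversityIndex'];

// Average the normalized factors into one score in [0, 1].
function gentrificationScore(subborough) {
  const sum = FACTORS.reduce((total, f) => total + subborough[f], 0);
  return sum / FACTORS.length;
}

// Map the score to a grayscale fill: darker = more gentrified.
function shadeFor(score) {
  const channel = Math.round(255 * (1 - score));
  return `rgb(${channel}, ${channel}, ${channel})`;
}

console.log(shadeFor(0)); // rgb(255, 255, 255) — least gentrified
console.log(shadeFor(1)); // rgb(0, 0, 0) — most gentrified
```

In the actual visualization, a d3 linear scale plays the role of `shadeFor`, interpolating between the light and dark ends of a color range.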

In addition to seeing the effects of gentrification over time, users can compare two subboroughs at a time and select specific attributes to analyze. Doing so creates side-by-side bar charts that display fine-grained data from past years.



Capsistant

Skills: Python, Google Assistant SDK, Google Cloud Vision API, Raspberry Pi


The "Capsistant" is a smart hat equipped with a Raspberry Pi that uses a camera, microphone, and Google Assistant SDK to tell users what objects are in front of them using the Google Cloud Vision API. Simply ask "What am I looking at?" to trigger the Capsistant!