Mapping Bed-Stuy

For my Live Web midterm and Everything is Physical final I worked with Nick Bratton on an interactive map of my neighborhood in Bed-Stuy, Brooklyn.

The map was inspired by one I saw in mapping class, Gangs and Cupcakes.


I was really impressed by how much this simple map said about the reality of the neighborhood. It made me start thinking differently about my own. I live in Bed-Stuy, a rapidly gentrifying neighborhood. My apartment is located right across the street from the beginning of two and a half blocks of HUD housing. One of my friends who lived on the block before me told me when I moved in, “just don’t walk left.” In other words, the neighborhood is fine, just don’t walk through the HUD housing. And I’ve pretty much followed that rule for the two and a half years I’ve been living here.

This map, and the mapping class in general, got me thinking much more about my relationship to the parts of my neighborhood that I really know nothing about. One day I was walking home from the subway and noticed a party happening in a barbershop I’d never paid attention to. I realized that the barbershop isn’t even on my mental map of the neighborhood, but it’s obviously a really important place to many other people. That gave me the idea to have people in the neighborhood draw their own mental maps and to overlay them. I thought this would be an interesting way to get a different perspective on the neighborhood.

In Live Web, Nick was interested in gentrification and mapping as well, and we had just learned how to save data with NeDB and Node. So we decided we’d collaborate to make it happen. On Shawn’s suggestion, we chose to make an interactive map that people could draw on using an iPad and stylus. The maps would then be saved and viewable by anyone visiting the site.

I designed the map in Mapbox Studio. It was important that the drawing canvas we overlaid on the map be exactly the same size as the base map, so that we could overlay the resulting drawing on the map. We originally used a full-screen map that extended beyond the bounds of the screen, with navigation arrows (i.e., if the screen is 700×400, our map and canvas would be 1200×900, and you could use arrows to bring the hidden parts of the map on screen), but we decided to simplify and start by putting the whole map at a fixed size on the visible screen.
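The sizing idea above can be sketched as a small helper. This is illustrative, not the project’s actual code: the function name and the device-pixel-ratio handling are my assumptions, but they show why the canvas’s backing size and its CSS size both have to agree with the map container for the overlay to line up.

```javascript
// Compute canvas dimensions so the drawing layer exactly matches the base map.
// cssWidth/cssHeight are the map container's CSS size; dpr is
// window.devicePixelRatio (1 on standard displays, 2 on Retina iPads).
// Names here are illustrative, not from the project's source.
function canvasSizeForMap(cssWidth, cssHeight, dpr) {
  return {
    // Backing-store size in device pixels, so strokes stay sharp
    width: Math.round(cssWidth * dpr),
    height: Math.round(cssHeight * dpr),
    // CSS size keeps the canvas visually aligned with the map underneath
    styleWidth: cssWidth + 'px',
    styleHeight: cssHeight + 'px',
  };
}
```

With matching sizes, a saved drawing can later be composited over the same base map pixel-for-pixel.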


Nick built the canvas drawing functionality. I built the save-to-PNG functionality using a library called reimg, and the database to store and display user-generated maps using NeDB.
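On the client, reimg (or the canvas’s own `toDataURL`) produces the drawing as a base64 PNG data URL; on the server, that string has to be decoded back into binary PNG bytes before it can be written to disk or attached to a database record. Here’s a minimal Node sketch of that decoding step; the function name is hypothetical and the project may have stored the data URL directly instead.

```javascript
// Decode a PNG data URL (as posted from the drawing page) into raw PNG bytes.
// Illustrative sketch -- not the project's actual server code.
function dataUrlToPngBuffer(dataUrl) {
  const prefix = 'data:image/png;base64,';
  if (!dataUrl.startsWith(prefix)) {
    throw new Error('expected a PNG data URL');
  }
  // Everything after the prefix is base64-encoded PNG data
  return Buffer.from(dataUrl.slice(prefix.length), 'base64');
}
```

The resulting buffer can then be saved with `fs.writeFileSync('map.png', buf)`, while the NeDB record keeps the filename and any metadata about the drawing.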

We just had the basic functionality set up for our Live Web demo, but I wanted some actual content for my mapping class final. I got a stylus for the iPad, the Sensu Artist Brush, as the ER at ITP didn’t have anything that would work on an iPad (the standard Wacom stylus doesn’t work on iPads!).

When I got the stylus, I realized that the canvas mouse commands didn’t actually work on the iPad, so I rebuilt the canvas interactions for touch. Unfortunately, once I got that working, I realized that drawing on the iPad touch screen doesn’t give great results. And writing looked absolutely awful. Here’s an example:
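The reason mouse-only canvas code fails on an iPad is that touch events carry their coordinates in `event.touches` rather than on the event itself. A common fix, sketched below with illustrative names, is to normalize both event types into canvas-relative points before drawing:

```javascript
// Normalize mouse and touch events to canvas-relative coordinates.
// Touch events put their points in event.touches; mouse events carry
// clientX/clientY directly. rect is the canvas's getBoundingClientRect().
// Illustrative sketch, not the project's actual handler.
function pointFromEvent(event, rect) {
  const source = event.touches ? event.touches[0] : event;
  return {
    x: source.clientX - rect.left,
    y: source.clientY - rect.top,
  };
}
```

The same function can then be wired to `mousedown`/`mousemove` and `touchstart`/`touchmove`, with `event.preventDefault()` in the touch handlers so the page doesn’t scroll while drawing.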


So I built some quick and dirty labeling functionality (which currently only works on touch):


Then I went out to my neighborhood coffee shop. I gave myself an hour, and surprisingly only had time to talk to two people, because they both started telling me their stories while drawing.

I really loved what Zachery Allan Starkey drew. As he was drawing, he told this amazing story about how he came to New York years ago with $2,000 in his bank account to be a musician, and had just opened for New Order at Radio City Music Hall the night before. How I wish I had been recording him! Here’s his map:


Next Steps

So, it’s really just a prototype for now. If we were to develop it further, here’s what next steps would be:

  • Better input: iPad Pro and Apple Pencil
  • Test and refine UI
  • Refactor and comment very messy code
  • Integrate existing tools?
  • Record live movement and audio (great idea from Shawn!)
  • Decide on metadata to collect (e.g., demographics)
  • How does it work with more data?
  • Can it be white-labeled for classroom use, etc.?

Demo and Code Links

You can play with it live here: (remember, labeling functionality only works on iPad for now)

You can see the project on GitHub at