Evaluating the New Iowa City Food Truck Ordinance

April 1, 2015.

Iowa City is slated to expand food truck operations through a new city ordinance that will hopefully be adopted (story). But the question remains: where can food trucks actually operate under these new guidelines?

First, I found a mostly comprehensive list of restaurants in Iowa City, since food trucks are prohibited from operating within 150 feet of a restaurant. Note that this list doesn't include KFC, McDonald's, or similar fast-food establishments.

Getting Restaurant Locations and Geocoding


We'll need BeautifulSoup, urllib2, and geocoder to open and parse the HTML data. The third-party libraries can be installed using pip.
  pip install beautifulsoup4 geocoder
#import libraries
import csv
import time
import urllib2
from bs4 import BeautifulSoup
import geocoder

In taking a look at the HTML from the 'iowacitymenus.com' page, we find that it's a (mostly) organized list. So, we can use BeautifulSoup to parse the HTML data using python (there's a great tutorial by Zev Ross about using BeautifulSoup to parse HTML in python here). We can use the urllib2 library to open the URL and feed the response into BeautifulSoup. We then iterate through the "soup" to find the elements we're interested in and write the output to a CSV. Voila!
url = 'http://www.iowacitymenus.com/'
content = urllib2.urlopen(url).read()
soup = BeautifulSoup(content)

This was my first foray into scraping web data, so I wasn't able to navigate the HTML tree structure or use BeautifulSoup very efficiently. Instead, I used some elementary string manipulation to pull the names and addresses of locations and write them to the CSV.

#Grab the list entries that hold each restaurant. The original selector
#wasn't preserved, so the <li> tag here is an assumption based on the
#page structure described above.
locs = soup.find_all('li')

with open('ICEats.csv', 'wb') as csvfile:
    writer = csv.writer(csvfile, delimiter=',')
    writer.writerow(('Name','Address','Lat','Lon'))
    for loc in locs:
        if loc.b is None:
            continue
        #The addresses were in list elements, but housed within break
        #elements - makes pulling the necessary information a bit
        #difficult. So, some string manipulation was needed to get the
        #needed info.
        #--------------------------------------------------------------
        name = loc.b.get_text()
        line = str(loc)
        s = line.find('br') + 5

        place = line[s:]
        e = place.find('br') - 1
        place = place[:e]

        #try to geocode locations; fall back to None on any failure
        try:
            time.sleep(0.2) #sleep to avoid the query limit
            g = geocoder.google(place)
            Lat = g.lat
            Lon = g.lng
        except Exception:
            Lat = None
            Lon = None

        writer.writerow((name,place,Lat,Lon))

We've got our data into a CSV now. It wasn't perfect; I had to manually change a couple of entries. Locations that couldn't be geocoded while writing to the CSV were simply entered by hand, since there were only a few of them.
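A quick sanity check helps find the rows that still need manual attention. This is a minimal sketch that assumes the Name/Address/Lat/Lon layout written above.

#Print any rows from ICEats.csv that are missing coordinates so they
#can be fixed by hand.
import csv

with open('ICEats.csv', 'rb') as csvfile:
    reader = csv.DictReader(csvfile)
    for row in reader:
        if not row['Lat'] or not row['Lon']:
            print row['Name'], '--', row['Address']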

Spatial analysis of data and conversion for web display


ESRI ArcMap was utilized to do the heavy lifting for analyzing the spatial data. The majority of the hang-up was the fact that Iowa City lacks an overlay of zoning layers. Individual parcels are available with a zoning classification, but this was less than ideal (for reasons that will become apparent shortly).

Processes done in ArcMap

Creating a 150-foot buffer around restaurants is straightforward, and uses readily available geoprocessing tools in ArcMap. The downtown districts were manually created using the Iowa City Zoning Map (warning, it's a HUGE PDF!)
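For the record, the buffering step can also be scripted with arcpy. The sketch below assumes the geocoded CSV from earlier, and the output file names are hypothetical placeholders. (In practice you'd project to a local coordinate system first so the 150-foot distance is meaningful.)

import arcpy

#Turn the geocoded CSV into points (WGS84 lat/lon), then buffer.
arcpy.MakeXYEventLayer_management('ICEats.csv', 'Lon', 'Lat', 'eats_lyr',
                                  arcpy.SpatialReference(4326))
arcpy.CopyFeatures_management('eats_lyr', 'restaurants.shp')

#150-foot exclusion buffer around every restaurant, dissolved into a
#single footprint.
arcpy.Buffer_analysis('restaurants.shp', 'restaurant_buffer.shp',
                      '150 Feet', dissolve_option='ALL')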

A simple zoning overlay was all that was needed. However, only a parcel-level Shapefile was available. Parcels do not lend themselves well to dissolving by zoning type, since all of the road right-of-way is essentially empty.

In order to get around this issue, the road areas were isolated and split using road centerline data (available here). This data was merged with the parcel data that had been previously dissolved by zoning type, and a neighbor table was generated. From this, a zoning classification was assigned to each roadway based on the zoning type of the largest area bordering each road. The code for this was written hastily in the python interpreter inside of ArcMap...so I didn't save it. A sketch of the idea is below.
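Since that snippet is lost, here's an after-the-fact sketch of the same idea. All file and field names (merged.shp, POLY_ID, ZONING) are assumptions, and the longest shared border is used as a stand-in for "largest bordering area."

import arcpy

#Dissolve parcels by zoning type, then (after merging in the split road
#areas as merged.shp) build a neighbor table: one row per pair of
#adjacent polygons, with the length of their shared border.
arcpy.Dissolve_management('parcels.shp', 'zoning.shp', 'ZONING')
arcpy.PolygonNeighbors_analysis('merged.shp', 'neighbors.dbf',
                                ['POLY_ID', 'ZONING'])

#For each polygon, keep the neighboring zoning type that shares the
#longest border with it.
best = {}
fields = ['src_POLY_ID', 'nbr_ZONING', 'LENGTH']
with arcpy.da.SearchCursor('neighbors.dbf', fields) as rows:
    for src, nbr_zone, length in rows:
        if length > best.get(src, ('', -1))[1]:
            best[src] = (nbr_zone, length)

#Write the winning zoning type back to the polygons that lack one
#(i.e., the road right-of-way).
with arcpy.da.UpdateCursor('merged.shp', ['POLY_ID', 'ZONING']) as rows:
    for poly_id, zone in rows:
        if not zone and poly_id in best:
            rows.updateRow([poly_id, best[poly_id][0]])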

A little bit of manual cleaning of boundaries, and the data is ready for conversion. Converting ESRI Shapefiles to GeoJSON is easy using ogr2ogr (part of the OSGeo GDAL/OGR suite) in the command prompt:

  $ ogr2ogr -f GeoJSON -t_srs crs:84 [name].geojson [name].shp

The GeoJSON can then be dragged into MapShaper to be simplified and exported as a TopoJSON, which is much smaller and loads faster in web pages. Here's the final result.
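If you'd rather script that step than drag files into the web interface, MapShaper also ships a command-line tool (installable via npm). Something like the following should work, though the 10% simplification amount is just an example:

  $ mapshaper [name].geojson -simplify 10% -o format=topojson [name].topojson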

Displaying data for the web


D3 is an incredibly powerful library for displaying data, particularly spatial data.