Thursday, June 22, 2017

Week 6: Preparing my MEDS dataset

This week is the first of a two-part lab centered on the Department of Homeland Security and disaster preparation. The focus of this first part is the Minimum Essential Data Set, or MEDS. MEDS is the standards-based geospatial data model the DHS developed so that government agencies can share information with one another in a timely manner during an emergency or security crisis.

The idea is to establish standard, essential data sets for urbanized and larger areas, creating the foundation the U.S. security community needs to respond properly to an emergency. MEDS data is gathered from various internet sources and comes in a wide range of formats, including shapefiles, rasters, tables, and geodatabases. The data must conform to specific themes and be no more than two years old for urban areas and no more than five years old for larger areas. It falls under these categories:

Orthoimagery: High-resolution aerial imagery.
Elevation: Digital Elevation Models, usually from LIDAR imagery, showing terrain elevation.
Hydrology: Lakes, rivers, spillways, dams, etc.
Transportation: Streets, railways, highway infrastructure, airlines, etc.
Boundaries: Political boundaries like State, County, Township, and possibly tax parcel information.
Structures: Important critical infrastructure.
Geographic Names: Information about physical and cultural geographic features, collected in the Geographic Names Information System (GNIS).

These data themes form the backbone of the nation's ability to respond effectively to natural disasters, terrorist attacks, or any other crisis that involves a federal or local government response.

The goal of this week's lab was to prepare our data as it would be prepared if it were part of a DHS MEDS dataset. This is what my final geodatabase looked like.


To start, I'll clarify that this geodatabase was prepared for a specific study area within Boston, Massachusetts, specifically for the response to the 2013 Boston Marathon bombing. Let's take the data one layer at a time. I'll only discuss the layers I made changes to; some, like the Hydrography, Orthoimagery, and Elevation layers, were simply imported.

Boundaries: This data layer includes the Massachusetts county layer, the city of Boston and surrounding cities, and a 10-mile buffer around those cities that defines the study area. Aside from checking projections, nothing was done to these layers.

Transportation: Initially the transportation layer was a single feature class containing every roadway in the study area. To prepare it for my study, the roads were joined to the CFCC code table, adding the CFCC road codes to each feature. I then separated out the roads I wanted by their CFCC codes and created a separate feature class for each category. As in the image above, the Local, Secondary, and Primary road layers were pulled out of the original feature class, symbolized, and set to display only at a relevant scale to avoid cluttering the map.
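
Here is a rough sketch of how that join and split could be scripted in arcpy; the layer name, table name, field names, and CFCC code ranges are placeholders rather than the exact values from the lab.

import arcpy
arcpy.env.workspace = r"C:\path\to\MEDS.gdb"  # placeholder workspace

# Join the CFCC code table to the roads layer (names are placeholders)
arcpy.JoinField_management("roads", "CFCC", "cfcc_codes", "CFCC")

# Split the roads into separate feature classes by CFCC code (code ranges are illustrative)
arcpy.Select_analysis("roads", "roads_primary", "CFCC LIKE 'A1%'")
arcpy.Select_analysis("roads", "roads_secondary", "CFCC LIKE 'A3%'")
arcpy.Select_analysis("roads", "roads_local", "CFCC LIKE 'A4%'")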

Landcover: To symbolize the landcover raster appropriately, I used a colormap issued by the National Land Cover Database, which gave me the correct coloration and labels for the raster.
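
If you want to bake that colormap into the raster itself rather than set it through layer symbology, arcpy's Add Colormap tool can do it; the raster name and .clr file path below are just placeholders.

import arcpy
# Apply an NLCD .clr colormap file to the landcover raster (names are placeholders)
arcpy.AddColormap_management("landcover", "", r"C:\path\to\nlcd_colormap.clr")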

Geographic Names: This feature was given to me as a text file containing the location names and their XY coordinate data. I created a point layer from the XY data, projected it to match the rest of the geodatabase, and then selected and exported the points that fall inside the study area for the map.
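
A rough arcpy version of that sequence might look like the sketch below; the file names, field names, and coordinate systems are assumptions, not the exact ones from the lab.

import arcpy
arcpy.env.workspace = r"C:\path\to\MEDS.gdb"  # placeholder workspace

# Build points from the XY columns in the GNIS text file (field names assumed)
arcpy.MakeXYEventLayer_management(r"C:\path\to\gnis_names.txt", "LONGITUDE", "LATITUDE",
                                  "names_events", arcpy.SpatialReference(4269))
arcpy.CopyFeatures_management("names_events", "gnis_points")

# Project to match the rest of the geodatabase (target coordinate system assumed)
arcpy.Project_management("gnis_points", "gnis_points_prj", arcpy.SpatialReference(26986))

# Keep only the points that fall inside the study area
arcpy.MakeFeatureLayer_management("gnis_points_prj", "gnis_lyr")
arcpy.SelectLayerByLocation_management("gnis_lyr", "INTERSECT", "study_area")
arcpy.CopyFeatures_management("gnis_lyr", "gnis_study_area")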



Tuesday, June 20, 2017

Geoprocessing using Python

Today I started running geoprocessing operations using only Python scripting. The key to geoprocessing in ArcGIS is to set your workspace and environment options in the script, which cuts down on a lot of work later on. In this exercise I added X,Y coordinate data to a shapefile, used it to create a buffer around those points, and then dissolved the buffers into a single shape. With the workspace set to the data folder, I don't have to repeatedly spell out the full paths of the shapefiles I'm using; Python assumes all my files come from and go to my workspace unless I specify otherwise. Using the syntax section of each tool's help page, I coded as follows.

# Mod6_AEdmundson.py
import arcpy
from arcpy import env

# Set the workspace to the data folder and allow outputs to be overwritten
env.workspace = r"C:\path\to\data"  # placeholder path to the data folder
env.overwriteOutput = True

# Add X,Y coordinate fields to the hospitals shapefile
arcpy.AddXY_management("hospitals.shp")
print(arcpy.GetMessages())

# Buffer the hospitals at 1,000 meters
arcpy.Buffer_analysis("hospitals.shp", "hospital_buffs.shp", "1000 Meters")
print(arcpy.GetMessages())

# Dissolve the buffers into a single shape
arcpy.Dissolve_management("hospital_buffs.shp", "hospital_buff_diss.shp")
print(arcpy.GetMessages())
This was the output from my script when run; it shows the messages returned by each tool once it finished.

Monday, June 19, 2017

Washington D.C. Crime Analysis

This week I went over multiple ways to spatially analyze crime reports, using reports from January 2011 in Washington, D.C. There was a lot going on this week, so I'm going to toss up a map first and go into the details afterward.


There is a lot going on in this map. First off, we have the point layer showing the crimes reported that January. The data was separated by offense and graphed by number of occurrences, as seen in the graph in the corner of the map. This layer is the basis for the spatial analysis: using the point data for the police stations in Washington, I performed a spatial join between the stations and the crimes layer, which links each individual crime report to its closest police station. Dividing the crime reports up by their closest station gives us the percentage of crimes handled by each station based on location, and the station points were symbolized accordingly.

The next technique is similar, but it begins with a multiple ring buffer around the police station points, creating rings at half a mile, one mile, and two miles from a station. I then used the same spatial join approach to find what percentage of reported crimes fell within each distance of a police station, which is explained in the map text.

Finally, I was tasked with proposing a place for a new police station or stations, and I chose two potential areas. The first was chosen to relieve some of the load on the station with the highest percentage of crime by distance. The second was chosen for a similar reason: the station with the second-highest percentage of crime is nearby, and this location would also pick up a number of crime reports that currently fall outside the two-mile ring around any police station.
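
For reference, those two techniques could be scripted roughly like this in arcpy; the file names and the station name field are hypothetical.

import arcpy
arcpy.env.workspace = r"C:\path\to\dc_crime"  # placeholder workspace
arcpy.env.overwriteOutput = True

# Link each crime report to its closest police station
arcpy.SpatialJoin_analysis("crimes.shp", "police_stations.shp", "crimes_by_station.shp",
                           "JOIN_ONE_TO_ONE", "KEEP_ALL", "", "CLOSEST")

# Count reports per station to get each station's share of the crimes
arcpy.Statistics_analysis("crimes_by_station.shp", "station_counts.dbf",
                          [["OFFENSE", "COUNT"]], "NAME")

# Rings at half a mile, one mile, and two miles from any station
arcpy.MultipleRingBuffer_analysis("police_stations.shp", "station_rings.shp",
                                  [0.5, 1, 2], "Miles", "distance", "ALL")

# Tag each crime with the distance ring it falls inside
arcpy.SpatialJoin_analysis("crimes.shp", "station_rings.shp", "crimes_by_distance.shp")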

Finally, I did a new type of spatial analysis called Kernel Density Analysis; again, let's bring the map out first.

Kernel Density Analysis uses point placement to create a density raster based on how many points fall in an area and how close those points are to each other. This map breaks the crimes point class down into specific offenses and runs a kernel density analysis on each set of points to create a density surface, which I've overlaid on the population density map. This style of map lets the user compare the different types of crime and where each tends to occur.
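
As a point of reference, the density surface for a single offense type could be generated like this with the Spatial Analyst extension; the input name, cell size, and search radius are placeholders.

import arcpy
from arcpy.sa import KernelDensity

arcpy.env.workspace = r"C:\path\to\dc_crime"  # placeholder workspace
arcpy.CheckOutExtension("Spatial")  # Kernel Density requires Spatial Analyst

# Density raster for one offense type (cell size and search radius are placeholders)
density = KernelDensity("crimes_burglary.shp", "NONE", 100, 1500)
density.save("burglary_density.tif")

arcpy.CheckInExtension("Spatial")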

Friday, June 16, 2017

Geoprocessing with ArcGIS

This week in Python I began learning how to create tools and run models both in and out of ArcGIS. I started with Model Builder in ArcGIS. Model Builder is a simple way to chain multiple tools together, letting you run tools on the output of previous tools. For example, in this lab the first tool I ran clipped the soil layer to the basin layer.

Inputs: Soil.shp - Basin.shp
    Tool: Clip > Output: Basin_Soil.shp

I could run the Clip tool myself, but with Model Builder I can run further tools on its output all in one pass. The next thing I wanted to do was remove the parts of Basin_Soil.shp that aren't farmable land. To do this I ran the Select tool to pull out every shape with the "Not Prime Farmland" attribute. Finally I needed to remove that selection, so I used the Erase tool to erase the "Not Prime Farmland" shapes from the previously created Basin_Soil.shp.

Inputs: Soil.shp - Basin.shp
    Tool: Clip > Output: Basin_Soil.shp
        Inputs: Basin_Soil.shp
            Tool: Select > Query: "FARMLNDCL" = 'Not Prime Farmland' > Output: Basin_Soil_Sel.shp
            Inputs: Basin_Soil.shp - Basin_Soil_Sel.shp
            Tool: Erase > Output: Basin_Soil_Farmland.shp

The final output is the soil layer clipped to the basin with all shapes not containing prime farmland removed. Here's what the output looks like.
       

The next step of the lab was to convert this model into a Python script, which is an export option in Model Builder. I then opened the exported script in PythonWin and modified it so that the user could supply their own shapefiles. The modification lets them run the same tools on any two shapefiles they want; if no inputs are given, the script defaults to the soil and basin shapefiles.
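
The sketch below shows the general shape of that kind of modification, assuming the inputs arrive as script parameters; it is not the exact exported script, and the workspace path is a placeholder.

import arcpy

# Use the user's shapefiles if provided, otherwise fall back to the defaults
soil = arcpy.GetParameterAsText(0) or "Soil.shp"
basin = arcpy.GetParameterAsText(1) or "Basin.shp"

arcpy.env.workspace = r"C:\path\to\data"  # placeholder workspace
arcpy.env.overwriteOutput = True

# Same chain of tools as the model: Clip, Select, then Erase
arcpy.Clip_analysis(soil, basin, "Basin_Soil.shp")
arcpy.Select_analysis("Basin_Soil.shp", "Basin_Soil_Sel.shp",
                      "\"FARMLNDCL\" = 'Not Prime Farmland'")
arcpy.Erase_analysis("Basin_Soil.shp", "Basin_Soil_Sel.shp", "Basin_Soil_Farmland.shp")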

Monday, June 12, 2017

Participation Assignment #1: Putting GIS in everyone's hands.

    My article showcases one of the things I'm most excited about for the future of GIS. People who use GIS have begun to realize the power of crowd-sourcing their data and of putting the power to shape maps in the hands of the public. My article focuses on a group of students in Detroit, Michigan who were given software to collect data about their neighborhood, specifically for the purpose of improving the world around them. The students used the software and handheld computers to plot points on a map and to geotag pictures of areas in their neighborhood that needed to be improved or repaired. This showed them firsthand the power of combining spatial data with their own input to make changes to their environment.

    We now live in a world where seventy percent of adults have a smartphone, giving them access to GIS and GPS data every day of their lives. GIS professionals have realized this and are beginning to use the public as a potential data source. Every GIS professional knows they are hobbled without reliable, up-to-date data. While crowd-sourced data is easy to dismiss as unreliable, the more data you have, the easier it is to use scripts to filter out false information or to verify multiple similar reports. Now that data can be obtained directly from thousands of individuals with no special equipment beyond the smartphone in their pocket, we're moving toward a golden age of data acquisition for GIS.

Friday, June 9, 2017

Hurricane Lab

This week's lab was broken into two parts. The first used weather data to create a tracking path through the North Atlantic for Hurricane Sandy in 2012. The second part used aerial imagery and new exploration tools to assess local damage caused by the storm.

The tracking data provided was a table that included barometric pressure, wind speed, and location readings. I used the location data to create a series of points on the map, with the rest of the information attached to each point. Then I used the Points To Line tool to connect the points and show Hurricane Sandy's irregular path. Finally, the map shows the states hit hard enough by the storm to request FEMA disaster assistance.
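
Here is a minimal arcpy sketch of turning a table like that into points and then a track line; the table name, field names, sort field, and coordinate system are assumptions.

import arcpy
arcpy.env.workspace = r"C:\path\to\sandy"  # placeholder workspace

# Turn the tracking table's coordinates into points (field names assumed)
arcpy.MakeXYEventLayer_management("sandy_track.dbf", "LON", "LAT",
                                  "track_events", arcpy.SpatialReference(4326))
arcpy.CopyFeatures_management("track_events", "sandy_points.shp")

# Connect the points in reading order to draw the storm's path
arcpy.PointsToLine_management("sandy_points.shp", "sandy_path.shp", "", "DATETIME")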


On shore, the aerial imagery from before and after the storm let me analyze the damage firsthand. Overlaying county parcel data on the aerial imagery lets me deduce what type of damage, and how much, was done to each structure along the coastline. To represent this on the map, I created a new point feature class and assigned values to each point based on my estimate of the damage.
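
Creating that damage point layer could be scripted roughly like this; the field name, damage category, and example coordinates are placeholders for whatever the assessment actually used.

import arcpy
arcpy.env.workspace = r"C:\path\to\sandy"  # placeholder workspace

# New point feature class for the damage assessment, matching the parcel layer's projection
sr = arcpy.Describe("parcels.shp").spatialReference
arcpy.CreateFeatureclass_management(arcpy.env.workspace, "damage_points.shp", "POINT",
                                    spatial_reference=sr)
arcpy.AddField_management("damage_points.shp", "DAMAGE", "TEXT", field_length=20)

# Each point is placed on a structure and tagged with an estimated damage level
with arcpy.da.InsertCursor("damage_points.shp", ["SHAPE@XY", "DAMAGE"]) as cursor:
    cursor.insertRow([(-74.10, 40.58), "Major"])  # example coordinates only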

Tuesday, June 6, 2017

Debugging and Error Handling

This week was fun, but it's going to be a little strange to write up. I practiced finding and fixing errors in PythonWin, and while debugging is a valuable skill, it's difficult to show in this format. I was given three scripts that included errors, and I had to use the various debugging tools in PythonWin to correct them.

The first script was designed to list all the attribute fields in a shapefile, and both of its errors were syntax errors. The Check tool in PythonWin points the user to the line containing a syntax error, so from there I inspected each flagged line and corrected it. The script's output, once corrected, looked like this.


The second script prints the name of each layer in a map document, or .mxd file. Its errors were a combination of syntax errors and exceptions. I corrected the syntax errors as before and then moved on to the exceptions. An exception occurs when the script is written in valid syntax but something prevents it from running properly. Examples of exceptions in this script were an incorrect file path, and a method that takes three arguments being given a fourth. To debug these, I ran the program and used the error messages it generated to locate each exception and correct it. The error message tells you the exception type and the line where it occurs, which helped me narrow down each exception until I could correct them all. The output for this script looked like this.


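For context, a layer-listing script like the second one generally follows the arcpy.mapping pattern below; this is a sketch of the idea rather than the graded script itself, and the .mxd path is a placeholder.

import arcpy
# Open the map document and print each layer's name (the path is a placeholder)
mxd = arcpy.mapping.MapDocument(r"C:\path\to\project.mxd")
for lyr in arcpy.mapping.ListLayers(mxd):
    print(lyr.name)
del mxd
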
The final script had two parts, and only the first part contained errors. This last exercise was different: the goal was not to correct the exception but to catch it. For this I used a try-except statement. It runs the code in the try block, and if an exception is raised, instead of halting as it normally would, Python continues running the rest of the script while the except block handles the error. In my try-except statement I told Python to store the error message in a variable so that I could print it in my script's output, like so:

 

Like one of the errors in script 2, one of the methods here requires two arguments but was only given one. Because that call was wrapped in a try-except statement, the script went on to run the rest of its code instead of crashing.
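
For anyone curious what that pattern looks like, here is a minimal sketch of the idea; the failing call is just a stand-in for whichever method was missing its argument.

import arcpy

try:
    # A stand-in for the call that was missing a required argument
    arcpy.Buffer_analysis("hospitals.shp")
except Exception as e:
    # Store the error message in a variable and print it instead of crashing
    error_message = str(e)
    print("Part 1 raised an error: " + error_message)

# Part 2 of the script still runs because the exception was caught
print("Part 2 runs normally.")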