Monday, August 7, 2017

GIS4048 Final Project: Lifestyle Search

My final project was based around the concept of a "Lifestyle Search." The idea behind a lifestyle search is to focus on the community you are about to move into rather than on the house itself. Instead of beginning a search for homes when you are preparing to move, you first decide what you want around your new home and find the communities that fit those criteria before you start looking at houses.

For my final, my imaginary client, Mr. Nguyen, has hired my firm to perform one of these lifestyle searches in Los Angeles County, California. He has provided a list of criteria and requested that we locate communities in Los Angeles County so that he can move his family to the area. His criteria are:

1. A city near the City of Los Angeles that is close to a 24 Hour Fitness gym
2. Close proximity to a park or forest
3. Close proximity to a freshwater lake
4. A high percentage of the population between the ages of 35 and 39
5. A high percentage of the population that is Asian
6. Not in a High Fire Hazard area

I chose to perform a weighted overlay analysis to find the locations most suitable for my client's criteria. In order to use this tool I needed to create rasters for the first five criteria showing the appropriate information. For the proximity criteria I used the Euclidean Distance and Reclassify tools to create rasters showing distance radiating out from the target features in two-mile intervals.
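For reference, here is a minimal sketch of how one of those proximity rasters could be built in arcpy. It assumes the Spatial Analyst extension, a hypothetical 24 Hour Fitness shapefile, a workspace path I made up, and a projected coordinate system in meters; it is an illustration of the pattern, not my graded script.

import arcpy
from arcpy import env
from arcpy.sa import EucDistance, Reclassify, RemapRange

env.workspace = r"C:\LifestyleSearch\data"   # hypothetical workspace
env.overwriteOutput = True
arcpy.CheckOutExtension("Spatial")

# Straight-line distance from every cell to the nearest gym
gym_distance = EucDistance("gyms_24hr_fitness.shp")

# Break the distances into two-mile bands (roughly 3219 m each);
# the closest band scores 9 and each band farther out scores one less
two_miles = 3218.69
bands = RemapRange([[i * two_miles, (i + 1) * two_miles, 9 - i] for i in range(9)])
gym_score = Reclassify(gym_distance, "VALUE", bands, "NODATA")
gym_score.save("gym_score")

arcpy.CheckInExtension("Spatial")

The park, forest, and lake rasters would follow the same pattern with their own source features.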


As you can see, the final rasters show the distance radiating out from the appropriate features. Next I performed a similar analysis for the demographic criteria, creating rasters and classifying the data the same way, with the percentages broken into nine classes for the weighted overlay. Feeding both sets of results into the Weighted Overlay tool gives me a new raster showing the areas that score best across all of the criteria rasters. From that weighted overlay I selected three areas to suggest to the client as starting points for his community search, like so.
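As a rough illustration of that step, here is how the weighted overlay might be expressed in arcpy. The raster names and influence percentages below are placeholders (the influences just have to total 100), not the actual weights I used.

import arcpy
from arcpy.sa import WeightedOverlay, WOTable, RemapValue

arcpy.env.workspace = r"C:\LifestyleSearch\results"   # hypothetical workspace
arcpy.CheckOutExtension("Spatial")

# Every input raster is already classified 1-9, so each class maps straight to itself
scale = RemapValue([[v, v] for v in range(1, 10)])

suitability = WeightedOverlay(WOTable([
    ["gym_score",   20, "VALUE", scale],
    ["park_score",  20, "VALUE", scale],
    ["lake_score",  15, "VALUE", scale],
    ["age_score",   20, "VALUE", scale],
    ["asian_score", 25, "VALUE", scale],
], [1, 9, 1]))          # evaluate on a 1-to-9 scale in steps of 1

suitability.save("suitability")
arcpy.CheckInExtension("Spatial")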


I was surprised at how much I learned doing this final project. I technically didn't use any new techniques or tools performing this analysis, but seeing a project through from start to finish was enlightening. I was also surprised at how useful the lifestyle search idea seemed once I had performed the analysis; it felt like a valuable service and a good way to relocate to a new region.


Final Presentation

Tuesday, August 1, 2017

Module 11: Sharing Tools

The final module for GIS Programming, and the last in our series on Python scripting tools. This week was the final step in creating shareable script tools. There wasn't much done differently from the previous weeks; the first new step was adding descriptive elements to the parameter inputs. Let's do the screenshot now as a visual aid for the explanation.


On the right side you'll notice that the tool parameter window is open and you can see some help text explaining what that parameter is for. This is a crucial part of finalizing script tools: it lets you make the tool as user-friendly as possible by giving users as much information as you can about how to use it.

The next step I took was to import the script. Normally when you create a script tool it has to be paired with the actual script file for it to function. When you import, or embed, the script into the tool, that is no longer necessary. With the script embedded you can send the toolbox with the tool to anyone and it will be functional. Finally, once the script is embedded you can set a password for the tool that prevents users from modifying the tool or editing the script embedded in it.

Tuesday, July 25, 2017

Module 10: Creating Custom Tools

Somewhat of a short module this week, but a very important one for me: one of my biggest questions going through this course was how to make my scripting practical. This week I learned how, by creating custom tools. I've covered how to do plenty of geoprocessing tasks with a Python script, but each of those scripts contained hard-coded file paths to the data it ran the tools on. That means those scripts would only ever work on that specific data unless you manually edited the script between each use. If you turn those scripts into a script tool, however, you can change them to work off of user input instead of static inputs. The first step is to create a toolbox in ArcMap and add your script to it as a script tool. Part of adding the script lets you set parameters, such as input data, output path, and workspace. You can also edit the properties of the parameters to force them to only accept certain types of input, like rasters or point shapefiles.


This is an example of the window that opens when you try to use a script tool with these parameters set. In this instance my script was made for clipping features, so I have a parameter for the source data workspace, the clip feature, the features to be clipped, and an output location for the results. Once you set the parameters you have to change the script to accept the user inputs instead of the normal static inputs. The arcpy.GetParameter() function allows you to do that. The parameters above are indexed, starting at 0, in the order you place them, so in the line of my script where I set the workspace (the data source for my inputs) I would put arcpy.GetParameter(0) instead. This tells my script to look at the first index of the user input for its information. Once you've changed all of the script's inputs to GetParameter() you must do the same for any print statements. Since your script tool is being run inside ArcMap you can't use a normal print statement; you have to use the arcpy.AddMessage() function instead. It takes whatever you put inside the parentheses and outputs it as text to the ArcMap geoprocessing window, like this:


Using this process you can turn any geoprocessing script into a permanent tool that works with any input, which is absurdly useful.
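To make that concrete, here is a minimal sketch of what the body of a clip script tool like mine might look like. The parameter order and names are assumptions, and I'm using GetParameterAsText(), the string-returning sibling of GetParameter().

import arcpy

# Parameters as configured in the script tool dialog, indexed from 0
arcpy.env.workspace = arcpy.GetParameterAsText(0)   # source data workspace
clip_feature = arcpy.GetParameterAsText(1)          # the feature to clip with
input_features = arcpy.GetParameterAsText(2)        # feature(s) to be clipped
output_location = arcpy.GetParameterAsText(3)       # folder or geodatabase for results
arcpy.env.overwriteOutput = True

# A multivalue parameter arrives as one semicolon-delimited string, so split it
for fc in input_features.split(";"):
    out_fc = output_location + "/" + arcpy.Describe(fc).baseName + "_clip"
    arcpy.Clip_analysis(fc, clip_feature, out_fc)
    # AddMessage() is what shows up in the geoprocessing window instead of print
    arcpy.AddMessage("Clipped " + fc + " to " + out_fc)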

Wednesday, July 19, 2017

Module 9: Working with Rasters

This week was about working with rasters. It ranged from simple things like listing the rasters in a geodatabase to creating rasters based on specific criteria using the Spatial Analyst extension. The lab this week had me making a specific selection of data from supplied elevation and landcover rasters and then combining the resulting rasters into a single raster that fit all the criteria. I'll break the code down one piece at a time and explain how it works.

>import arcpy, env, arcpy.sa
>set workspace and overwrite=true

Standard opening, though this time I also imported the functions from the sa module of arcpy, which is the Spatial Analyst module. This lets me use a lot of shortcuts when calling my geoprocessing tools.

>Create file geodatabase for storing results "AEdmundson.gdb"

Just creating a new geodatabase to store my final raster in. This was optional, as I could have saved the final raster to a .tif file instead.

>If Spatial Analyst extension is available:

If the person running the script doesn't have the Spatial Analyst extension or a license for it, then the script won't do much for them, so this checks whether Python can access it. If it can't, an else statement later on will tell them so.

>>Check out SA extension

This is python "taking" the extension for it's use. I think of it like checking a book out of a library.

>>Variable1=remapvalues 41, 42, 43 > set to 1

This creates a variable to use in the next tool; it tells Python that all the landcover values equal to 41, 42, or 43 should become 1 when the raster is reclassified. Those original values are the codes assigned to the different types of forest cover.

>>reclassify(Landcover raster, "VALUE", Variable1, "NODATA")

This reclassifies the raster: it takes the original landcover raster, and every cell that was a 41, 42, or 43 is classified as a 1 under the "VALUE" attribute, while anything that isn't a 41, 42, or 43 is set to NoData.

>>Variable2=Elevation Raster

Just assigning the elevation raster to a variable. You can technically just point the geoprocessing tools at the raster's location, but wrapping it in the Raster() function creates a raster object that Python can work with directly, for example in map algebra expressions.

>>AspectVariable=Aspect(Variable2)
>>SlopeVariable=Slope(Variable2)

This defines two new variables, which creates two new temporary rasters: one containing the Aspect derived from the elevation raster and the other the Slope. Slope identifies the steepness of any slope by using the change in elevation from cell to cell, and Aspect uses the same information to determine which direction the slopes are facing.

>>SlopeLow >5
>>SlopeHigh <20
>>AspectLow >150
>>AspectHigh <270

These variables are once again creating new temporary rasters. The first creates a new raster from the Slope raster I made above that includes only the cells whose slope is greater than 5 degrees, and the next does the same for cells with slopes less than 20 degrees. The two Aspect lines do the same for aspects greater than 150 and less than 270 degrees. So at this point I have created five temporary rasters: one that reclassified the forest areas as 1 and everything else as NoData, then four more rasters covering the ranges of slope and aspect from the elevation raster.

>>FinalRaster=Reclassified+SlopeLow+SlopeHigh+AspectLow+AspectHigh

Finally, we smoosh them together. This adds those five temporary rasters into a new one showing the areas that fall within the slope and aspect criteria and whether they are classified as forest or not.

>>FinalRaster Save > AEdmundson.gdb

Saving the temporary raster as a permanent raster in the geodatabase I made earlier.

>>Check in SA extension

Have to put the book back now that I'm done with it.

>else:
>> print "No spatial analyst extention found"

Just in case they don't have the SA extension.
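Putting the whole walkthrough together, here is a rough, runnable version of the script. The paths, raster names, and geodatabase location are assumptions for illustration, not my actual lab data.

import arcpy
from arcpy import env
from arcpy.sa import *

env.workspace = r"C:\Module9\data"            # hypothetical workspace
env.overwriteOutput = True

# Geodatabase for the final result
arcpy.CreateFileGDB_management(r"C:\Module9\results", "AEdmundson.gdb")

if arcpy.CheckExtension("Spatial") == "Available":
    arcpy.CheckOutExtension("Spatial")

    # Forest landcover classes (41, 42, 43) become 1; everything else becomes NoData
    forest_remap = RemapValue([[41, 1], [42, 1], [43, 1]])
    forest = Reclassify("landcover", "VALUE", forest_remap, "NODATA")

    elevation = Raster("elevation")
    aspect = Aspect(elevation)
    slope = Slope(elevation)

    # Map algebra comparisons produce temporary 1/0 rasters for each condition
    slope_low = slope > 5
    slope_high = slope < 20
    aspect_low = aspect > 150
    aspect_high = aspect < 270

    # Cells that satisfy every condition sum to 5
    final = forest + slope_low + slope_high + aspect_low + aspect_high
    final.save(r"C:\Module9\results\AEdmundson.gdb\final_raster")

    arcpy.CheckInExtension("Spatial")
else:
    print("No Spatial Analyst extension found")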

Here's the output raster from my script.


Tuesday, July 18, 2017

Participation Assignment #2

The paper I found was about the creation and implementation of the website Climate Wizard. The idea behind Climate Wizard was to make information about climate change more accessible to the layman and to provide more technically inclined people with a way to access detailed information. Using GIS and Python, the creators of Climate Wizard provide even non-specialists with simple analyses and graphical depictions of how climate has changed and is predicted to change throughout the world. Using historic trends and current anomalies in temperature and precipitation, the team predicts how environmental change will occur over the next thirty to sixty years. The results of Climate Wizard's analysis are consistent with changes reported by the Intergovernmental Panel on Climate Change, and the project is a great example of how GIS and Python can be used to develop tools that are accessible and practical for the general public and specialists alike.

Making the data available through a public website that contains the analysis tools brings the power and flexibility of GIS into people's hands: they can alter the parameters of the analysis and see the effects of those changes in real time, locally or globally. With the help of programming and more advanced data collection techniques, GIS is quickly going from a tool for specialists to a tool for everyone. It started with simple mapping software and is evolving to include crowd-sourced data gathering tools and simplified analysis tools for public use. Climate Wizard is a great example of that: a well-documented but easy-to-use website allowing anyone to access and use data to learn about and predict their environment.

Sunday, July 16, 2017

Urban Planning - GIS for Local Government

This week's lab focused on the role GIS plays in local government, more specifically how it's used to present data for taxation and the classification of parcel lots. The goal of the lab was to create a map for a client who is interested in a piece of property. The map needed to include the surrounding parcels along with the parcel owners and the zoning codes for those parcels. In most states in the U.S. the allowed use of land is determined by the zone it's in, such as residential, commercial, or agricultural. The zoning of the parcel you're buying, as well as of the surrounding parcels, can weigh heavily on your decision to purchase a property.

To present the data my client requested I created what is called a Map Book. Instead of giving them a single large map showing the entire study area, the area is broken up into smaller maps with a locator inset map showing which part of the overall area you are looking at. Here is an example page of the Map Book.


You can see in the top left corner that the inset map shows the entire study area, with the current map page highlighted and the unused pages darkened. Included with the map is a Parcel Report listing the parcel and owner information for each of the labelled parcels in the map book. I created the map book using a tool called Data Driven Pages, which is what allowed me to create a multi-page map instead of a single large one. The data to create this map was obtained from the Marion County Appraisal District web page. With the map provided, my client can see the land owners, zoning information, and recent sales information for the parcels surrounding his prospective purchase.
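Although I built and exported the map book through the ArcMap interface, Data Driven Pages can also be exported with a few lines of arcpy.mapping. This is just a sketch with a hypothetical map document path, and it assumes Data Driven Pages is already enabled in the .mxd.

import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\UrbanPlanning\ParcelMapBook.mxd")  # hypothetical path
ddp = mxd.dataDrivenPages
# Export every index page into a single multi-page PDF
ddp.exportToPDF(r"C:\UrbanPlanning\ParcelMapBook.pdf", "ALL")
del mxd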

Saturday, July 15, 2017

Urban Planning - Participation Assignment

The goal of the urban planning participation assignment was to introduce me to how local government determines property tax and assessed values. My local appraisal district is the Montgomery Central Appraisal District; you can find their website here www.mcad-tx.org and their GIS map near the top of that page or here http://portico.mygisonline.com/html5/?viewer=montgomerytx.bv1-p1

The second part of this participation assignment relies on your appraisal district providing a list of recent home sales to analyse. Montgomery County, however, is in a non-disclosure state, which means that property sale prices are not made available to the public. So instead of providing a link to that portion of the website, I will discuss how my county handles this. While property sale prices are not publicly disclosed, they can be privately disclosed, and the methods the county uses to find sale values rely on that. The first method is a simple questionnaire mailed to new owners asking them to disclose the value for the purposes of assessing the surrounding land. The second method uses the real estate system: title companies and real estate agencies keep their own records of home sale values in order to prepare market value assessments, and the appraisal district takes advantage of this. Finally, even though values are not disclosed with home sales, they are disclosed with mortgage loans; in most cases when a buyer purchases a home with a vendor's lien, the Deed of Trust for that lien is filed at the same time, and the Deed of Trust will list the amount of the mortgage loan. I chose one of these homes as my example land sale so that I could use the Deed of Trust and a realty website for my assessment. I chose the property at 4527 Coues Deer Lane, Conroe, TX 77303, with the assessor's account number 4006-00-06700.

A page from a local realty agent for this property lists the assessed market value range as $250,000 - $285,000.

The transfer documents can be looked up at the county clerk's office under document numbers 2017-001256 for the Warranty Deed and 2017-001257 for the Deed of Trust. The Deed of Trust lists the loan amount as $269,086, which falls within the market value range from the realty website.

The account page for this property on my appraisal district's website lists the appraised value at $260,840, which is about 97% of the loan amount ($260,840 / $269,086 ≈ 0.97).

Finally, I was given some appraisal data about a subdivision in Escambia County, Florida and was asked to make a map showing appraised values, then to select some properties for review.


Typically in areas grouped as neighborhoods, at least in my county, you would expect to see very similar values among similar types of houses. In this instance the low-value parcels in either corner are either undeveloped land or devoted to resources, like a water pump station. Since the majority of the other parcels fall in the $24,000-$27,000 range as expected, I would only want to review the two outlier parcels. The yellow parcel, valued at $24,700, is only slightly lower than the parcel above it, so I would ignore that one too and only order a review of the $33,250 parcel.