Thursday, November 24, 2016

Analyze Week 1 QGIS results for use with Mapbox, Leaflet and HTML code for online map publication - Free but not always easy

This entire module has focused on using free, open-source GIS applications.  In Prepare Week we created and analyzed food deserts in urban Escambia County, and we used the data we had created to make zip files for our food deserts, grocery stores and food oases respectively.  These zip files were imported into our Mapbox style, and symbology was created using ColorBrewer.  We were careful to note the RGB and hex values for the colors, as we needed one set for Mapbox and the hex values for Leaflet.  We grouped our food desert layers in Mapbox and then duplicated them to represent the classification style we had chosen.  I chose 5 manual breaks for ease of interpretation and because I felt it was an accurate representation of the data from Prepare Week.

Once I was satisfied with my Mapbox results, I moved on to working with Leaflet to create a publishable map template.  Using a tutorial from the Leaflet website as an example, we copied the source code into Notepad and then edited the paths, directories and commands as needed to produce both a text file and an HTML file with UTF-8 encoding.  This HTML file was our resulting map.  During the Notepad editing process we chose the Leaflet option from the "Develop with this style" dialog box in Mapbox for the map we had created.  That path was then pasted into the appropriate locations within the Leaflet text so that Leaflet would read the Mapbox URL and create the internal map link.  We then used Leaflet (which uses OpenStreetMap for its underlying basemap) to create our map's opening location, a labeled city pop-up, a circular food oasis and a polygon representing a food desert; the latter two items could be hovered over to see what they represented.  We created a legend within our Notepad code, referring to our hex values for the assigned colors and to our Mapbox map for the associated values.  Our final step was to enable a find feature, which used a geocoding plug-in to locate specific areas on the map by choosing the magnifying-glass symbol and typing in an address or city.
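For anyone who would rather script this than hand-edit the HTML, the same map elements can be roughed out in Python with the folium library (a Python wrapper around Leaflet).  This is only a sketch of what my Notepad-edited Leaflet file does; the coordinates, Mapbox access token and style ID below are placeholders, not the values from my lab.

```python
# Rough folium sketch of the map elements described above.  All coordinates,
# the Mapbox token and the style ID are hypothetical placeholders -- the
# actual lab edited the Leaflet HTML directly in Notepad.
import folium
from folium.plugins import Geocoder

MAPBOX_TOKEN = "YOUR_MAPBOX_ACCESS_TOKEN"  # placeholder
STYLE_URL = (
    "https://api.mapbox.com/styles/v1/your_user/your_style_id/tiles/256/"
    "{z}/{x}/{y}@2x?access_token=" + MAPBOX_TOKEN
)

# Map opening location, centered roughly on Pensacola, FL, using the Mapbox style
m = folium.Map(location=[30.44, -87.21], zoom_start=12,
               tiles=STYLE_URL, attr="Mapbox")

# Labeled city pop-up
folium.Marker([30.4213, -87.2169], popup="Pensacola, FL").add_to(m)

# Circular food oasis (hover to see the tooltip); radius is in meters
folium.Circle([30.47, -87.23], radius=800, color="#1a9641",
              fill=True, tooltip="Food oasis").add_to(m)

# Polygon representing a food desert
folium.Polygon([[30.45, -87.26], [30.46, -87.24], [30.44, -87.23]],
               color="#d7191c", fill=True, tooltip="Food desert").add_to(m)

# Find feature: folium's wrapper around a Leaflet geocoder control
Geocoder().add_to(m)

m.save("webmapsource.html")
```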

Here is the link to my published map:  http://students.uwf.edu/rh51/GIS4930_SpecialTopics/leaflet/webmapsource.html

Happy Belated GIS Day

I had spectacular hopes of doing this event while I was at my work conference after our "conference day" was done.  Clearly that was a silly idea, as each day we were provided with a different evening event which included dinner.  Convincing non-GIS folks to skip dinner, or pay for it elsewhere so we had a location where I could set up my computer and not have to shout over 1,800+ people, was out of the question.  When I returned home, with the flu I might add, I used my spouse and some other family members who were in town to listen as I reviewed my blogs with them.  I think they especially liked the surgical mask I wore to prevent spreading my germs.  Thank goodness for the 72" TV, as I could keep my distance from them and project the laptop to the TV.  I knew one day I could rationalize that purchase!  Anyhow, I spent a few hours going over the maps we have been creating from the beginning of this program forward.  Most of them had never heard of GIS, so it was exciting to see the spark when they made connections between what they see in the real world and how it comes together.  My brother, who lives in Manhattan, even got in on the mix, as he had mailed me a political map from the New York Times and asked if it was done with GIS.  I had previously tortured him with my classwork when he last came to visit.  That was also a nice feature to show everyone.  They were all very interested in how GIS aids in preventing and responding to both natural and terrorist disasters, as well as the effective use of statistical data to make predictions that help police fight crime or determine better locations for new stations.  One aunt was particularly interested in the map we did on wine consumption in Europe; I believe she is now considering relocating abroad, lol.  Overall the event went well.  They were all excited to see what I've been doing locked away in the home office for the past year.  Because we live in South Florida, they liked the segment on hurricane tracking and analysis as well as the exercise on damage assessment.  They had all assumed the TV studio simply had map backdrops that were animated with graphics; they had not made the connection that data from tracking aircraft and other sources is used to build a GIS model of the event that can be depicted pictorially.  I think they all left with a deeper appreciation for GIS and for those of us interested in continuing to use and develop new concepts for mapping.

Please see the attachment for one of the maps I explained using the big screen.


Sunday, November 13, 2016

Utilizing open source GIS software (QGIS) in conjunction with the ArcMap Analysis Toolbox Near tool to evaluate the presence of Food Deserts in Urban Escambia County

The initial part of this project gave us an opportunity to practice using QGIS so that we could import data, symbolize layers and compose maps.  Also included were the steps to create essential map elements.  This portion of the lab was a bit clunky since the software was new to me, the directions didn't always match the options on my screen, and it was very glitchy.  I upgraded to the newest version rather than the one we were required to download at the beginning of the semester.  This reduced quite a number of glitches, but it did increase the difficulty of following the lab directions, as many software improvements had been made and the terminology was slightly different.  Nonetheless, by the time I finished Portion A I was confident I could tackle Portion B.  Portion B focused on how urban sprawl and grocery store locations affect a community's ability to conveniently access food, especially without motorized transportation.  I created a map illustrating areas that contained food deserts (no grocery store within one mile) and food oases.  I also included a brief explanation of the terminology as well as the statistical summary.
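As a reference for the distance check behind the food desert map, here is a minimal arcpy sketch of a Near-based workflow.  The geodatabase path and feature class names are hypothetical stand-ins for the lab data, and it assumes a feet-based projected coordinate system (so one mile = 5,280 ft); the symbology and map composition were done in QGIS afterward.

```python
# Minimal arcpy sketch of the distance check behind the food-desert map.
# Paths and feature class names are hypothetical placeholders.
import arcpy

arcpy.env.workspace = r"C:\GIS\FoodDeserts.gdb"  # placeholder path

# Compute the distance from each urban census block centroid to the nearest
# grocery store; the Near tool writes the result to a NEAR_DIST field.
arcpy.Near_analysis("urban_block_centroids", "grocery_stores")

# Flag food deserts: any location farther than one mile from the nearest
# grocery store (5280 ft, assuming the coordinate system's unit is feet).
arcpy.AddField_management("urban_block_centroids", "FOOD_DESERT", "SHORT")
with arcpy.da.UpdateCursor("urban_block_centroids",
                           ["NEAR_DIST", "FOOD_DESERT"]) as cur:
    for near_dist, _ in cur:
        cur.updateRow([near_dist, 1 if near_dist > 5280 else 0])
```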


Saturday, November 5, 2016

Module 10 - Supervised Classification of Germantown, MD

The first portion of this lab allowed us to experiment with a supervised classification.  We used tools such as the signature editor, imported or created an AOI layer, used UTM coordinates to hone in on classification types, and used the polygon method versus the grow / grow properties option to "train" the classification tool.  After our first attempt we used the histogram plots and mean plots to evaluate the results of our spectral signatures.  We set the signature colors to approximate true colors using a band combination that would have the least spectral confusion, then applied and saved our signature file.  Next we used the Supervised Classification tool to classify our image and merged multiple classes of a similar nature to narrow down our actual classes.  Once complete, we generated a distance image and a recoded image, then used the attribute table for the recoded image to add our class names and calculate the area.  It took me a few tries to get good results from the entire process.  I then moved on to the actual assignment, which was to perform a supervised classification of Germantown, MD; my end result follows the conceptual sketch below.
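For anyone curious what the classifier is conceptually doing, here is a small numpy sketch of a minimum-distance-to-means supervised classification, one of the decision rules Imagine offers.  The image array, class signatures and 30 m pixel size are hypothetical placeholders; the actual lab was done entirely through the ERDAS Imagine interface.

```python
# Conceptual numpy sketch of a minimum-distance-to-means supervised
# classification.  The arrays below are hypothetical placeholder data.
import numpy as np

# image: rows x cols x bands of reflectance values (placeholder random data)
rng = np.random.default_rng(0)
image = rng.random((100, 100, 6))

# Mean spectral signature per class, e.g. collected from AOI training
# polygons (rows = classes, columns = bands).
signatures = np.array([
    [0.10, 0.12, 0.15, 0.40, 0.35, 0.20],   # forest
    [0.25, 0.28, 0.30, 0.35, 0.45, 0.40],   # grass / agriculture
    [0.35, 0.38, 0.40, 0.42, 0.50, 0.55],   # urban / impervious
    [0.05, 0.06, 0.05, 0.03, 0.02, 0.01],   # water
])

# Euclidean spectral distance from every pixel to every class mean.
pixels = image.reshape(-1, image.shape[-1])                  # (n_pixels, bands)
dists = np.linalg.norm(pixels[:, None, :] - signatures[None, :, :], axis=2)

classified = dists.argmin(axis=1).reshape(image.shape[:2])   # class image
distance_img = dists.min(axis=1).reshape(image.shape[:2])    # distance image

# Approximate area per class (assuming 30 m Landsat pixels -> 900 m^2 each)
pixel_area_m2 = 30 * 30
for class_id, count in zip(*np.unique(classified, return_counts=True)):
    print(f"class {class_id}: {count * pixel_area_m2 / 1e6:.2f} km^2")
```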

Friday, November 4, 2016

Project 3 Analyze Week - Using independent variables and OLS regressions to predict methamphetamine lab locations

We were given extra time for this lab and I can see why!  We began with 31 independent variables and ran 20 OLS regressions, each time removing one variable and analyzing its effect on the final outcome.  Before illustrating the final OLS regression result I want to describe my methodology.

The initial three checks were used to determine whether the independent variables were helping or hurting the model, whether the relationships were in line with the expected results, and whether there were redundant explanatory variables.  The first item evaluated was each variable's probability; ideally this value should be as small as possible to indicate statistical significance, and we used a cut-off of >0.4 for removing independent variables.  The next check was the Variance Inflation Factor (VIF), which indicates whether multiple variables affect the model in a redundant way; in this case, we set the baseline for removal candidacy at >7.5.  The third check was the variable's coefficient.  A strongly positive or strongly negative coefficient is an indicator of the relationship between the dependent and the independent variable, while numbers near zero (absolute value less than 1) indicate that the variable has minimal effect on the model and may need to be removed.  By simultaneously analyzing these three criteria at each iteration, the impact which each removed independent variable had on the model could be determined.  In some cases, one may need to return a removed variable to the regression even if it first appeared unimportant, as each iteration produces new results which affect all variables.

After twenty iterations, the remaining checks of the six were employed.  Check 4 determined whether the model indicated bias; bias here refers to non-linear trends, outliers or skewed results.  Conveniently, the analysis results within each OLS run provided the Jarque-Bera statistic, which is a check for bias: if its p-value was <0.05 and flagged with an asterisk, that was an indicator of bias.  By employing scatter plots and graphs, the independent variables potentially causing the bias could be identified and adjusted within the next regression; combining these tools with the ability to visualize the histograms made the potential issues quickly identifiable and suggested how to reevaluate the OLS runs to improve results.  Check 5 was used to confirm that important independent variables had not been removed.  By examining the standardized residual values within the map generated by the OLS routine, the range of results could be identified graphically; ideally a range between -0.5 and +0.5 indicates an accurate prediction.  It is important to note that a negative standardized residual means the model predicted fewer meth lab locations than were identified in the original data, while a positive standardized residual means the model predicted more locations than were identified.  The final check, check 6, reviewed the model's ability to predict the dependent variable (in this case meth lab density) by looking at the R-squared value: the higher the value, the more of the variation in the dependent variable the model explains.
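To make a single iteration of this workflow concrete, here is a sketch of one OLS pass scripted with arcpy, flagging removal candidates using the thresholds described above.  The geodatabase, feature class, field names and explanatory-variable list are hypothetical stand-ins for the lab data, and the coefficient-table field names are assumptions that should be checked against the actual tool output.

```python
# One iteration of the OLS workflow, sketched with arcpy's Spatial Statistics
# OLS tool.  All dataset and field names below are hypothetical placeholders.
import arcpy

arcpy.env.workspace = r"C:\GIS\MethLabs.gdb"  # placeholder

explanatory = ["MED_INCOME", "PCT_RENTER", "POP_DENSITY", "PCT_VACANT"]

arcpy.OrdinaryLeastSquares_stats(
    "tracts",                   # input feature class
    "TRACT_ID",                 # unique ID field
    "ols_residuals",            # output feature class (standardized residuals)
    "METH_DENS",                # dependent variable: meth lab density
    ";".join(explanatory),      # explanatory (independent) variables
    "ols_coeffs",               # coefficient output table
    "ols_diag",                 # diagnostic output table (Jarque-Bera, R-squared)
)

# Checks 1-3: flag removal candidates using the lab's thresholds
# (p-value > 0.4, VIF > 7.5, |coefficient| < 1).  Field names are assumed
# from the coefficient output table; verify them against the actual output.
with arcpy.da.SearchCursor("ols_coeffs", ["Variable", "Coef", "Prob", "VIF"]) as cur:
    for var, coef, prob, vif in cur:
        if prob > 0.4 or vif > 7.5 or abs(coef) < 1:
            print(f"candidate for removal: {var} "
                  f"(coef={coef:.3f}, p={prob:.3f}, VIF={vif:.2f})")
```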


The resulting map from the final regression is: