Wednesday, December 7, 2016

Final Project ~ Analysis of Food Deserts within the City of Port St. Lucie, FL

The most challenging part of this project for me was using all the combinations of the open source software; ESRI has me spoiled.  After analyzing the data I have to again say I was surprised by the results, but I certainly stand behind them.  I have learned that what may be a food oasis to me could be a desert for someone else within my same community due to their age, income, lack of transportation, or any combination thereof.  Because I live in a well-developed area in the heart of the city with access to many things, including grocery stores, I did not expect to find the number of food deserts illustrated on the map.  Once I studied the locations it made perfect sense: over the years the city has annexed more and more properties, and outside contractors had begun development.  When the economy began to fail several years ago, development stopped, including commercial entities such as grocery stores which were planned in proximity to many of the developments.  I’m happy to say that trend is beginning to turn, so I expect that the food deserts will be reduced over the next few years.

Here is a link to my Leaflet Map (best viewed on a large monitor to avoid panning):


This is the link to my final PowerPoint project:

Sunday, December 4, 2016

Project 4 Analyze Week 2 ~ Food Desert Analysis in the City of Port Saint Lucie, Florida, using both ArcDesktop and open source software (QGIS, Mapbox and Leaflet)

This week combined what we learned in prepare week about food deserts and QGIS with Analyze Week 1, where we utilized Mapbox and Leaflet.  The challenge this week was to select our own city and combine the skills we learned in all of the software components to produce a final map in Leaflet, viewable via a link directed to our I:/ drive.  Here is the link to my Leaflet map:  http://students.uwf.edu/rh51/GIS4930_SpecialTopics/leaflet/leaflet_rh.html
Here is an image of the map, but I recommend using the Leaflet link to access the geocoder and zoom functions:


The tiling process essentially takes a continuous image and breaks it into portions known as tiles.  Each tile is about 256x256 pixels, and since tiles are placed side by side they appear as a seamless map to the web user.  By combining tiles and zoom levels, the process is very effective for use in web mapping applications.  Tiled maps cache efficiently, so your web browser can quickly recreate them from the cache instead of downloading them each time you want to view the map.  Because tiles load progressively from the outside in, you can zoom in to a general area and the map will catch up, since the loading process is not interrupted.  Tiles are also simple to use and formatted for numerous technologies such as servers, the web, desktops and even mobile devices.
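To make the relationship between zoom levels and tiles concrete, here is a small Python sketch of the standard "slippy map" tile math (purely illustrative, not part of the lab): it converts a latitude/longitude and zoom level into the x/y index of the 256x256 tile that contains that point.

import math

def lat_lon_to_tile(lat_deg, lon_deg, zoom):
    """Return the (x, y) indices of the tile containing a point at a given zoom level."""
    lat_rad = math.radians(lat_deg)
    n = 2 ** zoom  # number of tiles along each axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# Example: a point near the center of Port St. Lucie, FL at zoom level 12
print(lat_lon_to_tile(27.27, -80.35, 12))

Each additional zoom level doubles n, so the tile count quadruples, which is why caching and progressive loading matter so much for web maps.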

As part of our lab we provided answers to some excellent "food for thought" questions.  I have included those here for parties interested in more detail about the map-making process.
1.       Explain your data including the source, processing required (did you create it? How?), and information about the quality/credibility of your food deserts and grocery stores.  
Initially I downloaded as much data as I could think to get.  I prefer to start with more, evaluate the data, and then choose what best fits the project.  I used a combination of data from the census.gov TIGER site as well as FGDL.  Because I decided to use the Basic style provided in Mapbox, I did not need the road files I had previously downloaded, nor any information regarding bodies of water.  The data that became most helpful was the 2010 census block data, the city limit polygon for the City of Port Saint Lucie, county boundary polygons, and grocery store locations from Google Earth exported to a KML file.

Once I determined the data I intended to use, I selected a coordinate system available in both ArcDesktop and QGIS, as QGIS did not contain the Florida State Plane East projection I initially wanted to use.  I settled upon UTM Zone 17N (EPSG:26917), which was the best choice given my options in QGIS.  Once this determination was made, I performed a batch project on my shapefiles in ArcCatalog and placed them in a UTM17N folder.  This became my working folder, from which I used QGIS to create my census block clips to eliminate the riparian boundary portions of the properties.  I converted the KML to a shapefile in ArcMap and added my other files within UTM17N to review my data and confirm it was all in the correct projection.  I also made sure to set the projection in QGIS.

Within QGIS, other data needed to be created from existing datasets.  I clipped the census blocks for the entire state to the City of Port St. Lucie shapefile and defined the result as my study area, then created polygon centroids for the study area.  In ArcMap I ran the Near tool, using the imported polygon centroids as the input features and my grocery stores as the near features; since the City of Port St. Lucie is an urban area, I used a search radius of 1 mile.  I opened the centroid .dbf and saved it as near.csv using Excel.  I was then able to add this layer to QGIS using Add Delimited Text Layer, making sure to choose no geometry before selecting OK.  I then joined near.csv to my study area using an Add Vector Join, with STFID as both the join and target field, and made sure to cache the join layer in virtual memory.  I could then use the attribute table to select features using an expression: any near distance equal to -1 was a food desert, and by inverting the selection the remaining features were food oases.  I also needed to create some basic statistics using the food desert layer and POP2000 as the target field.

Once all my data was organized and symbolized, I chose the Basic style in Mapbox, imported my tilesets, created my layers, and applied symbolization and other changes to the basic style.  I recorded the color scheme I wanted to use via ColorBrewer in both HEX and RGB, with 5 classes, for later use.  Within Mapbox I grouped my food desert layer and then duplicated it until I had 5 copies.  I set the parameters for my 5-class Jenks natural breaks classification (determined in ArcMap) and used the RGB information from ColorBrewer to delineate the food desert breakdown.  I made sure to record my map position (center lat/long and zoom level) so that I could later use this information in my Leaflet code, first as Notepad text and then saved as HTML with UTF-8 encoding when complete.  Once all Mapbox edits were complete, I published my Mapbox style and obtained the Leaflet URL to add to my Leaflet code.
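For readers who prefer code to menus, here is a minimal arcpy sketch of the Near step described above (file names and paths are hypothetical; the actual lab work was done through the ArcMap and QGIS interfaces).

import arcpy

arcpy.env.workspace = r"C:\Data\Analyze2\UTM17N"  # hypothetical working folder

# Distance from each census block centroid to the nearest grocery store,
# limited to a 1-mile search radius (the urban-area threshold used in the lab).
arcpy.Near_analysis("block_centroids.shp", "grocery_stores.shp", "1 Miles")

# Centroids left with NEAR_DIST = -1 had no store within 1 mile: food deserts.
deserts = arcpy.MakeFeatureLayer_management(
    "block_centroids.shp", "food_deserts", "NEAR_DIST = -1")
print(arcpy.GetCount_management(deserts))  # number of food desert blocks

Inverting the NEAR_DIST = -1 selection gives the food oases, exactly as described in the QGIS expression step above.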
I edited all the required information in my Leaflet code: the title, set view coordinates and zoom level, map size, geocoder coordinates representing the center of the city and, of course, pointing all file paths to the correct location.  (Since I work from my C drive, they were locally pathed to the leaflet folder within the Analyze2 folder inside my Data folder.)  I also confirmed that my leaflet_rh.html link worked properly and that my information displayed correctly.  I feel that my food desert results are fairly accurate, as I used the most up-to-date data available.  Also, as a local resident and a city worker privy to development, I was able to determine that the pattern fit rather well.  The food deserts may be slightly overestimated, as we have had recent growth and several new grocery stores built that Google Earth has not yet added to its locations, but all in all I’d give the results at minimum a 90% rating.

2.       Describe the data being represented. Are there trends you can identify?  If you used your local town to map, did you expect to see food deserts occurring where they are?  Does the data surprise you?

I used my local town, the City of Port Saint Lucie.  The data illustrated in my Leaflet map shows the locations of existing grocery stores (obtained from Google Earth) and the locations of food deserts, overlaid on a base map to visualize where they fall within the city.  Because I live in a well-developed area in the heart of the city with access to many things, including grocery stores, I did not expect to find the number of food deserts illustrated on the map.  Once I studied the locations it made perfect sense: over the years the city has annexed more and more properties, and outside contractors had begun development.  When the economy began to fail several years ago, development stopped, including commercial entities such as grocery stores which were planned in proximity to many of the developments.  I’m happy to say that trend is beginning to turn, so I expect that the food deserts will lessen over the next few years.

Thursday, December 1, 2016

Week 14/15 Group 1 GIS Certification online Portfolio

This week’s portfolio experience was much easier than I expected.  I was dreading pulling everything together from all of our semesters and incorporating it with my resume and experience.  The Wix site (www.wix.com) was an excellent free resource which I would recommend to others who want to post a quick and easy online portfolio.  The link can be placed on your paper resume or included in your electronic submission, as many applications now require this method.  You can link objects within it so that users can explore as much or as little information as they would like.  Here is a link to what I produced:

Thursday, November 24, 2016

Analyze Week 1 QGIS results for use with Mapbox, Leaflet and HTML code for online map publication - Free but not always easy

This entire module has focused on using free, open source GIS applications.  In prepare week we created and analyzed food deserts in urban Escambia County.  We used the data we had created to make zip files for our food deserts, grocery stores and food oases, respectively.  These zip files were imported into our Mapbox style and symbology was created using ColorBrewer.  We were careful to note the RGB and HEX values for the colors, as we needed one set for Mapbox and the HEX values for Leaflet.  We grouped our food desert layers in Mapbox and then duplicated them to represent the classification style we had chosen.  I chose to use 5 manual breaks for ease of interpretation and what I felt to be an accurate representation of the data from prepare week.  Once I was satisfied with my Mapbox results, I moved on to working with Leaflet to create a publishable map template.  Using a tutorial from their website as an example, we copied the source code to Notepad and then edited the paths, directories and commands as needed to produce both a text file and an HTML file with UTF-8 encoding.  This HTML file was our resulting map.  During the Notepad editing process we chose the Leaflet option from the Develop with this style dialog box in Mapbox for the map we had created.  This URL was then pasted into the appropriate locations within the Leaflet Notepad text so that Leaflet would read the Mapbox URL and create the internal map link.  We then used Leaflet (which uses OpenStreetMap for its underlying basemap) to create our map opening location, a labeled city pop-up, a circular food oasis and a polygon representing a food desert; the latter two items could be hovered over to see what they represented.  We created a legend within our Notepad code and referred to our HEX values for the assigned colors and our Mapbox map for the associated values.  Our final step was to enable a find feature which utilized a geocoding plug-in to locate specific areas on the map by choosing the magnifying glass symbol and typing in an address or city.

Here is the link to my published map:  http://students.uwf.edu/rh51/GIS4930_SpecialTopics/leaflet/webmapsource.html

Happy Belated GIS Day

I had spectacular hopes of doing this event while I was at my work conference, after our “conference day” was done.  Clearly that was a silly idea, as each day we were provided with a different evening event which included dinner.  Convincing non-GIS folks to skip dinner, or pay for it elsewhere, so that I would have a place to set up my computer and not have to shout over 1,800+ people was out of the question.  When I returned home, with the flu I might add, I gathered my spouse and some other family members who were in town to listen as I reviewed my blogs with them.  I think they especially liked the surgical mask I wore to prevent spreading my germs.  Thank goodness for the 72” TV, as I could keep my distance from them and project the laptop to the TV; I knew one day I could rationalize that purchase!  Anyhow, I spent a few hours going over the maps we have been creating from the beginning of this program forward.  Most of them had never heard of GIS, so it was exciting to see the spark when they made connections between what they see in the real world and how it comes together.  My brother, who lives in Manhattan, even got in on the mix, as he had mailed me a political map from the New York Times and asked if it was done with GIS.  I had previously tortured him with my classwork when he last came to visit.  That was also a nice feature to show everyone.  They were all very interested in how GIS aids in preventing and responding to both natural and terrorist disasters, as well as the effective use of statistical data to make predictions that help police fight crime or determine better locations for new stations.  One aunt was particularly interested in the map we did on wine consumption in Europe; I believe she is now considering relocating abroad, lol.  Overall the event went well.  They were all excited to see what I’ve been doing locked away in the home office for the past year.  Because we live in South Florida, they liked the segment on hurricane tracking and analysis as well as the exercise on damage assessment.  They had all assumed the TV studio simply had map backdrops that got animated; they had not made the connection that data from tracking planes and other sources is used to build a GIS model of the event that can be depicted pictorially.  I think they all left with a deeper appreciation for GIS and for those of us interested in continuing to use and develop new concepts for mapping.

Please see the attachment for one of the maps I explained using the big screen.


Sunday, November 13, 2016

Utilizing open source GIS Software (QGIS) in Conjunction with ArcMap Analysis Toolbox Near Tool to evaluate the presence of Food Deserts in Urban Escambia County

The initial part of this project gave us an opportunity to practice using QGIS so that we could import data, symbolize layers and compose maps.  Also included were the steps to create essential map elements.  This portion of the lab was a bit clunky: the software was new, the directions didn't always match the options on my screen, and it was very glitchy.  I upgraded to the newest version rather than the one we were required to download at the beginning of the semester.  This reduced quite a number of glitches, but it did increase the difficulty of following the lab directions, as many software improvements had been made and the terminology was slightly different.  Nonetheless, by the time I finished Portion A I was confident I could tackle Portion B.  That portion focused on the effects of urban sprawl and grocery store locations as they affect the community's ability to conveniently access food, especially without motorized transportation.  I created a map illustrating areas that contained food deserts (no grocery store within one mile) and food oases.  I also included a brief explanation of the terminology as well as the statistical summary.


Saturday, November 5, 2016

Module 10 - Supervised Classification of Germantown, MD

The first portion of this lab allowed us to experiment with a supervised classification.  We used tools such as the Signature Editor, imported or created an AOI layer, used UTM coordinates to home in on classification types, and used the polygon method versus the grow/grow properties option to "train" the classification tool.  After our first attempt we used the histogram plots and mean plots to evaluate the results of our spectral signatures.  We set the signature colors to approximate true colors using a band combination that would have the least spectral confusion.  We then applied and saved our signature file and used the Classify-Supervised tool to classify our image.  Next we merged multiple classes of a similar nature to narrow down our actual classes.  Once complete, we generated a distance image and a recoded image.  We then used the attribute table for the recoded image to add our class names and calculate the area.  It took me a few tries to get good results from the entire process.  I then moved on to the actual assignment, which was to perform a supervised classification of Germantown, MD.  Here is my end result:

Friday, November 4, 2016

Project 3 Analyze Week - Using independent variables and OLS regression to predict methamphetamine lab locations

We were given extra time for this lab and I can see why!  We began with 31 independent variables and ran 20 OLS regressions, each time removing one variable and analyzing its effect on the final outcome.  Before illustrating the final OLS regression result I want to outline my methodology.  The initial three checks were used to determine whether the independent variables were helping or hurting the model, whether the relationships were in line with the expected results, and whether there were redundant explanatory variables.  The first item evaluated was the probability; ideally this value should be as small as possible to indicate statistical significance, and we used a cut-off of >0.4 for removing independent variables.  The next check was the Variance Inflation Factor (VIF), which indicates whether multiple variables affect the model in a similarly redundant way; in this case, we set the threshold for removal candidacy at >7.5.  The third check was the variable’s coefficient.  A strongly positive or strongly negative coefficient is an indicator of the relationship between the dependent and the independent variable; values near zero (absolute value less than 1) indicate that the variable has minimal effect on the model and may need to be removed.  By simultaneously analyzing these three criteria at each iteration, the impact of each removed independent variable on the model could be determined.  In some cases, one may need to return a removed variable to the regression even if it first appeared unimportant, as each iteration produces new results which affect all of the variables.

After twenty iterations, the remaining checks of the six were employed.  Check 4 determined whether the model indicated bias; here bias represents non-linear trends, outliers or skewed results.  Conveniently, the results of each OLS run include the Jarque-Bera statistic, which is a check for bias: if its p-value was <0.05 and flagged with an asterisk, this indicated bias.  By employing scatter plots and graphs, the independent variables potentially causing the bias could be identified and adjusted within the next regression; combining these tools with the histograms made the potential issues quickly identifiable and suggested re-running the OLS routine to improve results.  Check 5 was used to confirm that important independent variables had not been removed.  By examining the standardized residual values within the map generated by the OLS routine, the range of results could be identified graphically; ideally a range between -0.5 and +0.5 indicates an accurate prediction.  It is important to note that a negative standard residual means the model predicted fewer meth lab locations than were identified in the original data, while a positive standard residual means the model predicted more.  The final check, check 6, reviewed the model’s ability to predict the dependent variable (in this case meth lab density): the higher the R-squared value, the more accurate the model.
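Because the screening thresholds above (probability > 0.4, VIF > 7.5, absolute coefficient < 1) are simple table look-ups, they are easy to automate.  Below is a hedged arcpy sketch of how one pass could be scripted; the file, field and variable names are hypothetical, the coefficient-table field names may vary by ArcGIS version, and in the lab these checks were actually done by reading the OLS report by hand.

import arcpy

arcpy.env.workspace = r"C:\Data\MethLabs"  # hypothetical workspace

candidates = ["PCT_POVERTY", "PCT_RENTERS", "MED_INCOME"]  # example variables only

# One OLS run; the optional coefficient table holds probability, VIF and coefficients.
arcpy.OrdinaryLeastSquares_stats(
    "block_groups.shp", "UNIQUE_ID", "ols_result.shp",
    "METH_DENSITY", ";".join(candidates),
    "ols_coefficients.dbf", "ols_diagnostics.dbf")

# Flag variables that fail any of the three screening checks (field names approximate).
with arcpy.da.SearchCursor("ols_coefficients.dbf",
                           ["Variable", "Prob", "VIF", "Coef"]) as rows:
    for name, prob, vif, coef in rows:
        if vif is None:  # the intercept row carries no VIF
            continue
        if prob > 0.4 or vif > 7.5 or abs(coef) < 1:
            print("Candidate for removal:", name)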


The resulting map from the final regression is:

Thursday, October 27, 2016

Lab 9 Unsupervised classification using ERDAS - UWF high resolution .sid image using only visible bands of light (RGB)

This lab provided an exercise in both ArcGIS and ERDAS Imagine to perform an unsupervised classification.  The deliverable component came from the ERDAS Exercise 2 portion of the lab.  We utilized the Unsupervised tool within the Raster tab's Classification group, set the number of classes to 50, accepted the approximate True Color for the color scheme, and assigned Red as 3, Green as 2 and Blue as 1; we also used a maximum of 25 iterations, a convergence threshold of 0.950, and skip factors of 2 for X and Y.  Our next task was to reclassify the 50 classes we had just created via the attribute table.  We selected known items in the image, which were then highlighted in the table; we then set the Class_Name field to one of our 5 categories (trees, buildings/roads, grass, shadow and mixed) and changed the color of the new class to something appropriate.  We repeated this process until we were left with items which were difficult to discern and placed those in the mixed category.  We then turned on the original .sid image and used a combination of the Swipe, Flicker, Blend and Highlight tools: by first selecting an unclassified item in the attribute table and changing its color to something distinct and bright, the tools helped identify the item so it could be properly reclassed.  This process was repeated until all 50 classes had been reclassified and assigned the required color.  At that point we used the Recode button (Raster tab, Thematic group) to merge our 50 classes into 5 for our final product, and the new recoded image was saved.  The image was imported into my ArcMap geodatabase and a final map created.  We also added a new Area column to the attribute table; the areas were then summed and used to develop permeable and impermeable acreages and, subsequently, percentages.  This information was included in the final map.
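The acreage bookkeeping at the end of that workflow is simple enough to script.  A minimal arcpy sketch is below; the path, field names and permeable/impermeable grouping are hypothetical, and it assumes the recoded image's attribute table can be read directly with a cursor.

import arcpy

recode = r"C:\Data\Lab9\uwf_recode.img"  # hypothetical recoded image
permeable_classes = {"Trees", "Grass", "Mixed"}  # hypothetical grouping

# Sum the Area field by class name from the raster attribute table.
areas = {}
with arcpy.da.SearchCursor(recode, ["Class_Name", "Area"]) as rows:
    for class_name, area in rows:
        areas[class_name] = areas.get(class_name, 0) + area

total = sum(areas.values())
permeable = sum(a for c, a in areas.items() if c in permeable_classes)
print("Permeable: %.1f%%  Impermeable: %.1f%%"
      % (100 * permeable / total, 100 * (total - permeable) / total))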

Sunday, October 23, 2016

Lab 8 - Thermal and Multispectral Analysis using ERDAS and ArcMap

      After reviewing the use of histograms and learning to combine multiple images, in this case obtained using Landsat ETM+, we practiced converting the multiple images into a composite using the Composite Bands tool in ArcMap or the Layer Stack tool in ERDAS.  We also reviewed items from past lectures and applied them during our analysis phase.  This included the use, in ERDAS, of multiple views, histogram editing, and the Discrete DRA tool.  We used symbology in ArcMap and altered color band layer associations in ERDAS during our multispectral analysis.
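For anyone who would rather script the composite step than run the tool dialog, a short arcpy equivalent (hypothetical file names) looks like this:

import arcpy

# Stack the eight ETM+ bands into a single composite raster (paths are hypothetical).
bands = [r"C:\Data\Lab8\ETM_band%d.tif" % b for b in range(1, 9)]
arcpy.CompositeBands_management(";".join(bands), r"C:\Data\Lab8\ETM.gdb\ETMComposite")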
      I noticed an area in the mid-western portion of the image which caught my eye due to both its shape and coloration.  Using the ETM composite image I created from ETM layers 1-8, I had the basic image ready to adjust.  Again, I prefer to work in a geodatabase, so I imported each layer into ETM.gdb and stored the ETMComposite there as well.  While running various comparisons made different areas of the image stand out, using the stretched symbology and reviewing each layer, I visually prefer the multispectral imagery.  I set Red to layer 4, Green to layer 2 and Blue to layer 6 (the thermal layer).  As I panned around the image I noted a feature south of Guayaquil which was within an urban area, oval in shape with a bright green outline and a dark red center.  This became my AOI.
I noted that by using the stretched symbology the object was most visible in layers 1-3.  Band 6 illustrated central red blurs fading to yellows and then surrounding blues.  I knew the red blurs were potential hot spots, assumed the fading to yellow was a blend of greens and blues, and that the outer, more distant blues were urban areas.  By switching to the RGB composite symbology described in the first paragraph, the shape and heat signature became much clearer.  I suspected that this could be some sort of civic area or sports arena.  Knowing the incredible following of Fútbol (soccer), I began to lean toward this area being a stadium.  The coordinate value obtained from the information icon in ArcMap (Coordinate System WGS 1984 UTM Zone 17N, Projection Transverse Mercator, Linear Unit Meter, Angular Unit Degree, Datum WGS 1984) is 79°55'29.684” W, 2°9.956"S.  I noted this on the map.  Wanting to confirm my theory, I entered those coordinates into Google Earth and discovered that the location was in fact a sports arena known as Estadio Monumental Isidro Romero Carbo, AKA Estadio Banco Pinchincha, which is in the parish of Tarqui in northern Guayaquil, Ecuador.



Friday, October 21, 2016

Project 3 Prepare Week using Statistical Analysis within ArcGIS to analyze existing meth labs and hopefully reveal potential locations for new meth labs

This week our focus was on importing the provided files and then performing various statistical analyses to produce a quality base map that will serve as the basis for the coming weeks.  We created and calculated attribute fields both manually and using a Python script which required minor editing to function.  We used the spatial join function to join shapefiles.  We obtained base map data such as roads, county boundaries, major cities and a state boundary to give our map a complete feel and a focus on our AOI.  This map also included meth lab locations.  We also read scholarly articles to begin what will become our final statistical report at the end of this project.  This is an interesting real-world project: the intent is to pair social and economic statistics with the locations of existing meth lab seizures to find a pattern in where meth labs arise.  This will also allow analysis of environmental impact by adding additional data for comparison, creating both an environmental tool and a law enforcement tool.  Pretty cool stuff.
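A hedged arcpy sketch of the spatial-join and field-calculation steps described above is shown below; the file and field names are hypothetical, and the lab supplied its own Python script for the actual field calculations.

import arcpy

arcpy.env.workspace = r"C:\Data\MethPrepare"  # hypothetical workspace

# Spatially join meth lab points to census tracts; Join_Count holds labs per tract.
arcpy.SpatialJoin_analysis("tracts.shp", "meth_labs.shp", "tracts_labs.shp",
                           "JOIN_ONE_TO_ONE", "KEEP_ALL")

# Add a density field and calculate it from the join count and a tract area field.
arcpy.AddField_management("tracts_labs.shp", "LAB_DENS", "DOUBLE")
arcpy.CalculateField_management("tracts_labs.shp", "LAB_DENS",
                                "!Join_Count! / !AREA_SQMI!", "PYTHON_9.3")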


Tuesday, October 18, 2016

Module 7 - Performing Multispectral analysis and using the NDVI (index) to enhance specific image features

Both the exercises and the final lab were pretty interesting, as they gave us insight into how to utilize the Multispectral tab, NDVI creation and the Panchromatic tab, as well as using histograms and adjusting the breakpoints and the LUT (visible brightness on screen) histogram.  Also worth mentioning is the Discrete Data button, which automatically adjusts breakpoints to create a balanced histogram and produces good results for most, but not all, situations.  We also experimented with a few of the tools found in ArcMap, but after a bit of fiddling with ERDAS, I found that I preferred it over ArcMap.  For our lab we were given three distinct features to locate by using the inquiry button and observing pixel values.  Once each feature was located we applied what we learned in the exercises and then created a subset and chip of the area after selecting the optimum band combination for each map.  Although some maps could have used the same band combination, we were asked to use three distinct band combinations, one for each feature.  Below are the maps I produced in ArcMap based upon the multispectral analysis performed in ERDAS.
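Since the NDVI is just a band ratio, it can also be computed outside the ERDAS dialog.  A hedged arcpy.sa sketch is below; the band paths are hypothetical, and it simply applies the standard NDVI = (NIR - Red) / (NIR + Red) formula.

import arcpy
from arcpy.sa import Raster, Float

arcpy.CheckOutExtension("Spatial")

nir = Raster(r"C:\Data\Module7\scene_band4.tif")  # hypothetical near-infrared band
red = Raster(r"C:\Data\Module7\scene_band3.tif")  # hypothetical red band

# Float() forces floating-point division so NDVI values fall between -1 and 1.
ndvi = Float(nir - red) / Float(nir + red)
ndvi.save(r"C:\Data\Module7\ndvi.img")

arcpy.CheckInExtension("Spatial")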





Thursday, October 13, 2016

Module 6 - Spatial Enhancement

I found the final portion of this lab to be quite difficult.  The exercises were easy to follow and made sense; using those techniques in both ERDAS and ArcMap to clean up striping in a Landsat image proved much more difficult.  Despite the lecture and readings, I still felt far out of my league.  I did additional research online, but failed to grasp the high-level concepts.  I did my best and here is my result:


Sunday, October 9, 2016

Mountain Top Removal - Report Week

This week concludes our study of mountain top removal in the Appalachian Coal Regions of Kentucky, Ohio, Tennessee, Virginia and West Virginia.  I was part of group three, and our focus areas were portions of Kentucky and Ohio.  The map below was produced in ArcMap for the analysis and reporting phase of this project.  That information was then used to create a Group MTR Analysis Map and a final MTR Story Map Journal.


The link to the Journal is:  http://arcg.is/2dLxhfn

Monday, September 26, 2016

Project 2 MTR Analyze week, using ERDAS for unsupervised image classification and ArcMap for final touches. I feel like I actually hiked up a mountain after this lab....

This week we were introduced to SkyTruth, a technologically savvy non-profit organization which promotes environmental awareness by using remote sensing and digital mapping technology.  I highly recommend checking them out!  We practiced with some exploration within ERDAS and ArcMap, identifying MTR areas on imagery using a supervised classification method.  For the main part of our analyze lab, we used our individual Landsat section, which consisted of 7 rasters (bands 1-7), and performed an unsupervised classification in ERDAS.  With untrained eyes it is very easy to misinterpret areas, but I applied some similarities I had seen in the explore image and did my best.  We created 50 total classes and designated each as MTR or NONMTR.  We then used ArcMap to reclassify the image to show only the MTR regions.  I created a quick exhibit to illustrate my findings in conjunction with the work we had completed in prepare week.
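The ArcMap step that keeps only the MTR regions can also be expressed in a couple of lines of arcpy.sa.  This is a sketch only; the path is hypothetical and it assumes the MTR class was recoded to a value of 1.

import arcpy
from arcpy.sa import Raster, SetNull

arcpy.CheckOutExtension("Spatial")

classified = Raster(r"C:\Data\MTR\landsat_unsup50_recode.img")  # hypothetical path

# Keep only cells classed as MTR (assumed value 1); everything else becomes NoData.
mtr_only = SetNull(classified != 1, classified)
mtr_only.save(r"C:\Data\MTR\mtr_only.img")

arcpy.CheckInExtension("Spatial")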


Saturday, September 24, 2016

Module 5a Intro to ERDAS Imagine and Digital Data - Producing a class map within ArcMap

This week we learned the basics of the clearly powerful ERDAS Imagine software.  Our ultimate goal was to export a portion of a .tif file of an area within Washington State to an image file for use in ArcMap.  I chose to use a portion of Jefferson County, Washington, that showed several different classes within a small area.  This information was brought into ArcMap, and symbology was used to add a description of each class area in hectares so that it could be both a permanent part of the map and depicted in the legend.


Friday, September 23, 2016

MTR Prepare Week-----not exactly the way I like to move mountains

Our prepare week map utilized our skills to take four DEMs, mosaic them, and then create a hydrology dataset.  From this dataset we ultimately produced a stream shapefile (polyline) and a basin shapefile (polygon).  We added the site boundary, as this project (due to its size) has been divided into groups.  As group three, our focus for next week will be portions of the Appalachians in Kentucky and Ohio.  For this map I illustrated the entire site area, and in order to give it some location perspective I obtained U.S. Census data to illustrate the counties and states which fall within the site area.  The streams and basins were identifiable in the legend, and the basin color-coding helped our group three area stand out from the rest of the project area.
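For anyone curious what that mosaic-to-streams workflow looks like in script form, here is a hedged arcpy.sa sketch; the paths, DEM names and accumulation threshold are hypothetical, and the lab itself used the standard toolbox dialogs.

import arcpy
from arcpy.sa import Fill, FlowDirection, FlowAccumulation, Con, Basin, StreamToFeature

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Data\MTRPrepare"  # hypothetical workspace

# Mosaic the four DEM tiles into one surface.
arcpy.MosaicToNewRaster_management("dem1;dem2;dem3;dem4", arcpy.env.workspace,
                                   "dem_mosaic", number_of_bands=1)

filled = Fill("dem_mosaic")            # remove sinks
fdir = FlowDirection(filled)           # flow direction from each cell
facc = FlowAccumulation(fdir)          # contributing cell counts

streams = Con(facc > 1000, 1)          # hypothetical accumulation threshold
StreamToFeature(streams, fdir, "streams.shp")                # stream polylines
arcpy.RasterToPolygon_conversion(Basin(fdir), "basins.shp")  # basin polygons

arcpy.CheckInExtension("Spatial")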


We also created a story map for the six basic steps regarding mountain top removal: http://arcg.is/2dLw3kn

Our final step was to begin our MTR Story Map Journal which is a work in progress to be finalized with report week.  It consists of placeholders and some image links which would not cooperate but will have an attitude change by next week!  http://arcg.is/2dLxhfn

Monday, September 19, 2016

Module 4 Ground Truthing and Accuracy Assessment

Our goal for this lab was to ground truth our LULC classification map from module 3 utilizing Google Maps' earth view, with its 3D pan/tilt/zoom features, as well as the Street View feature, which has much higher resolution than our original imagery.  I've included my overall approach to the project and my final map:

NOTES:
1.      Because of my previous work as a land surveyor and the need to use aerial imagery, AutoCAD, GIS and Google Earth/Google Maps, I saved some Google Maps searching time.
2.      Because I had made sure that my map projection and truthing shapefile were in the NAD_1983_HARN_StatePlane_Mississippi_East_FIPS_2301_Feet projected coordinate system, the points I created were also in this system.
3.      By creating all the points and then using the identify feature set to display in degrees, minutes and seconds, I was able to paste this information into the Google Maps search field (see the conversion sketch after this list).
4.      This took me directly to where my point was located, and I could use the earth view in conjunction with the 3D tilt/pan and zoom options to quickly identify the classification of my point.
5.      If it matched, I entered yes in the True_YN column during my edit session.
6.      If it did not match, I entered no in the True_YN column and entered a more accurate classification in the New_Code column.
7.      After completing my ground truthing, I saved my edits, updated my legend and added an accuracy note with explanation to the final map.
8.      The majority of my initial mistakes from module 3 were in determining the wetland classification.  I knew that wetlands should be comprised of either some sort of marsh grass or trees/mangroves, but this was very difficult to discern in the original imagery.  The Google Earth view was more helpful, but it was still difficult to discern as much of the area looked like mud flats.

9.      I did a bit of additional research for that region to determine what the local forestry association described as the flora for these tidal wetlands.  The majority seemed to be varying types of marsh grass.  As that was not one of our specific provided codes, I used non-forested wetland.  I think in this case, using a level 3 or 4 code would be more beneficial if actual detailed wetland information were needed.  I feel there may be some areas which are mud flats and oyster beds visible during low tide, but since I did not know during which tide cycle the imagery was obtained I could not speculate as to the validity of this conclusion.
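Note 3 above mentions converting coordinates to degrees-minutes-seconds before pasting them into the Google Maps search box.  A small Python sketch of that conversion (purely illustrative; the lab used the Identify tool's built-in DMS display) is:

def dd_to_dms(dd, positive, negative):
    """Convert a decimal-degree value to a degrees-minutes-seconds string."""
    hemisphere = positive if dd >= 0 else negative
    dd = abs(dd)
    degrees = int(dd)
    minutes = int((dd - degrees) * 60)
    seconds = (dd - degrees - minutes / 60.0) * 3600
    return "%d°%d'%.2f\"%s" % (degrees, minutes, seconds, hemisphere)

# Example point near the Mississippi coast study area (hypothetical coordinates)
print(dd_to_dms(30.3674, "N", "S"), dd_to_dms(-88.5561, "E", "W"))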


Saturday, September 17, 2016

Project 1 Network Analyst - Downtown Tampa Bay evacuation zones, shelters and routes

Scenario Three – Downtown Tampa Evacuation

As the downtown area of Tampa Bay is heavily populated, it is important to create routes that will aid the general public in evacuating to the designated shelter for their zone as quickly as possible.  The downtown area is divided into three zones, each with one shelter.  In order to provide the general public with the information as rapidly as possible, the maps were produced for use in newspaper and television.  Again, ArcMap was used to create the network and transportation routes.  The goal of this map was to illustrate routes for the 15 evacuation points within downtown Tampa Bay.  The routes were color coded and grouped in a logical flow.  While turn-by-turn directions were not provided, the major streets were labeled.  Ultimately the routes merged toward two main roads, I-275 and Nebraska Avenue.  The maps included staggered evacuation times for each zone so that road congestion would be reduced.  Additional safety warnings were included on the map.  Because this map was to be used for newspaper and television, final preparations were made using Adobe Illustrator, as this allows for more aesthetically pleasing graphic representations.  As with the other scenarios, the main drawback with the maps and drive times is that they are based upon typical conditions, not pre-hurricane weather conditions.  Drive times could be longer, and there is the potential for the designated route to require alteration based upon existing road conditions.  In the event high winds or heavy rains have caused downed power lines, downed trees or flooding, actual routes may need to be altered “on-the-fly” by emergency personnel.



 

Scenario Four – Downtown Tampa Bay Evacuation Zones and Shelter locations

An information map was created for use by both television and newspaper to depict the Downtown Tampa Bay evacuation zones and their associated shelter locations.  This map, designed for the general public, was quite generalized in nature so that one could quickly identify the zone in which they resided as well as the shelter for that zone.  The zones were color coded for simple identification, and the shelters were both symbolized and labeled.  As this was not a directional map, only major roads and water features were labeled.  The map was created using ArcMap and then finalized in Adobe Illustrator for simple, aesthetically pleasing results.  Text insets provided the shelter addresses, using text color-coded to match the evacuation zone colors.  Additional information was provided with regard to standard emergency procedures and safety warnings.  The latest available radar image was included on the map for reference purposes.  This map, combined with the evacuation route map, was readily accessible to the general public prior to any required evacuation so that families could develop a rapid response plan should an evacuation be mandated.

Sunday, September 11, 2016

Module 3 Land Use / Land Cover Classification

Our goal this week was to recognize elements of land use and land cover (LULC) classifications, learn to identify these various types of features on aerial photography and create a LULC map.

Below is my LULC map and a description of the features within the Codes:

Code | Code_Descr | Features
11 | Residential | Area containing single- and multi-family dwellings
12 | Commercial and Services | Any retail or fast food
13 | Industrial | Areas which appeared to be industrial sites
14 | Utilities | Water tower
15 | Industrial and Commercial Complexes | Areas of both industrial and commercial uses
16 | Mixed Urban or Built-up Land | Land which was developed but for which a specific use could not be determined
24 | Other Agricultural Land | Land which appeared to have been planted but was currently fallow
43 | Mixed Forest Land | Areas with trees
51 | Streams and Canals | Water features, mainly the river and tributaries
52 | Lake | Water body
61 | Forested Wetland | Wetlands
62 | Non-forested Wetland | Wetlands
73 | Sandy Areas other than Beaches | Areas which appeared to be sand but in a location not likely to be a beach
121 | Cemetery | Headstones, road network, and manicured grass
122 | Education | Schools
43/33 | Mixed Forest Land/Mixed Rangeland | Areas with trees and grassland
61/62 | Forested/Non-forested Wetland | Wetlands
62/61 | Non-forested/Forested Wetland | Wetlands

Friday, September 9, 2016

Project 1 Analyze Week, creating a network dataset for use with Network Analyst Tools to create routes, service areas and closest facility evaluations

Using the information we created in prepare week, we had several scenarios to evaluate, and our initial step was to set up a network dataset for use with the Network Analyst tools.  Our focus this week was creating a map illustrating the evacuation of patients from Tampa General Hospital on Davis Island, supply routes from the local armory to three storm shelters, multiple evacuation routes for downtown Tampa, and the identification of shelter locations.
      The network dataset uses our street layer containing the additional fields for Seconds, Miles and Flooded which we set up last week.  Within the network dataset we established attributes for Seconds, FloodPublic and FloodEmer.  The Seconds attribute has a usage of cost with seconds as the units and allows us to determine fast, simple routes for evacuation; the FloodPublic attribute is a restriction with usage set to Prohibited, which calculates routes that avoid flooded streets closed to the general public; the FloodEmer attribute is a restriction with usage set to Avoid High, which calculates emergency vehicle routes that may need to use flooded streets but will avoid them whenever possible.  It is necessary to set evaluators for the above attributes in the network dataset: for the Seconds attribute, the type was set to Field and the value to Seconds; for FloodPublic, the type was set to Field and the value to Flooded; for FloodEmer, the type was set to Field and the value to Flooded.
      With the completion of our network dataset, we began our scenario analysis and mapping.  Each scenario used the Network Analyst toolbar to create a New Route with a logical name matching its scenario.  Tampa General Hospital needed to evacuate patients to Memorial Hospital.  By selecting specific items in the hospital attribute table, in this case the two hospitals in question, we could load only the selected items into our tool as stops.  The starting hospital (Tampa General) is the first stop and the destination hospital (Memorial) is the second stop.  Because we wanted fast, direct routes, we set our impedance to Seconds and assigned Oneway as the only restriction.  The Solve feature was then used to create the optimum route based upon our criteria.  Because of the potential number of patients to evacuate, we performed the above steps again but used St. Josephs as the destination hospital.  This allows for better patient accommodation and also allows patients to be evacuated to the hospital best suited to their particular health needs.
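These route scenarios can also be solved programmatically with the arcpy.na module.  Below is a hedged sketch of the hospital evacuation route; the network dataset path and layer names are hypothetical, and the lab itself used the Network Analyst toolbar rather than a script.

import arcpy

arcpy.CheckOutExtension("Network")

nd = r"C:\Data\Tampa\Transportation\Streets_ND"  # hypothetical network dataset

# Fast, direct route: impedance in seconds, one-way streets respected.
arcpy.na.MakeRouteLayer(nd, "HospitalEvac", "Seconds",
                        restriction_attribute_name=["Oneway"])

# Load the two selected hospitals as stops (Tampa General first, Memorial second).
arcpy.na.AddLocations("HospitalEvac", "Stops", "selected_hospitals")
arcpy.na.Solve("HospitalEvac")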
      Our next task was to determine how best to deliver supplies from the U.S. Army National Guard Armory to local storm shelters.  Because we had three shelters, we needed to create three routes beginning at the armory and ending at each shelter.  In this case, we needed to apply both our FloodEmer restriction and the Oneway restriction.  Each route was solved and displayed on the map.
      Scenario three required establishing multiple evacuation routes for the general public living within the Tampa area.  The closest shelter for this specific area is the Middleton High School shelter, which was used as the destination for each of the calculated routes.  As this is a densely populated area, it was important to account for drive time based upon flooding conditions multiplied by our impedance attribute (known as a scaled cost).  Before proceeding we needed to assign scaled cost values to our DEM polygon layer.  The ScaledCost short integer attribute field values were based on grid_code: grid_code 0-3 = 3, 4-6 = 2, 7-20 = 1, and 9999 = Null (Null values do not affect the routing process, and in this case the value represented water bodies).  By using these values as the multiplicative factor, drive times are increased based upon the level of flooding.  With the DEM attribute table updated, we were ready to return to our Network Analyst tool.  Since these residents were all going to the same shelter, we chose to create a New Closest Facility route, in this case to the Middleton High School shelter.  The DEM polygon was assigned as the polygon barrier.  In the Load Locations dialog we set the property to Attr_Seconds and the field to ScaledCost, so that our Seconds impedance would be multiplied by the scaled cost.  The BarrierType default was also set to scaled cost.  Our next task was to load all of our TampaEvac points as incidents and the shelter as the facility.  After confirming that the Closest Facility layer properties had the impedance set to Seconds, the Oneway restriction on, and Travel From set to Incident to Facility, we were ready to use the Solve feature to create routes from each evacuation point.
      Because other areas had shelters, scenario four created service areas and related them to their closest shelter.  This makes it easy for residents to quickly identify their designated shelter based upon the neighborhood in which they reside.  This analysis utilizes the Network Analyst New Service Area feature.  The shelters were added as facilities.  Within the service area properties, we confirmed our impedance was set to Seconds and the Default Breaks value to 10000; this value represents drive time in seconds and is more than enough time to reach any location within the study area, preventing areas from being excluded from the analysis because the allowed time was too short.  The Analysis Settings tab Direction was set to Toward Facility.  Because we did not want the resulting polygons representing the service areas to overlap, we set the Polygon Generation tab's multiple facilities option to Not Overlapping.  We were then able to solve and produce three defined service areas, one for each shelter.
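The service area scenario has a similarly direct arcpy.na equivalent.  A minimal sketch follows, with the same hypothetical network dataset as the route sketch above; the settings mirror the dialog options described in this paragraph.

import arcpy

arcpy.CheckOutExtension("Network")

nd = r"C:\Data\Tampa\Transportation\Streets_ND"  # hypothetical network dataset

# 10000-second break, travel toward the facility, non-overlapping polygons.
arcpy.na.MakeServiceAreaLayer(nd, "ShelterAreas", "Seconds", "TRAVEL_TO", "10000",
                              merge="NO_OVERLAP")

arcpy.na.AddLocations("ShelterAreas", "Facilities", "shelters")
arcpy.na.Solve("ShelterAreas")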

      With our analysis scenarios performed, we were able to modify our resulting layers to visually differentiate between route types and between individual routes within a type category.  For additional clarification, I chose to add labels to the symbols for hospitals and shelters.  I also created a DEM_Poly_9999_RH layer to clearly demarcate all water features.  Because the police layer contained both police stations and administrative facilities, I chose to produce an additional police admin layer (fdem_policedept_aug07_CPrj_Rh_admin) so that one could visually distinguish administrative facilities from actual police stations.


Monday, September 5, 2016

Module 2 - Basics of Aerial Photography and Visual Interpretation of Aerial Photographs

This module was designed to provide basic skills in recognizing tone (brightness/darkness) and texture (smooth/rough) in areas and features; identifying features by their shape/size, shadow, pattern and association; and, finally, comparing a true color image to a false color image.  We used similar techniques for identifying areas (via polygons) and locations (via symbols) and converting them to .shp files so that they could be edited and labels added based upon information input into the attribute table.  Using similar marking techniques, we learned to recognize how feature color changes depending on whether you are using a true color or false color image.

The first two exercises resulted in the following maps:


Thursday, September 1, 2016

Project 1: Prepare Week, Network Analyst

     The focus of Project 1 is to use ArcGIS Network Analyst for a hurricane evacuation scenario.  The first week provided an overview of the project.  The lab concentrated on data preparation for the network analysis, which included creating a basemap containing MEDS (in this case armories, hospitals, police stations, fire stations, shelters and roads), establishing the potential flood zones using an assigned criterion of land less than 6’ above mean sea level, and editing metadata as required.  The lab involved clipping and projecting both vector and raster data.  Additionally, since the raster needed to be part of our analysis, it was necessary to reclassify it and convert it to a polygon dataset.  We added mileage, time and flood categories to the street attribute table in order to prepare ourselves for the Network Analyst tool next week.  As always, organizing our existing and created data is important, and to that end we created a new file geodatabase to house our information; additionally, since the Network Analyst tools require a network dataset, we created a feature dataset to house our data for use in next week’s analysis.
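A hedged arcpy sketch of the flood zone piece of that preparation is shown below; the workspace and raster names are hypothetical, the DEM is assumed to already be in feet, and 6 ft is the assigned elevation criterion from the lab.

import arcpy
from arcpy.sa import Raster, Con

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Data\TampaPrepare"  # hypothetical workspace

dem = Raster("tampa_dem")  # hypothetical DEM, elevation assumed to be in feet

# Cells at or below 6 ft above mean sea level are flagged as potential flood zones.
flood = Con(dem <= 6, 1)
arcpy.RasterToPolygon_conversion(flood, "flood_zones.shp", "NO_SIMPLIFY")

arcpy.CheckInExtension("Spatial")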

Here is my basemap for our Tampa Bay Hurricane Evacuation Project:

Tuesday, August 2, 2016

Final Project: Location Analysis, City of Port Saint Lucie, FL

Our final project was a location analysis designed to pull together the many new facets of GIS which we learned this semester.  For this project I used the following scenario:  Dr. Foramen has recently joined the Orthopedic and Spine Institute at St. Lucie Medical Center as Head of Surgery.  She and her son, Brad, will be relocating to the City of Port Saint Lucie.  Since Brad is in his senior year of high school, Dr. Foramen wants him to graduate before they move to Florida.  As she is currently working a full caseload at the Emory University Orthopaedics [sic] and Spine Hospital in Tucker, Georgia, Dr. Foramen does not have time to search for their new home location; compounding this problem, Brad graduates in May and her new position as Head of Surgery begins in July.  She has hired my firm, RealGIS®, to evaluate potential homes based upon six criteria: proximity to St. Lucie Medical Center, proximity to Indian River State College, a neighborhood with more owners than renters, a neighborhood with a higher percentage of 40-49 year olds, lying within the city limits, and an improved land value of no less than $400,000.  In order to produce professional results, the following objectives were established:
Objective 1: Calculate Euclidean Distance for the criteria specified by Dr. Foramen
a.      Location close to St. Lucie Medical Center
b.      Location close to Indian River State College
c.       Location within the city limits of Port Saint Lucie
Calculate the following items for the criteria specified by Dr. Foramen
d.      A neighborhood with a higher percentage of people 40-49 years of age
e.      A neighborhood with a higher percentage of owners than renters
f.        Improved values no lower than $400,000
Objective 2: Conduct a weighted analysis of the above criteria considering various priority levels.
Objective 3:  Identify potential locations based on Dr. Foramen’s criteria and the analysis results.
Objective 4:  Present a report and PowerPoint presentation summarizing the methods, analysis and recommended locations for their new home.
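As a rough illustration of Objectives 1 and 2, a hedged arcpy sketch is shown below.  The file names, the already-reclassified criteria rasters and the weights are all hypothetical; the actual analysis was run through the Spatial Analyst tools in ArcMap.

import arcpy
from arcpy.sa import EucDistance, WeightedSum, WSTable

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\Data\FinalProject"  # hypothetical workspace

# Objective 1: Euclidean distance surfaces for the proximity criteria.
EucDistance("st_lucie_medical_center.shp").save("dist_medical")
EucDistance("indian_river_state_college.shp").save("dist_college")

# Objective 2: weighted analysis of the (already reclassified) criteria rasters.
suitability = WeightedSum(WSTable([["dist_medical_rc", "VALUE", 0.25],
                                   ["dist_college_rc", "VALUE", 0.15],
                                   ["owners_rc",       "VALUE", 0.20],
                                   ["age4049_rc",      "VALUE", 0.20],
                                   ["impr_value_rc",   "VALUE", 0.20]]))
suitability.save("suitability")

arcpy.CheckInExtension("Spatial")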
Included within the PowerPoint presentation were explanations and maps of the entire process as well as the analysis results.  I chose to produce a basemap of St. Lucie County, a map with six data frames to address items a-f above, and a map depicting two weighted analysis data frames plus two larger neighborhood/parcel-level maps which I had determined to be good choices for my client.
            I am satisfied with the overall results of my project.  I feel the information would help my client in determining where they would like to live.  Because of the wide variety of data obtained, widening or narrowing the search would be a relatively easy task.  I would probably have lowered the base improved land value criterion to $280,000-$300,000, as home values have not recovered quite as much as I had expected since the economic downturn.  The higher minimum of $400,000 made obtaining results a bit trickier than expected, but not impossible.  Including a photo array of the neighborhoods, the work and school campuses, and possibly some traffic statistics would be a nice supplement to such a project.  The following link may be used to access my final presentation:

Friday, July 29, 2016

Module 11 - Sharing Tools

Our module this week took sharing scripts to the next level.  We learned to take an existing script and place it in a toolbox.  Once it was in our toolbox, we could add various parameters using sys.argv[ ], avoiding hard-coded paths.  While providing comments in a script is very useful, for someone unfamiliar with Python, reading the script would still be difficult.  The script tool uses a GUI interface; this GUI is not automatically populated with descriptions of the various parameters.  Last week we learned where to place the description of the tool in the tool properties.  This week we learned how to use the Item Description tab in ArcCatalog to edit the syntax section for each of the parameters.  Adding information here provides built-in help and guidance for the end user so that they know what type of information is expected for each parameter.  This allows them to execute the tool without a working knowledge of scripting.  Another advantage to using a script tool is that the script can be embedded into the toolbox.  This allows for simpler sharing, since only the toolbox needs to be shared, not the underlying script related to the tool.  In addition, once the script is embedded, a password may be assigned.  The tool functionality is not affected by the password, but viewing and exporting the actual script is prohibited unless you know the password.  This prevents script piracy and ensures your hard work remains your hard work: others cannot take credit for it or modify its contents.  The combined screenshots below illustrate the GUI interface for the script tool and the results of utilizing said tool within ArcMap.
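To make the parameter idea concrete, here is a hedged sketch of how a script tool typically reads its dialog inputs; the parameter order and names are illustrative, and either sys.argv or arcpy.GetParameterAsText can be used.

import arcpy

# Parameters defined on the script tool's Parameters tab, in order.
input_features = arcpy.GetParameterAsText(0)  # or sys.argv[1]
clip_boundary = arcpy.GetParameterAsText(1)   # or sys.argv[2]
output_folder = arcpy.GetParameterAsText(2)   # or sys.argv[3]

# Messages written this way appear in the tool's progress dialog.
arcpy.AddMessage("Clipping %s with %s into %s"
                 % (input_features, clip_boundary, output_folder))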




As this is our final blog post for the semester, we were asked to summarize the most memorable, interesting or useful things learned about Python with ArcGIS.  Honestly, the most memorable and useful thing from this course has been overcoming my fear of scripting.  After the first couple of assignments, I no longer dreaded the thought of tackling the next one.  This was huge for me and will greatly help not only future GIS work but also work involving other programming languages, as I will not have the mental block I once had.  From the course overview perspective, I think the raster module will prove particularly useful and time saving in the future.  Additionally, the ability to set up scripts for repetitive tasks and have them use workspace parameters rather than hard-coded locations is huge.  Combine that with the ability to share these time saving tasks with coworkers and you look like the department rock star.

Monday, July 25, 2016

Module 10 Creating Custom Tools

This week we focused on the use of custom tools to both automate workflow and share time saving processes with others.  Our goal was to create a toolbox to house a script which would perform a clip on multiple objects.

Here are the notes from my process summary about how to create a script tool, plus a valuable tip I learned the hard way!
This list assumes you have a functional stand-alone script named SAScript.py
1.       In either ArcCatalog or ArcMap, navigate to the location where you want to create the script tool.
2.       Right click on the folder intended to house your script tool and select New, Toolbox.  Name your toolbox; in this case, SAScript.tbx.
3.       Right click SAScript.tbx and choose Add Script.
4.       In the dialog box, name your script (in this case SAScript); repeat the name in the label field and add a description of what your script does in the description field.  Toggle Store relative path names on before choosing Next.
5.       Browse for your script file, choose Open and then choose Next.
6.       At this time you can skip adding parameters and just choose Finish.
7.       Run the tool in either ArcCatalog or ArcMap; it will say there are no parameters, but it will run and either work or return errors if your script has an issue.

8.       After checking this, right click on your script and choose Properties.  You may now enter your parameters, which will depend on the type of script you created.

TIP:  Don't overthink or apply "common sense" to the parameter properties.  The Direction property for your output file location parameter should say Input!  If you change it to Output, as I did, the tool cannot write to an output file.  This will cause great consternation if you don't catch your mistake!
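For completeness, here is a hedged sketch of what the underlying multi-clip script behind SAScript.tbx might look like once the parameters from step 8 are wired up; the parameter order and folder contents are hypothetical.

import os
import sys
import arcpy

# Parameters supplied by the script tool dialog, in order.
input_folder = sys.argv[1]   # folder holding the shapefiles to clip
clip_feature = sys.argv[2]   # boundary feature class used for the clip
output_folder = sys.argv[3]  # remember: the parameter Direction stays "Input"!

arcpy.env.workspace = input_folder

# Clip every shapefile in the input folder to the boundary.
for fc in arcpy.ListFeatureClasses():
    out_fc = os.path.join(output_folder, fc[:-4] + "_clip.shp")
    arcpy.Clip_analysis(fc, clip_feature, out_fc)
    arcpy.AddMessage("Clipped " + fc)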



Tool dialog, Tool results and flow chart for creating the script tool:






Tuesday, July 19, 2016

Module 9 Working with Rasters

This module was all about working with rasters and the arcpy.sa module and its functions.  We learned to list and describe rasters, use them in geoprocessing, incorporate map algebra into the geoprocessing, and use classes to define raster tool parameters.  Below you will find the area in which I struggled with the lab, my flowchart, and my final raster result showing two classes, 0 and 1.
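A hedged sketch tying those pieces together (the license check, a raster object, and a simple map algebra condition that yields 0/1 classes) is shown below; the workspace, raster name and thresholds are hypothetical, not the lab's exact values.

import arcpy
from arcpy.sa import Raster, Slope, Con

arcpy.env.workspace = r"C:\Data\Module9"  # hypothetical workspace

if arcpy.CheckExtension("Spatial") == "Available":
    arcpy.CheckOutExtension("Spatial")

    elev = Raster("elevation")            # hypothetical elevation raster
    slope = Slope(elev, "DEGREE")

    # 1 where both conditions hold, 0 elsewhere (hypothetical thresholds).
    suitable = Con((slope < 20) & (elev > 100), 1, 0)
    suitable.save("suitable_01")

    arcpy.CheckInExtension("Spatial")
else:
    print("The Spatial Analyst license is not available")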

Which step did you have the most difficulty with? Describe 1) the problem you were having, and 2) the solution or correct steps to fix it. Be detailed with your explanation.
1.       I had the most trouble with checking out the Spatial Analyst extension.  Despite my code appearing accurate and the SA extension being available, my if/else statement returned “The Spatial Analyst license is not available” from the else branch.
2.       I carefully checked my code and the indentions.  All appeared correct.
3.       I confirmed within ArcMap that the extension was in fact toggled on, which it was.
4.       I closed ArcMap and ArcCatalog and checked the task manager to make sure nothing was open which would be using the license.
5.       I closed and reopened pythonwin, ran the script and received the same error.
6.       Online research indicated that this is a glitch and can often be resolved by rebooting.  I performed a reboot, but my script still returned the same message.
7.       Since I work locally, I tried doing the same steps in PythonWin on the Citrix server with the same results.
8.       I tried another reboot and when I still had no success, I rewrote the entire section of the script.  Success!