Monday, September 26, 2016

Project 2 MTR Analyze Week: using ERDAS for unsupervised image classification and ArcMap for final touches. I feel like I actually hiked up a mountain after this lab...

This week we were introduced to SkyTruth, a technologically savvy non-profit organization that promotes environmental awareness through remote sensing and digital mapping technology.  I highly recommend checking them out!  We practiced some exploration within ERDAS and ArcMap, identifying MTR areas on imagery using a supervised classification method.  For the main part of our Analyze lab, we used our individual Landsat section, which consisted of 7 rasters (bands 1-7), and performed an unsupervised classification in ERDAS.  With untrained eyes it is very easy to misinterpret areas, but I applied the patterns I had seen in the Explore image and did my best.  The unsupervised classification produced 50 total classes, which I then labeled as either MTR or non-MTR.  We then used ArcMap to reclassify the image to show only the MTR regions.  I created a quick exhibit to illustrate my findings in conjunction with the work we had completed in Prepare Week.
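For anyone who would rather script the ArcMap reclassification step, here is a minimal arcpy sketch of how the 50-class unsupervised output could be collapsed to an MTR-only raster. The file names and the list of class values treated as MTR are placeholders, not the actual values from my lab, and the "NODATA" output assignment follows my reading of the Reclassify documentation.

```python
import arcpy
from arcpy.sa import Reclassify, RemapValue

arcpy.CheckOutExtension("Spatial")

classified = r"C:\MTR\landsat_unsup50.img"   # hypothetical: 50-class unsupervised output
mtr_classes = [3, 7, 12, 28, 41]             # hypothetical: class values interpreted as MTR

# Build a remap that sends MTR classes to 1 and all other classes to NoData
remap = RemapValue(
    [[v, 1] for v in mtr_classes] +
    [[v, "NODATA"] for v in range(1, 51) if v not in mtr_classes]
)

mtr_only = Reclassify(classified, "Value", remap, "NODATA")
mtr_only.save(r"C:\MTR\mtr_only.img")
```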


Saturday, September 24, 2016

Module 5a Intro to ERDAS Imagine and Digital Data - Producing a class map within ArcMap

This week we learned the basics of the clearly powerful ERDAS Imagine software.  Our ultimate goal was to export a portion of a classified .tif of an area within Washington State to an image file for use in ArcMap.  I chose a portion of Jefferson County in Washington State that showed several different classes within a small area.  This image was brought into ArcMap, and symbology was used to add the area of each class in hectares so that it could be both a permanent part of the map and depicted in the legend.
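For reference, the per-class hectare values shown in the legend can also be derived with a short arcpy script from the classified raster's attribute table (cell count times cell area). This is only a sketch under assumptions: the file name is a placeholder, the raster is assumed to be in a meter-based projection, and it assumes the raster's Value/Count attribute table can be read directly by a cursor.

```python
import arcpy

classified = r"C:\WA\jefferson_classes.img"   # hypothetical classified raster

ras = arcpy.Raster(classified)
cell_area_m2 = ras.meanCellWidth * ras.meanCellHeight  # assumes a projected CRS in meters

# Sum cell counts per class and convert to hectares (1 ha = 10,000 m^2).
# If the cursor cannot read the raster attribute table directly, export it to a table first.
with arcpy.da.SearchCursor(classified, ["Value", "Count"]) as cursor:
    for value, count in cursor:
        hectares = count * cell_area_m2 / 10000.0
        print("Class {}: {:.1f} ha".format(value, hectares))
```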


Friday, September 23, 2016

MTR Prepare Week: not exactly the way I like to move mountains

Our Prepare Week map used our skills to take four DEMs, mosaic them, and then create a hydrology dataset.  From this dataset we ultimately produced a stream shapefile (polyline) and a basin shapefile (polygon).  We added the site boundary, as this project (due to its size) has been divided among groups.  As Group Three, our focus for next week will be portions of the Appalachians in Kentucky and Ohio.  For this map I illustrated the entire site area, and to give it some location perspective I obtained U.S. Census data to show the counties and states that fall within the site area.  The streams and basins were identifiable in the legend, and the basin color-coding helped our Group Three area stand out from the rest of the project area.
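The mosaic-to-streams-and-basins chain can also be run with the Spatial Analyst tools in arcpy. This is only a sketch of my understanding of the workflow, not the exact lab steps: the workspace, tile names, 32-bit float pixel type, and the 1,000-cell flow accumulation threshold are all assumptions.

```python
import arcpy
from arcpy.sa import Fill, FlowDirection, FlowAccumulation, Con, StreamToFeature, Basin

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\MTR\prepare"                      # hypothetical workspace

# 1. Mosaic the four DEM tiles into a single surface
dems = ["dem1.tif", "dem2.tif", "dem3.tif", "dem4.tif"]      # hypothetical tile names
arcpy.MosaicToNewRaster_management(dems, arcpy.env.workspace,
                                   "dem_mosaic.tif", "", "32_BIT_FLOAT", "", 1)

# 2. Standard hydrology chain: fill sinks, flow direction, flow accumulation
filled = Fill("dem_mosaic.tif")
flow_dir = FlowDirection(filled)
flow_acc = FlowAccumulation(flow_dir)

# 3. Keep cells above an accumulation threshold as the stream network (threshold is a guess)
streams = Con(flow_acc > 1000, 1)
StreamToFeature(streams, flow_dir, "streams.shp", "SIMPLIFY")

# 4. Delineate basins from flow direction and convert to the basin polygon shapefile
basins = Basin(flow_dir)
arcpy.RasterToPolygon_conversion(basins, "basins.shp", "SIMPLIFY", "Value")
```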


We also created a story map for the six basic steps of mountaintop removal: http://arcg.is/2dLw3kn

Our final step was to begin our MTR Story Map Journal, which is a work in progress to be finalized during Report Week.  It consists of placeholders and some image links that would not cooperate but will have an attitude change by next week!  http://arcg.is/2dLxhfn

Monday, September 19, 2016

Module 4 Ground Truthing and Accuracy Assessment

Our goal for this lab was to ground truth our LULC classification map from Module 3 using Google Maps' Earth view, with its 3D pan/tilt/zoom features, as well as Street View, both of which have much higher resolution than our original imagery.  I've included my overall approach to the project and my final map:

NOTES:
1.      Because of my previous work as a land surveyor, which required using aerial imagery, AutoCAD, GIS, and Google Earth/Google Maps, I saved some Google Maps searching time.
2.      Because I had made sure that my map projection and truthing shapefile were in the NAD_1983_HARN_StatePlane_Mississippi_East_FIPS_2301_Feet projected coordinate system, the points I created were also in this system.
3.      By creating all the points and then using the Identify tool set to display in degrees, minutes, and seconds, I was able to paste this information into the Google Maps search field.
4.      This took me directly to where my point was located, and I could use the Earth view in conjunction with the 3D tilt/pan/zoom options to quickly identify the classification of my point.
5.      If it matched, I entered yes in the True_YN column during my edit session.
6.      If it did not match, I entered no in the True_YN column and entered a more accurate classification in the New_Code column.
7.      After completing my ground truthing, I saved my edits, updated my legend, and added an accuracy note with an explanation to the final map (a quick way to tally that accuracy is sketched after these notes).
8.      The majority of my initial mistakes from Module 3 were in determining the wetland classification.  I knew that these areas should consist of either some sort of marsh grass or trees/mangroves, but it was very difficult to discern in the original imagery.  The Google Earth view was more helpful, but it was still difficult to tell, as much of the area looked like mud flats.

9.      I did a bit of additional research for that region to determine what the local forestry association described as the flora for these tidal wetlands.  The majority seemed to be varying types of marsh grass.  As that was not one of our specific provided codes, I used non-forested wetland.  I think in this case a Level 3 or 4 code would be more beneficial if detailed wetland information were actually needed.  I feel there may be some areas that are mud flats and oyster beds visible during low tide, but since I did not know during what tide stage the imagery was obtained, I could not speculate as to the validity of this conclusion.
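Here is the quick tally I had in mind for the accuracy note mentioned in step 7: a small arcpy sketch that counts the True_YN values in the ground-truth point shapefile. The shapefile path and the exact yes/no coding of the field are assumptions.

```python
import arcpy

truth_points = r"C:\LULC\ground_truth_points.shp"   # hypothetical path to the truthing shapefile

total = correct = 0
with arcpy.da.SearchCursor(truth_points, ["True_YN"]) as cursor:
    for (flag,) in cursor:
        total += 1
        if str(flag).strip().lower() in ("yes", "y", "true", "1"):   # assumed codings for "true"
            correct += 1

print("Ground-truth accuracy: {0} of {1} points correct ({2:.1f}%)".format(
    correct, total, 100.0 * correct / total))
```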


Saturday, September 17, 2016

Project 1 Network Analyst - Downtown Tampa Bay evacuation zones, shelters and routes

Scenario Three – Downtown Tampa Evacuation

As the downtown area of Tampa is heavily populated, it is important to create routes that will aid the general public in evacuating to the designated shelter for their zone as quickly as possible.  The downtown area is divided into three zones, each with one shelter.  In order to provide the general public with the information as rapidly as possible, the maps were produced for use in newspaper and television.  Again, ArcMap was used to create the network and the transportation routes.  The goal of this map was to illustrate routes for the 15 evacuation points within downtown Tampa.  The routes were color-coded and grouped in a logical flow.  While turn-by-turn directions were not provided, the major streets were labeled.  Ultimately the routes merged toward two main roads, I-275 and Nebraska Avenue.  The maps included staggered evacuation times for each zone to reduce road congestion, and additional safety warnings were included on the map.  Because this map was to be used for newspaper and television, final preparations were made in Adobe Illustrator, which allows for more aesthetically pleasing graphics.

As with the other scenarios, the main drawback of the maps and drive times is that they are based on typical conditions, not pre-hurricane weather conditions.  Drive times could be longer, and the designated routes may require alteration based upon existing road conditions.  In the event high winds or heavy rains have caused downed power lines, downed trees, or flooding, actual routes may need to be altered “on-the-fly” by emergency personnel.


Scenario Four – Downtown Tampa Bay Evacuation Zones and Shelter locations

An information map was created for use by both television and newspaper to depict the downtown Tampa Bay evacuation zones and their associated shelter locations.  This map, designed for the general public, was quite generalized in nature so that one could quickly identify the zone in which they resided as well as the shelter for that zone.  The zones were color-coded for simple identification, and the shelters were both symbolized and labeled.  As this was not a directional map, only major roads and water features were labeled.  The map was created in ArcMap and then finalized in Adobe Illustrator for simple, aesthetically pleasing results.  Text insets listed the shelter addresses, with the text color-coded to match the evacuation zone colors.  Additional information was provided regarding standard emergency procedures and safety warnings, and the latest available radar image was included on the map for reference.  This map and the evacuation route map were both readily accessible to the general public prior to any required evacuation so that families could develop a rapid response plan should an evacuation be mandated.

Sunday, September 11, 2016

Module 3 Land Use / Land Cover Classification

Our goal this week was to recognize elements of land use and land cover (LULC) classifications, learn to identify these various types of features on aerial photography and create a LULC map.

Below is my LULC map and a description of the features within each code:

Code   Code_Descr                           Features
11     Residential                          Areas containing single- and multi-family dwellings
12     Commercial and Services              Retail and fast-food establishments
13     Industrial                           Areas that appeared to be industrial sites
14     Utilities                            Water tower
15     Industrial and Commercial Complexes  Areas with both industrial and commercial uses
16     Mixed Urban or Built-up Land         Developed land for which a specific use could not be determined
24     Other Agricultural Land              Land that appeared to have been planted but was currently fallow
43     Mixed Forest Land                    Areas with trees
51     Streams and Canals                   Water features, mainly the river and its tributaries
52     Lake                                 Water body
61     Forested Wetland                     Wetlands
62     Non-forested Wetland                 Wetlands
73     Sandy Areas other than Beaches       Areas that appeared to be sand but were unlikely to be beach
121    Cemetery                             Headstones, road network, and manicured grass
122    Education                            Schools
43/33  Mixed Forest Land/Mixed Rangeland    Areas with both trees and grassland
61/62  Forested/Non-forested Wetland        Wetlands
62/61  Non-forested/Forested Wetland        Wetlands

Friday, September 9, 2016

Project 1 Analyze Week: building a network dataset for use with the Network Analyst tools to create routes, service areas, and closest facility evaluations

Using the information we created in Prepare Week, we had several scenarios to evaluate, and our initial step was to set up a network dataset for use with the Network Analyst tools.  Our focus this week was creating maps illustrating the evacuation of patients from Tampa General Hospital on Davis Island, supply routes from the local armory to three storm shelters, multiple evacuation routes for downtown Tampa, and the identification of shelter locations.
      The network dataset uses our Street layer containing the additional fields for Seconds, Miles, and Flooded, which we set up last week.  Within the network dataset we established attributes for Seconds, FloodPublic, and FloodEmer.  The Seconds attribute is a cost with seconds as its units and allows us to determine fast, simple evacuation routes.  The FloodPublic attribute is a restriction with usage set to Prohibited, which calculates routes that avoid flooded streets closed to the general public.  The FloodEmer attribute is a restriction with usage set to Avoid (High), which calculates emergency vehicle routes that may need to use flooded streets but will avoid them whenever possible.  It is also necessary to set evaluators for these attributes in the network dataset: for Seconds, the evaluator type was set to Field with the value set to the Seconds field; for FloodPublic, the type was set to Field with the value set to Flooded; for FloodEmer, the type was likewise set to Field with the value set to Flooded.
      With the network dataset complete, we began our scenario analysis and mapping.  Each scenario used the Network Analyst toolbar to create a New Route with a logical name matching its scenario.  Tampa General Hospital needed to evacuate patients to Memorial Hospital.  By selecting specific records in the hospital attribute table, in this case the two hospitals in question, we could load only the selected records into the tool as stops.  The starting hospital (Tampa General) is the first stop and the destination hospital (Memorial) is the second stop.  Because we wanted fast, direct routes, we set our impedance to Seconds and assigned Oneway as the only restriction.  The Solve feature was then used to create the optimum route based on our criteria.  Because of the potential number of patients to evacuate, we performed the above steps again using St. Joseph's as the destination hospital.  This allows for better patient accommodation and lets patients be evacuated to the hospital best suited to their particular health needs.
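The same hospital-evacuation route can be scripted with the arcpy.na module. This is only a sketch under my assumptions about the data: the network dataset path, layer name, and hospital file are placeholders, and the parameter names follow the ArcGIS 10.x arcpy.na documentation as I recall it, so they may need checking.

```python
import arcpy

arcpy.CheckOutExtension("Network")

network = r"C:\Tampa\Transportation\Streets_ND"   # hypothetical network dataset
hospitals = r"C:\Tampa\hospitals_selected.shp"    # hypothetical: just the origin and destination hospitals

# Route layer that minimizes the Seconds cost attribute and honors the Oneway restriction
route_lyr = arcpy.na.MakeRouteLayer(network, "TGH_to_Memorial", "Seconds",
                                    restriction_attribute_name=["Oneway"]).getOutput(0)

# Load the two hospitals as stops (origin first, destination second), then solve
arcpy.na.AddLocations(route_lyr, "Stops", hospitals)
arcpy.na.Solve(route_lyr)
```

Repeating the same three calls with St. Joseph's as the second stop, or with the armory and shelters as stops and FloodEmer added to the restrictions, would give the other routes described here.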
      Our next task was to determine how best to deliver supplies from the U.S. Army National Guard Armory to the local storm shelters.  Because we had three shelters, we needed to create three routes beginning at the armory and ending at each shelter.  In this case, we needed to apply both our FloodEmer restriction and the Oneway restriction.  Each route was solved and displayed on the map.
      Scenario three required establishing multiple evacuation routes for the general public living within the Tampa area.  The closest shelter for this specific area is the Middleton High School Shelter, which was used as the destination for each of the calculated routes.  As this is a densely populated area, it was important to account for drive time based upon flooding conditions multiplied by our impedance attribute (known as a scaled cost).  Before proceeding, we needed to assign scaled cost values to our DEM polygon layer.  The ScaledCost short-integer field values were based on grid_code: grid_code 0-3 = 3, 4-6 = 2, 7-20 = 1, and 9999 = Null (Null values do not affect the routing process, and in this case the value represents water bodies).  By using these values as a multiplicative factor, drive times are increased according to the level of flooding.  With the DEM attribute table updated, we were ready to return to the Network Analyst tool.  Since these residents were all going to the same shelter, we chose to create a New Closest Facility analysis, with Middleton High School Shelter as the facility.  The DEM polygon layer was loaded as a Polygon Barrier; in the Load Locations dialog we set the property to Attr_Seconds and the field to ScaledCost so that Seconds would be used as the impedance and multiplied by the scaled cost, and the BarrierType default was also set to Scaled Cost.  We then loaded all of our TampaEvac points as Incidents and the shelter as the Facility.  After confirming that the Closest Facility layer's impedance was set to Seconds, the Oneway restriction was enabled, and Travel From was set to Incident to Facility, we used the Solve feature to create routes from each evacuation point.
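The grid_code-to-ScaledCost recoding is simple enough to script; here is a hedged sketch using Add Field and Calculate Field. The polygon file name is a placeholder, but the multiplier rules are the ones listed above.

```python
import arcpy

dem_poly = r"C:\Tampa\dem_flood_poly.shp"   # hypothetical: reclassified DEM converted to polygons

arcpy.AddField_management(dem_poly, "ScaledCost", "SHORT")

# grid_code 0-3 -> 3, 4-6 -> 2, 7-20 -> 1, 9999 (water bodies) -> Null
code_block = """
def scaled_cost(grid_code):
    if grid_code == 9999:
        return None
    if grid_code <= 3:
        return 3
    if grid_code <= 6:
        return 2
    return 1
"""

arcpy.CalculateField_management(dem_poly, "ScaledCost",
                                "scaled_cost(!grid_code!)", "PYTHON_9.3", code_block)
```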
      Because other areas had shelters, scenario four created service areas and related them to their closest shelter.  This makes it easy for residents to quickly identify their designated shelter based upon the neighborhood in which they reside.  This analysis uses the Network Analyst New Service Area feature, with the shelters added as Facilities.  Within the service area properties, we confirmed that the impedance was set to Seconds and the Default Breaks value to 10000; this value represents drive time in seconds and is more than enough time to reach any location within the study area, preventing areas from being excluded from the analysis because the allowed time was too short.  On the Analysis Settings tab, Direction was set to Toward Facility.  Because we did not want the resulting service-area polygons to overlap, we set the Multiple Facilities Options on the Polygon Generation tab to Not Overlapping.  We were then able to solve and produce three defined service areas, one for each shelter.
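The service-area setup can also be expressed in arcpy.na. As before, this is a sketch under assumptions: the paths are placeholders, and the positional parameters (impedance, travel direction, break value, polygon type, non-overlap option) follow my reading of the Make Service Area Layer documentation rather than the exact lab settings.

```python
import arcpy

arcpy.CheckOutExtension("Network")

network = r"C:\Tampa\Transportation\Streets_ND"   # hypothetical network dataset
shelters = r"C:\Tampa\shelters.shp"               # hypothetical shelter points

# Service areas toward each shelter: Seconds impedance, 10000-second break,
# detailed polygons that are not allowed to overlap
sa_lyr = arcpy.na.MakeServiceAreaLayer(network, "ShelterServiceAreas", "Seconds",
                                       "TRAVEL_TO", "10000",
                                       "DETAILED_POLYS", "NO_OVERLAP").getOutput(0)

arcpy.na.AddLocations(sa_lyr, "Facilities", shelters)
arcpy.na.Solve(sa_lyr)
```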

      With our analysis scenarios complete, we modified the resulting layers to visually differentiate between route types and between individual routes within a type.  For additional clarity, I chose to add labels to the symbols for hospitals and shelters.  I also created a DEM_Poly_9999_RH layer to clearly demarcate all water features.  Because the police layer contained both police stations and administrative facilities, I produced an additional police admin layer (fdem_policedept_aug07_CPrj_Rh_admin) so that one could visually distinguish administrative facilities from actual police stations.


Monday, September 5, 2016

Module 2 - Basics of Aerial Photography and Visual Interpretation of Aerial Photographs

This module was designed to provide basic skills in recognizing tone (brightness/darkness) and texture (smooth/rough) in areas and features, and in identifying features by their shape/size, shadow, pattern, and association; finally, we compared a True Color Image to a False Color Image.  We used similar techniques to identify areas (via polygons) and locations (via symbols) and convert them to .shp files so that they could be edited and labeled based upon information entered into the attribute table.  Using similar marking techniques, we learned to recognize how feature color changes depending on whether you are using a True Color or False Color image.

The first two exercises resulted in the following maps:


Thursday, September 1, 2016

Project 1: Prepare Week, Network Analyst

     The focus of Project 1 is to use ArcGIS Network Analyst for a hurricane evacuation scenario.  The first week focused on an overview of the project.  The lab concentrated on data preparation for the network analysis, which included creating a basemap containing MEDS (in this case armories, hospitals, police stations, fire stations, shelters, and roads), establishing the potential flood zones using the assigned criterion of land less than 6 feet above mean sea level, and editing metadata as required.  The lab involved clipping and projecting both vector and raster data.  Additionally, since the raster needed to be part of our analysis, it was necessary to reclassify it and convert it to a polygon dataset.  We added mileage, time, and flood fields to the Street attribute table to prepare for the Network Analyst tool next week.  As always, organizing our existing and created data is important, and to that end we created a new file geodatabase to house our information; additionally, since the Network Analyst tools require a network dataset, we created a feature dataset to house our data for use in next week’s analysis.
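Two of those preparation steps, flagging the potential flood zone and adding the Street fields, are straightforward to sketch in arcpy. Paths are placeholders, the DEM is assumed to have elevation units in feet, and the 30 mph speed used to turn miles into seconds is an assumption for illustration, not the value assigned in the lab.

```python
import arcpy
from arcpy.sa import Con

arcpy.CheckOutExtension("Spatial")

dem = arcpy.Raster(r"C:\Tampa\tampa_dem")   # hypothetical clipped/projected DEM (feet)
streets = r"C:\Tampa\streets.shp"           # hypothetical street layer

# Flag cells less than 6 feet above mean sea level as potential flood zone, then vectorize
flood = Con(dem < 6, 1)
arcpy.RasterToPolygon_conversion(flood, r"C:\Tampa\flood_zone.shp", "SIMPLIFY", "Value")

# Fields used by next week's network analysis
arcpy.AddField_management(streets, "Miles", "DOUBLE")
arcpy.AddField_management(streets, "Seconds", "DOUBLE")
arcpy.AddField_management(streets, "Flooded", "TEXT", field_length=1)

# Segment length in miles, then travel time in seconds at an assumed 30 mph
arcpy.CalculateField_management(streets, "Miles", "!shape.length@miles!", "PYTHON_9.3")
arcpy.CalculateField_management(streets, "Seconds", "!Miles! / 30.0 * 3600.0", "PYTHON_9.3")
```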

Here is my basemap for our Tampa Bay Hurricane Evacuation Project: