Tuesday, June 28, 2016

Module 7: Working with Spatial Data

This week we explored and manipulated spatial data by checking for, describing, and listing data.  We worked with lists, dictionaries, Search Cursors, Update Cursors, and Insert Cursors, and learned to validate table and field names.  Here is a screenshot of this week’s script results, in which we created a new geodatabase and populated it with information from an existing database.  Once it was created, we listed the feature classes within our new database, then used a Search Cursor and a field delimiter in a loop to obtain names and populations for only the county seats.  This information was then used to update our empty dictionary; after updating, it was printed.

1.      I struggled with steps 6-8.  Initially I thought we were supposed to print the Search Cursor results, which I was finally able to achieve after trying numerous variations of the exercises in the book and lab exercises.  I got past my roadblock by adding a field delimiter and then printing the resulting row.  As I moved on to the next section, I realized we were supposed to develop a blend of steps 5-7 to achieve our results.  What was hanging me up in this portion was the placement of my empty dictionary creation.

2.      Once I placed my empty dictionary creation inside my for loop, I was able to create variables for row[0] and row[1], update my dictionary, and print the dictionary.
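The dictionary-population loop described above can be sketched in plain Python. Since I can't reproduce the lab data here, the cursor rows are simulated with tuples standing in for what an arcpy SearchCursor would return, and the city names and populations are made up for illustration:

```python
# Sketch of the dictionary-population loop: mock rows stand in for an
# arcpy.da.SearchCursor, where row[0] is the name and row[1] is the
# population. Values below are illustrative, not the lab's actual data.
mock_rows = [
    ("Gainesville", 124354),
    ("Ocala", 56315),
    ("Crestview", 20978),
]

def build_population_dict(rows):
    """Populate an empty dictionary with name: population pairs."""
    populations = {}            # the empty dictionary, created before the loop
    for row in rows:            # loop over cursor rows
        name = row[0]
        pop = row[1]
        populations[name] = pop # update the dictionary each iteration
    return populations

print(build_population_dict(mock_rows))
```

Creating the empty dictionary before the loop body is what lets it accumulate one entry per row before the final print.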



Saturday, June 25, 2016

Module 6 Prepare MEDS

This lab was about preparing MEDS for future use in an analysis for Homeland Security.  MEDS, Minimum Essential Data Sets, were developed by a joint effort of the National Geospatial-Intelligence Agency, the USGS, and the Federal Geographic Data Committee through the Homeland Security Infrastructure Program (HSIP).  This data is critical for successful homeland security operations.  There are specific data requirements depending on whether an area is classed as an Urban Area or a Large Area.  With over 3,300 counties and 85,000 municipalities in the United States, obtaining relevant, quality data is quite a challenge.  Levels of geospatial data collection and management vary among these areas and are constantly evolving.  The MEDS criteria help to create essential requirements and organization for the vast amounts of data collected throughout the United States.  The data sets stipulated by the Department of Homeland Security (DHS) are Boundaries, Hydrography, Elevation, Transportation, Land Cover, Orthoimagery, Structures, and Geographic Names.  MEDS data can be used to determine locations for surveillance cameras, where to place road blocks in the event of elevated security needs, areas which may be targets of terrorism, areas of potential mass gatherings requiring additional security measures, and numerous other uses.  This comprehensive geospatial database is critical not only for homeland security, but also for local communities to prevent, prepare for, respond to, and recover from catastrophic events.  Here is our example of a MEDS dataset for:





As I typically do, I outline my steps as I create my map.  This helps me to remember specific steps for future use as well as keep my data consistent and organized.  If I have difficulty with results, my steps are documented for troubleshooting purposes. (The outline formatting does not translate well into the blog, but I think the gist of it is understandable.)

I.                    Module 6 MEDS Prepare

A.                  Preparing MEDS Map

1.                  Review all data and complete metadata table

2.                  Create the .mxd, name the data frame Boston MEDS, set display units to Meters, and import the coordinate system from Boundaries

3.                  Create Group Layers for each of the themes; add layers for Boundaries and Transportation and leave remaining layer groups empty.

B.                  Manipulate Transportation Data

1.                  Review the BMSA_Roads_pm attribute table and study the CFCC codes

2.                  Add the cfcc table from the Boston_Data GDB and open it to view the table

3.                  For BMSA_Roads_pm, join attributes from a table on CFCC; keep all records and create an index if asked

C.                  Export Roads by CFCC Classifications

1.                  Select by Attributes

a)                  cfcc.CFCC >= 'A41' AND cfcc.CFCC <= 'A45'

b)                  Data, Export Data, using the same coordinate system as the feature dataset

c)                  BMSA_Roads_Local_rh, add to map

d)                  Turn off BMSA_Roads_pm

2.                  Adjust Symbology of Local Roads

a)                  Categories/Unique Values/CFCC

b)                  All symbols = transportation, A15 width 1

3.                  Repeat steps for Primary Roads and Secondary Roads

a)                  Primary: cfcc.CFCC >= 'A11' AND cfcc.CFCC <= 'A25', A20 width 1.5; Secondary: cfcc.CFCC >= 'A29' AND cfcc.CFCC <= 'A38', A25 width 1.25

4.                  Show features at Specified Scale Range

a)                  In the General tab for BMSA_Roads_Local_rh, set "don't show layer when zoomed out beyond 1:100,000"

b)                  Secondary roads 1:250,000, primary all scales

5.                  Show labels at specified Ranges

a)                  On the Labels tab for BMSA_Roads_Local_rh, set the scale range to "don't show when zoomed out beyond 1:24,000"

b)                  The label field should be Full_Street_Name, using the label style for North American Streets.  Set character spacing to 6

c)                  Repeat for Primary and Secondary, making Primary 12 and Secondary 10, and change the color for each.

D.                 Add Data to Hydrography Group Layer

1.                  Add the NHDWaterbody, NHDArea, and NHDPoint feature classes

E.                  Edit Land Cover Symbology

1.                  Extract Land Cover Raster by BMSA Boundary Mask

a)                  Add Land Cover raster to land cover group layer

b)                  Spatial analyst tools/Extraction/Extract by Mask

(1)               Input is Landcover; input feature mask is BMSA_Boundary_pm; output raster is BMSA_LC_rh (run in ArcCatalog with ArcMap closed)

2.                  Set dataframe extents to fixed extent BMSA_Boundary_pm

3.                  Change symbology by adding a color map

a)                  On the Symbology tab select Colormap/Import Colormap/NLCD.clr from BostonData.gdb

b)                  Edit labels per lab instructions

F.                   Add Orthoimagery and Elevation Layers

1.                  Add BMSA_Ortho_pm to orthoimagery group and BMSA_DEM_pm to the elevation group layer.

a)                  Change the pm to rh in the layer name

G.                 Add Geographic Names

1.                  Modify the schema.ini file

a)                  Format=Delimited(|)

2.                  Create a Geographic names feature class from xy table

a)                  In ArcCatalog, on MA_Features_20130404.txt, use Create Feature Class From XY Table

b)                  Select Prim_Long_Dec from the x drop down

c)                  Select Prim_Lat_Dec from y drop down

d)                  Set the coordinate system of the input coordinates to GCS NAD 1983 (Geographic Coordinate System North America NAD 1983)

e)                  Save output as MA_GNIS_rh

3.                  Create new group layer Geographic Names in mxd

a)                  Add MA_GNIS_rh

b)                  Data management tools/projections and transformation/feature/project

(1)               Input is MA_GNIS_rh, the output dataset or feature class is MA_GNIS_SPCS_rh, and the output coordinate system is NAD 1983 SP Mass Mainland FIPS 2001

4.                  Select by Attributes

a)                  Select the 6 counties that make up the Boston metropolitan statistical area

b)                  COUNTY_NAME = 'Bristol' OR COUNTY_NAME = 'Essex' OR COUNTY_NAME = 'Middlesex' OR COUNTY_NAME = 'Norfolk' OR COUNTY_NAME = 'Plymouth' OR COUNTY_NAME = 'Suffolk'

c)                  Clear the selection, then use Select By Location with BMSA_Boundary_rh and "completely within".

d)                  Export to BMSA_GNIS_rh using the same coordinate system as the data frame and add to the map

e)                  Move to Geographic Names layer group and change symbology per lab instructions

f)                   Label using the feature name, Arial 8, burnt umber, scale range "don't show when zoomed out beyond 1:24,000".

H.                 MEDS data management


1.                  Save each layer group as a layer file
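The CFCC range selections in section C can also be written as a small Python helper that builds the Select By Attributes where clauses; the joined-field syntax `cfcc.CFCC` follows the expressions used in the outline above, and the helper function itself is just a sketch:

```python
# Build the Select By Attributes where clauses for each road class from
# the CFCC code ranges used in section C. The joined-field name
# "cfcc.CFCC" matches the query syntax shown in the outline above.
def cfcc_range_clause(low, high, field="cfcc.CFCC"):
    """Return a where clause selecting CFCC codes between low and high, inclusive."""
    return "{0} >= '{1}' AND {0} <= '{2}'".format(field, low, high)

road_classes = {
    "Primary":   cfcc_range_clause("A11", "A25"),
    "Secondary": cfcc_range_clause("A29", "A38"),
    "Local":     cfcc_range_clause("A41", "A45"),
}

for name, clause in road_classes.items():
    print(name, clause)
```

Keeping the three queries in one dictionary makes it easy to repeat the export step for each road class without retyping the range bounds.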

Saturday, June 18, 2016

Module 6: Python Geoprocessing

This week we practiced geoprocessing within the ArcMap interactive Python window and in PythonWin.   For short and sweet items which you might not use again, the interactive Python window is fine; for more complicated scripts and those I might use in the future, I prefer PythonWin.  Here is my flowchart and the script results:

This lab was pretty straightforward, here are my process steps:
1.      I used yEd Graph Editor by yWorks to create a flow chart before beginning my script
2.      Once I had my flow chart, I reviewed the tool help for each of the functions in ArcDesktop
3.      Because the help menu is so thorough, it was extremely easy to write this script.

For those of you who have not used yEd Graph Editor, you should try it.  It makes flow charting extremely easy and is available as a free download with no adware.  


Thursday, June 16, 2016

Week 5 Homeland Security - DC Crime Mapping

The focus of this week’s module was Homeland Security: DC crime mapping.  Our goals were to learn to establish workspace environments; analyze data stored in Excel; create data using the XY tool; create an address locator and geocode tabular address data to point features; explore various marker symbol options; organize and prepare data for use in a geodatabase, taking care to use proper nomenclature and logical naming conventions; calculate fields using the field calculator within attribute tables; create buffers and spatial joins; implement the swipe and flash tools for use in analysis; create multiple data frames to convey information; perform crime analysis using multiple spatial join options as well as Kernel Density; and use the analysis to propose a location for a new police station.  For the first part of the lab, we created a map which included a graph of total crime by offense in the DC area, police stations with percent of crime by station using thematic range-graded symbology, percent of crime within buffered distances from police stations, and finally the location of the proposed police station.  I chose to display the buffered distances in a table format for clarity.  In the second portion of the lab, we created kernel density maps showing burglaries, homicides, and sex abuse crimes within the DC area.   This process was completed using Spatial Analyst Tools, Density, Kernel Density.  We experimented with different search radii: 605, 1500, and 5000.  The smaller the radius, the more detail; the larger the radius, the more generalized the result.  I selected 1500 as it provided a smooth, easy-to-interpret kernel density surface.  Resulting maps:
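The effect of the search radius can be seen in a toy one-dimensional kernel density estimate. This is only a simplified stand-in for the Kernel Density tool (which works on 2-D rasters); the incident locations and radii below are made up for illustration:

```python
import math

def kernel_density(points, x, radius):
    """Toy 1-D Gaussian kernel density estimate at location x.

    Each input point contributes a bump whose spread is controlled by
    the search radius: a small radius keeps detail, a large radius
    smooths it away. Illustrative only, not the ArcGIS algorithm.
    """
    return sum(
        math.exp(-((x - p) / radius) ** 2 / 2.0) / (radius * math.sqrt(2 * math.pi))
        for p in points
    )

crimes = [0.0, 1.0, 1.5, 9.0]   # made-up incident locations on a line

# Small radius: density peaks sharply near the cluster at 0-1.5.
# Large radius: the surface flattens into one generalized bump.
narrow_near = kernel_density(crimes, 1.0, 0.5)
wide_near = kernel_density(crimes, 1.0, 5.0)
```

Comparing the surface at a point inside the cluster versus one far away shows why the smaller radius gives more detail and the larger one a more generalized picture.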



Wednesday, June 15, 2016

Module 5 Geoprocessing in ArcDesktop

Part 1, Step 2: Explain how I set up my SoilErase model.
1.       Initially I reviewed what our goal was. 
2.       I then determined what inputs and tools I would need to use, as well as the sequence in which they needed to run.  Since this was not complicated, I just jotted the information down on some scratch paper.
3.       I then opened model builder and pulled the soils and basin layer into the blank screen
4.       I then pulled the clip tool into the screen and set the input feature, clip feature and output feature class
5.       I then pulled the Select tool into the window and set the input, output feature class, and SQL expression so that “not prime farmland” was selected from the previous clip tool result.
6.       I then pulled the Erase tool into the window, set the input feature to basin, the Erase Features to the previous selection result (soils_Clip_Select.shp), and the output feature class to basin_Erase.shp.
7.       For the clip results and selection results, I made sure Add to Display was off; for the final result, I toggled the Add to Display on so that I could see the results in my Basin.mxd

8.       After setting up each tool, I chose Auto Layout and Full Extent, validated, and then ran the model to catch any errors early in the process.  

The images below show the results after I ran my model and the flow chart of my overall process.

Sunday, June 12, 2016

Module 4 Hurricane Tracking and Damage Assessment

This week’s natural hazard topic was hurricanes.  The lab had two parts: the first to create a tracking map of Hurricane Sandy, and the second to perform a post-storm damage assessment for one street in Toms River Township, New Jersey.  We reviewed the contents of an Excel spreadsheet and then created data from it using the Display XY Data tool.  These became our tracking points for Hurricane Sandy.  We learned how to use the Points To Line tool to create a polyline between the Hurricane Sandy track points.  To enhance the aesthetics of the map, we customized the symbology for the hurricane symbol using the marker symbol options.  We used a VB expression to customize the labels for the track points so that they showed the wind speed and barometric pressure at each point. We learned to use the Grids option in our Data Frame Properties so that we could include and label graticules (lines dividing the map by meridians and parallels).  In the damage assessment portion of the lab, we created a new file geodatabase to house our assessment data.  Because we had numerous rasters, both pre- and post-storm, we needed to create raster mosaics of each class.  This information was stored in our DamageAssessment.gdb.  Within the same geodatabase we created a new feature dataset to house information specific to New Jersey, and within that dataset we added multiple feature classes for the counties, municipalities, state boundary, and road systems of New Jersey.  We used the Effects toolbar to enable the Flicker tool and the Swipe tool.  The Swipe tool allowed us to see both the before and after imagery by simply sliding the mouse in the direction we wanted the top image (post-storm) to move.  This was a huge help during our visual damage assessment.  Another important skill we learned was how to create attribute domains, which can be shared across feature classes, tables, and subtypes within a geodatabase.  
We created domains named Inundation, Structure Damage, Wind Damage, and Structure Type.  We assigned the domain properties, coded values, and user-friendly descriptions of the coded values.  This greatly reduced the possibility of entry error because it created drop-down fields from which the user selects the appropriate value.  We then created a feature class to contain our damage assessment points and added the fields we had created to it.  After adding some base map data from our New Jersey feature dataset, we were able to start editing the StructureDamage feature class by creating features and attributes.  This is where the Swipe tool came in very handy.  Once the points were created, we symbolized them according to damage level.  We then used the attribute table to extract information to compile our damage assessment table, which we placed on the map.  We added some inset maps for location purposes, as well as all required cartographic elements, to produce our final product.
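Conceptually, a coded-value attribute domain works like a lookup table: the compact code is what gets stored, and the user-friendly description is what the editor sees in the drop-down. The codes and labels below are made up for illustration, not the lab's exact domain:

```python
# Illustrative sketch of a coded-value attribute domain. The stored
# code is compact; the description is the friendly text shown in the
# editing drop-down. Codes and labels here are invented examples.
structure_damage_domain = {
    0: "No damage",
    1: "Affected",
    2: "Minor damage",
    3: "Major damage",
    4: "Destroyed",
}

def describe(code, domain):
    """Translate a stored code into its user-friendly description."""
    return domain.get(code, "Unknown code")

print(describe(3, structure_damage_domain))
```

Because editors pick from the fixed set of descriptions rather than typing free text, a value outside the domain simply cannot be entered, which is exactly why domains reduce entry error.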


Monday, June 6, 2016

Module 4 Debugging and Errors


Screen Shots of all three scripts interactive window:












Screenshot of Flow Chart:





















Errors corrected in Script 1:
Line 10, corrected incorrect syntax for file path (/ to \)
Line 12, corrected improper capitalization for variable fc to FC

Line 14, corrected improper spelling so that the line read for field in fields:

Process Summary notes for Script 3:
1.      Initially I ran the script to locate the error and type.
2.      I used a try statement at line 13 and an except statement and print command on lines 15 and 15 to generate the error that two arguments were needed in line 14
3.      I then ran the script to locate the next error and type.
4.      I used a try statement at line 17 and an except statement and print command on lines 24-26 to print “parks” and then return the error that “mapdoc” and “lyrlist” were not defined.
5.      I ran the script again to locate the next error
6.      I used a try statement at line 27 and an except statement and print command at lines 30-31 to return the error that “mapdoc” and “lyrlist” were not defined.
7.      I then removed all my try and except statements except the first try statement at line 13.
8.      I placed my except statement at line 23, except (TypeError, NameError):, and a print command on lines 24 and 25 to return the error types and the layer name that was supposed to be returned once the script was corrected.
9.      When I ran the script again, the errors for Part A printed and the script continued to run through Part B, generating the name, spatial reference, and scale of the data frame.
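The try/except pattern used in the steps above looks roughly like this. The function names and the returned string are placeholders standing in for the script's Part A and Part B, not its actual code:

```python
# Minimal sketch of the debugging pattern: wrap each section in try,
# catch the expected error types, and report them so the rest of the
# script can keep running. Names below are placeholders.
def part_a():
    return undefined_variable   # deliberately raises NameError, like the broken script

def part_b():
    return "Name, spatial reference and scale of the data frame"

def run_section(section):
    """Run one section, reporting (but not stopping on) known error types."""
    try:
        return section()
    except (TypeError, NameError) as e:
        print("Caught:", type(e).__name__, "-", e)
        return None

run_section(part_a)            # Part A fails; the error is printed
result = run_section(part_b)   # Part B still runs afterward
```

Catching only (TypeError, NameError) keeps the net narrow: any unexpected error type would still halt the script instead of being silently swallowed.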

Participation Assignment #1 - “The Bureau of Land Management Uses Esri Story Maps to Encourage Public Land Exploration”

I found an interesting article through Directions Magazine, a site that provides links to GIS-related information from multiple sources.  This particular article was in Esri Technology.  Titled “The Bureau of Land Management Uses Esri Story Maps to Encourage Public Land Exploration”, the article describes the recent evolution of improving information released to the public to encourage travel to our public lands.  Being a huge proponent of travel to state and federal parks, this article caught my interest.  The Bureau of Land Management (BLM) has had a public awareness campaign called “My Public Lands”, which is now being enhanced using Esri Story Maps.  Bob Wick, a BLM wilderness specialist and photographer, published a series of travelogues every Tuesday last winter.  Known as the #traveltuesday campaign, this Tumblr blog extends the bureau’s GIS into a public outreach tool via social media.  While an exciting step for the BLM, it is also a positive step forward for Esri, as they are beginning to see national organizations using Story Maps in new ways to engage the public.  Wick also produced a Summer Road Trip, a photo journal through America’s forests, deserts, and canyons, also based on the Esri Story Map Journal template.  While this photo journal gave potential visitors a preview of what they could expect, it did not offer quite as much as the #traveltuesday project, which includes a personalized travel record with interactive maps of each of Wick’s destinations.  Wick, a 28-year veteran of the BLM, is quite familiar with their GIS maps.  “By integrating photography, multimedia and maps, the Esri Story Map journals bring America’s beautiful public landscapes to life…Our potential visitors can use the journals as real travel guides or the public can experience their lands virtually from any location.”  (Wick)  I think this travelogue is a fantastic idea.  
I love the idea of having photography accompanied with GIS based maps of the site.  This article contained several links which are worth checking out if you have interest in our amazing Public Land System!  The link to the main article is:  http://www.directionsmag.com/pressreleases/the-bureau-of-land-management-uses-esri-story-maps-to-encourage-public-land/469535.  I hope you enjoy reading the article and exploring Wick’s story maps via the article’s links.

Wednesday, June 1, 2016

Module 3 Tsunami Evacuation - How not to get swept away!


This week’s two-part lab introduced us to the devastating effects of tsunamis.  The March 2011 magnitude 9.0 earthquake off the northeast coast of Japan triggered a tsunami which hit the coast of Japan minutes later.  While proper maintenance and organization of spatial data is always important, it becomes critical during a disaster.  This lab focused on creating a structured file geodatabase.  This process, performed within ArcCatalog, allowed us to familiarize ourselves with the data and then logically organize and create what we needed using the tools in ArcToolbox.  It also allowed us to refresh our skills in metadata review, map projections, converting XY data into points, and building raster attribute data for later use in analysis.  Once the geodatabases were created, we used the information to create radiation evacuation zones for the Fukushima Daiichi Nuclear Power Plant as well as evacuation zones for the tsunami.  The radiation evacuation zone was created using the Multiple Ring Buffer tool.  The evacuation zones for the tsunami event were based on conditional raster analysis.  Roads, nuclear power plants, and cities were intersected with the runup information to assist with evacuation decision making.  We were again able to refresh our skills by performing queries and using VB expressions. By using ModelBuilder we were able to perform multiple tasks in succession and save our model for future use and sharing.  This week’s lab really began to tie together many of the skills we learned last semester and helped us to solidify our knowledge and skills.