Our goal this week was to export our Dot Density map to a KMZ file and then use that information, along with some placemarks, to create a recorded tour of South Florida. Although we had previously created the Dot Density map, we needed to save it to a new map and make a few tweaks. For the dots to be visible, we made the background hollow. My urban land layer was already off and my surface water was symbolized by category; my legend needed to be recreated in a simpler fashion in order to work with Google Earth. Once those tasks were complete, I exported the map to a .kmz using the Map to KML tool, and then exported the Dot Density layer to a .kmz using the Layer to KML tool. Once complete, I exited my map, opened Google Earth Pro and imported my two .kmz files. I moved the dot density layer into my map layer, adjusted the color and then saved using the Save Place As function.
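For anyone scripting those two exports instead of using the toolbox dialogs, here is a minimal arcpy sketch; the map document, layer file and output paths are hypothetical stand-ins, not the lab's actual data.

```python
import arcpy

# Hypothetical paths - substitute your own map document, layer file and outputs.
mxd = r"C:\GIS\SouthFlorida\DotDensity.mxd"

# Export the whole data frame to KMZ (the Map To KML tool).
# A scale of 0 lets the tool use its default output scale.
arcpy.MapToKML_conversion(mxd, "Layers",
                          r"C:\GIS\SouthFlorida\DotDensityMap.kmz", 0)

# Export a single layer to KMZ (the Layer To KML tool).
arcpy.LayerToKML_conversion(r"C:\GIS\SouthFlorida\DotDensityLayer.lyr",
                            r"C:\GIS\SouthFlorida\DotDensityLayer.kmz", 0)
```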
I then created a layer for my tour stops, making sure to place them in locations suitable for my zoom areas and to label them clearly. I also made sure to turn on the 3D Buildings layer so that the buildings would be visible. When I began to record my tour, I found the mouse too difficult to control for smooth video; consequently, I did some research and used the information found here https://support.google.com/earth/answer/148115 to obtain the keyboard shortcuts. When rotating using the mouse, shift and arrow keys, the speed was too fast; adding the alt key along with the shift key cut my speed in half. After a couple of practice runs I recorded my final video.
Wednesday, March 30, 2016
Monday, March 28, 2016
This lab focused on 3D mapping. We used ESRI training as the basis of our learning and practice and then implemented some of what we had learned with our own exercise. Our learning targets were 3D visualization techniques and converting 2D data to 3D. Using the ESRI tools we learned to create base heights for raster and feature data, how to apply vertical exaggeration, how to improve a map using illumination and background color, how to extrude above- and below-ground features containing elevation values, and how to extrude parcel values. Having completed the ESRI basics, it was time to apply some of our skills to converting 2D data to 3D. We used existing data containing building footprints and a raster surface with elevation data. We created random points from our building footprint layer using the CID field. We then added surface information to this sample points layer using the raster file's z values. Next, we took our sample points table and used the Summary Statistics tool to generate a single elevation value for each building by using the z value mean. We were then able to join the table attributes from our sample point statistics table to our BostonBldgs layer; we exported this data as a file and personal geodatabase feature class. We then used ArcScene to extrude our buildings using the z value. After completing this, we exported the layer to a KMZ file using the Layer to KML tool.
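For reference, here is a rough arcpy sketch of that point-sampling workflow, assuming a hypothetical geodatabase, raster name and a Spatial Analyst license; it approximates the steps above rather than reproducing the exact exercise script.

```python
import arcpy
from arcpy.sa import ExtractValuesToPoints

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS\Boston\Boston.gdb"  # hypothetical workspace

# 1. Scatter random sample points inside each building footprint; the tool's
#    CID field records which footprint each point belongs to.
arcpy.CreateRandomPoints_management(arcpy.env.workspace, "sample_pts",
                                    "BostonBldgs", "", 10)

# 2. Pull elevation (z) values from the raster surface onto the points.
ExtractValuesToPoints("sample_pts", "elevation_raster", "sample_pts_z")

# 3. Average the sampled z values per building (Summary Statistics).
arcpy.Statistics_analysis("sample_pts_z", "bldg_elev_stats",
                          [["RASTERVALU", "MEAN"]], "CID")

# 4. Join the mean elevation back to the footprints for extrusion in ArcScene.
arcpy.JoinField_management("BostonBldgs", "CID", "bldg_elev_stats", "CID",
                           ["MEAN_RASTERVALU"])
```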
Here was the result in Google Earth:
Both 2D and 3D have many individualized uses. 2D data can require less expensive software or add-on tools, but it can be limiting with respect to ease of understanding when many layers are incorporated into a map, especially in multivariate situations. 2D also requires the end user to be able to visualize any three-dimensional information about the data being conveyed. For instance, a user unfamiliar with topography might not understand what contour lines represent; the same user, however, could readily see the elevation changes on a 3D map. Creating 3D maps from 2D information can be more expensive and more time consuming, but the ease of visualization, eye-catching depictions, and user interaction can convey more information to the end user. The ability to better visualize environmental impacts, urban growth, and 3D information applied to value components makes 3D mapping a great tool for planning and presentation.
Thursday, March 24, 2016
Module 12 Geocoding, Network Analyst, Model Builder - where is it, what's the most efficient/cheapest way to get there, and saving time with models
This week's lab consisted of three parts. The first two focused on geocoding and using the Network Analyst, while the third portion utilized ESRI training to implement the use of model building for multiple processes. I really enjoyed this lab, although I seem to enjoy them all. Can I get paid to do labs?

The geocoding portion walked us through the five essential steps for good geocoding: download the TIGER/Line files and, if they are not already projected or are in a projection different from the one you are using, project them; import the .shp into your map along with the table that needs addressing and any other base map information you might want; create an address locator (or use one you've already saved or one provided by ESRI, depending on your need); perform the geocode; and REVIEW AND REMATCH so that you get the optimal match rate. We were able to explore the "candidate" option and the "pick from map" option. The candidate options may not be the best for your location, and if you want to be sure, verify via an outside mapping source such as Bing Maps or other published information. While the process may be a bit painstaking, it is quite important to be sure you have the correct geocoded locations.

We took our geocoding results and imported them into a new map along with our TIGER/Line information and our county boundary. We were then able to add the Network Analyst extension and toolbar to create a route analysis. There are many options within the Network Analyst which affect how your route will be analyzed. We chose to use street addresses with our MyAddressLocator style, to edit location names when adding a new location, and to load locations using location fields. For the Location Snap Options we chose to snap locations when adding a new location and when moving a location. Once our options were set we were ready to add some stops to our route. I randomly selected three stations and created the route. Once I saw it, I realized my stop order caused me to double back; ArcDesktop makes this easy to change, and I adjusted my stops to eliminate the issue. The Network Analyst tool allows various parameters to be applied to your route analysis. We set our impedance to minutes, and I chose Monday at 8 am as my day of week and time of day. We allowed U-turns but enabled the Oneway, Turnrestrictions and UnbuiltRoadsRestrictions restrictions. We made sure to set miles as the distance units and minutes as the time attribute, and away we went. Once we solved the route, we were able to generate turn-by-turn directions and export the data. Like I said earlier, this was really cool and fun.
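The geocode-then-route workflow can also be scripted. Here is a minimal arcpy sketch with a hypothetical address table, locator and network dataset; the address field mapping and the "Minutes" impedance name depend entirely on your locator style and network dataset, so treat this as an outline rather than the lab's actual steps.

```python
import arcpy

# Hypothetical inputs - substitute your own table, locator and network dataset.
addresses = r"C:\GIS\Mod12\addresses.dbf"
locator = r"C:\GIS\Mod12\MyAddressLocator"
network = r"C:\GIS\Mod12\streets_ND"
stops = r"C:\GIS\Mod12\geocode_result.shp"

# Geocode the table; the field map string must match the locator's style.
arcpy.GeocodeAddresses_geocoding(
    addresses, locator,
    "Street Address;City City;State State;ZIP Zip",
    stops)

# Build and solve a simple route over the street network.
arcpy.CheckOutExtension("Network")
arcpy.na.MakeRouteLayer(network, "Route", "Minutes")
arcpy.na.AddLocations("Route", "Stops", stops)
arcpy.na.Solve("Route")
```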
The last part of our lab focused on model building to visualize, automate and share geoprocessing workflows. Although we did not delve too deeply, we were exposed to the basic elements of model building. I can see how building models for specific processes you might need to repeat would be extremely helpful. It is very easy to create a basic task skeleton and then add or alter the parameters. I might get crazy with this, who knows. The color-coded shapes of the various elements make it easy to distinguish input data, output data, tools that have run and tools not yet ready to run. You can validate your model and quickly locate where your error lies. Often, if you find and correct an error early in the process chain, the remainder of your model will run flawlessly. I was really psyched with this tool and called my partner into the office to show her; she saw the functionality in the tool, but was not nearly as excited as I was and returned to training the puppy. At any rate, I look forward to learning more about this tool and creating more complex models. Here's the one we created for illustrating gas mains near schools which needed to be closely monitored for leaks:
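Since a ModelBuilder diagram doesn't translate directly to text, here is a rough arcpy approximation of what that chain does, with hypothetical feature class names and an illustrative buffer distance rather than the lab's values:

```python
import arcpy

arcpy.env.workspace = r"C:\GIS\Mod12\Utilities.gdb"  # hypothetical workspace

# 1. Buffer the schools (distance is illustrative, not the lab's value).
arcpy.Buffer_analysis("schools", "schools_buffer", "500 Feet")

# 2. Keep only the gas mains that fall inside those buffers - these are the
#    segments that would need close monitoring for leaks.
arcpy.Clip_analysis("gas_mains", "schools_buffer", "gas_mains_near_schools")
```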
Saturday, March 19, 2016
Module 10 Dot Mapping.......
This week’s module introduced us to Dot Mapping. Dot maps are used to represent the location of one or more instances of a geographic occurrence. The dot symbol represents a specific quantity of the occurrence and is placed in the relative location of the occurrence. When creating a dot map, conceptual or raw-count data should always be used. The dot map allows for a quick visualization of the quantity and location of geographic occurrences. It is also useful for comparing distributions of related incidences. Within this lab we also learned how to spatially join tabular data, select the appropriate dot size and unit value, and use a masking layer to improve our dot placement. We were also given the opportunity to improve upon ArcMap’s legend and create a more informative dot legend. For this task, I exported my map to PDF and then used Bluebeam Revu to create the rectangles and insert the appropriate quantity of dots. I then exported a snapshot of those rectangles to ArcMap and added the necessary text.
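For the join step, a minimal arcpy sketch might look like the following, assuming a hypothetical county layer and a point layer of occurrences; the dot size and unit value themselves are chosen interactively in the Layer Properties dialog rather than scripted.

```python
import arcpy

# Hypothetical inputs: county polygons and a point layer of occurrences.
counties = r"C:\GIS\Mod10\counties.shp"
events = r"C:\GIS\Mod10\occurrences.shp"

# Spatially join the points to the polygons; the output's Join_Count field
# holds the number of occurrences per county, ready for dot density symbology.
arcpy.SpatialJoin_analysis(counties, events,
                           r"C:\GIS\Mod10\counties_with_counts.shp",
                           "JOIN_ONE_TO_ONE", "KEEP_ALL")
```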
Saturday, March 12, 2016
Week 10-11 Spatial Analysis of Vector and Raster Data and Vector Analysis Part 2 - Buffers and Overlays
This week's lab gave us practice using two common modeling tools, buffer and overlay. We combined that with using ArcPy to run multiple scripts, which was a great time-saving trick. We practiced analyzing our vector data using spatial queries, practiced several of the six types of overlay operations and applied the Multipart To Singlepart tool. The first portion of the lab gave us the opportunity to practice these items. In the final portion of the lab we were asked to finalize our map by adding the conservation layer and determining which park sites met all the criteria and did not fall within a conservation area. Since the lab had walked us through creating our buffer_union_export feature class, I was able to use the Erase tool from the Overlay toolset within the Analysis Tools toolbox to create a new feature class of the buffer_union_export features which fell outside of the conservation_areas feature class. Once this was accomplished, we needed to tidy up our maps and add all the essential map elements to produce:
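Scripted in arcpy, that buffer-union-erase chain might look roughly like this; the input names and buffer distances are hypothetical, not the lab's actual criteria.

```python
import arcpy

arcpy.env.workspace = r"C:\GIS\Mod11\Parks.gdb"  # hypothetical workspace

# Buffer the candidate criteria layers (distances are illustrative).
arcpy.Buffer_analysis("roads", "roads_buf", "1000 Meters")
arcpy.Buffer_analysis("water", "water_buf", "500 Meters")

# Union the buffers into one candidate-area feature class.
arcpy.Union_analysis(["roads_buf", "water_buf"], "buffer_union_export")

# Erase anything inside conservation areas (Erase needs an Advanced license).
arcpy.Erase_analysis("buffer_union_export", "conservation_areas", "park_sites")

# Break multipart results into single parts for per-site evaluation.
arcpy.MultipartToSinglepart_management("park_sites", "park_sites_single")
```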
Sunday, March 6, 2016
Module 8 - Flow Mapping ....stylized distributive flow....I can go with that
This week’s module was an introduction to flow line mapping. There are many types of flow maps: distributive, network, radial, continuous and telecommunications, to name a few of the most used. This lab, depicting immigration into the United States for the year 2007, is a type of distributive flow map. We mapped the quantitative data found in the U.S. Homeland Security Office of Immigration Statistics' 2007 Yearbook of Immigration Statistics. Since we did not know the actual migratory paths, we used a stylized placement approach to represent the general direction of flow and indicated quantities through the use of proportional flow arrows. The line weight of the flow arrows was calculated by taking the square root of the population value of each continent; we then chose a maximum line width, in this case 15. With those two pieces of information we applied the following formula, using the highest square-root value as our denominator (in this case Asia, with 619.28), where X = a continent's square-root population value: width of line symbol = 15 * (X / 619.28). This calculation gave us a line weight for each continent that was proportional to the immigration value for that region, which made it easy to draw conclusions regarding the relative quantity of immigrants per continent.
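As a worked example, here is a small Python sketch of that scaling, using Asia's given square-root value of 619.28 and a hypothetical second continent for comparison:

```python
import math

MAX_WIDTH = 15.0   # chosen maximum line width
MAX_ROOT = 619.28  # Asia's square-root value, the largest in the data

def line_width(value):
    """Return a flow-line width proportional to the square root of a value."""
    return MAX_WIDTH * (math.sqrt(value) / MAX_ROOT)

# Asia itself maps to the full 15-point width (619.28**2 recovers its raw value).
print(line_width(619.28 ** 2))          # -> 15.0

# A hypothetical continent with one quarter of Asia's raw value gets half the
# width, since sqrt(0.25) = 0.5 - square-rooting compresses the range.
print(line_width(0.25 * 619.28 ** 2))   # -> 7.5
```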
Wednesday, March 2, 2016
Module on Isarithmic Mapping - Come on in, the weather is fine....as are the hills, valleys, depressions...
This week we learned about isarithmic mapping. Within that scope, we were introduced to various methods of interpolation, including PRISM, worked with continuous raster data, and learned how to symbolize and produce a complete and accurate legend. We worked with both continuous-tone and hypsometric symbology, both utilizing hillshade relief. Additionally, we learned to create contours to overlay on our hypsometrically symbolized map. The map illustrated here was our final product; it depicted the annual precipitation for the State of Washington using climate data from 1981-2010. The PRISM method interpolated the data using the 30-year precipitation data and a DEM of the state for elevations. Its algorithms take location, elevation, coastal proximity, topographic orientation, vertical atmospheric layer, topographic position and the orographic effectiveness of the terrain in relation to the monitoring locations (stations) into account to develop a suitable model for our map. We used the Int Spatial Analyst tool to convert the floating-point raster values to integers so that we could create “crisp” contours and truncate values to whole numbers. Renaming the layer to something less cryptic later facilitated making the legend more functional. We classed the data using a manual break method with 10 breaks (break values were provided within the lab). After classing the data, we chose a precipitation color ramp and applied a hillshade effect. This produced an attractive map with visible relief. To make the relief even more visible, our next task was to add contour lines. This was accomplished using the Contour List Spatial Analyst tool. We assigned contour values which matched our break values. Although one would typically note the contour interval in the legend, this step was unnecessary in this case since our contours followed our hypsometric steps. I symbolized the contours using the temperature contour line available within the weather Style Manager. The final steps were to add essential map elements to create a finished, easy-to-understand, visually appealing map.
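Both of those raster steps can be scripted. Here is a minimal arcpy sketch, assuming a hypothetical precipitation raster and illustrative contour values rather than the lab's actual break list:

```python
import arcpy
from arcpy.sa import Int, ContourList

arcpy.CheckOutExtension("Spatial")

# Truncate the floating-point precipitation raster to whole numbers.
precip_int = Int(r"C:\GIS\Mod9\wa_precip")  # hypothetical input raster
precip_int.save(r"C:\GIS\Mod9\wa_precip_int")

# Generate contours at specific values matching the class breaks
# (these values are illustrative, not the lab's provided breaks).
ContourList(precip_int, r"C:\GIS\Mod9\precip_contours.shp",
            [10, 20, 40, 80, 120, 180])
```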
Tuesday, March 1, 2016
Week 7 & 8 Data Input, Editing and Data Search - Finding your ducks and keeping them in a row.......
This midterm lab was designed to pull together our learning thus far. Using an assigned Florida county, we were to determine the project requirements, obtain appropriate data, select a suitable map projection, and display nine data formats using logical layering so that the final product(s) were visually compelling and easy to interpret.
In addition to the five mandatory vector datasets and two raster datasets, I chose to use Invasive Plants and Wetlands for my additional environmental dataset component. All of the data was readily available on either Labins or FGDL, with the exception of the hydrography information. The hydrography data on FGDL, broken down by county, excluded Broward, which was my assigned county. I ended up using information from SFWMD; this contained hydrographic information for the entire state and took a bit of manipulation to eliminate the unwanted attributes within my clipped area. Once I had obtained all the required data, I imported and projected (as needed) the data into an .mxd file. I used this file to determine how many maps I needed to create and the best approach to combining items within maps for good presentation. I saved this file as my first map, removed unwanted data, symbolized the remaining data and added the required map elements. I then saved two additional copies of this file so that I could remove the first set of data and add the data required for the current map. I then symbolized the data, updated any source information and moved on to my final map. This process helped me keep my map layouts identical and avoid recreating the inset map and other essential elements; instead, I simply needed to perform minor editing. I have attached my three maps and followed them with an outline of my process.
I. Process Summary Details

A. Obtain Data and Project/define as needed

  a. Aerial Imagery from Labins.org
    i. Quarter Quad 1705NW, Gator Lake
      1. 2004 RGB Albers, Units in Meters, .sid and .sdw
      2. Coordinate System GCS_NAD_1983_HARN
      3. Albers Projection
        a. Projection: Albers
        b. Datum: HPGN (NAD83)
        c. Units: Meters
        d. 1st standard parallel: 24 00 00
        e. 2nd standard parallel: 31 30 00
        f. Central Meridian: -84 00 00
        g. Latitude of projection's origin: 24 00 00
        h. False easting (meters): 400000.0
        i. False northing (meters): 0.0
        j. Graphics Type: MrSID
        k. World File: sdw
        l. Resolution: 1-Meter
        m. Color: "True Color" (not CIR)
      4. The raster needed a spatial reference assigned. I confirmed the information above matched the values in the Albers Conical Equal Area coordinate system and assigned it using the Raster Dataset Properties box via the Catalog window.
        a. Confirmed the spatial reference by inserting the raster in a new map and verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.

  b. County Boundary Information from FGDL.org
    i. U.S. Census Bureau's Florida County Boundaries – Statewide, July 2011
      1. Coordinate System GCS_North_American_1983_HARN
      2. Albers Conical Equal Area Projection
        a. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.

  c. Major Roads Information from FGDL.org
    i. FDOT RCI Derived Major Roads – Statewide – January 2016, MAJRDS_Jan16
      1. Coordinate System GCS_North_American_1983_HARN
      2. Albers Conical Equal Area Projection
        a. Used the Clip tool to show only roads within Broward
        b. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.

  d. Cities and Towns – FGDL
    i. cities_feb04.shp
      1. Coordinate System GCS_North_American_1983_HARN
      2. Albers Conical Equal Area Projection
        a. Used the Clip tool to show only cities/towns within Broward
        b. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.

  e. Invasive Plants – FGDL
    i. Fnaiip_jun10.shp
      1. Coordinate System GCS_North_American_1983_HARN
      2. Albers Conical Equal Area Projection
        a. Used the Clip tool to show only invasive plants within Broward
        b. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.

  f. Public Lands – FGDL
    i. American Indian Lands – amindianlands_2008.shp
      1. Coordinate System GCS_North_American_1983_HARN
      2. Albers Conical Equal Area Projection
        a. Used the Clip tool to show only American Indian Lands within Broward
        b. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.
    ii. Public Parks – gc_parksbnd_oct15.shp
      1. Coordinate System GCS_North_American_1983_HARN
      2. Albers Conical Equal Area Projection
        a. Used the Clip tool to show only Public Parks within Broward
        b. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.

  g. Hydrography – SFWMD
    i. Geodatabase and layer file; rivers and lakes needed to be separated from the other data for ease of use.
      1. Coordinate System GCS_North_American_1983_HARN
      2. NAD_1983_HARN_StatePlane_Florida_East_FIPS_0901_Feet
        a. Due to the large amount of data, I created a layer package (including the data) of only the Water Bodies portion and saved it to my project directory.
        b. Unpackaged it in the Catalog window. Once it was imported into my map, I ungrouped the layer set and clipped the data to show only information within Broward County.
        c. I then projected the data from State Plane to the Albers Conical Equal Area projection.

  h. Wetlands – FGDL
    i. nwip_oct14.shp
      1. Coordinate System GCS_North_American_1983_HARN
      2. Albers Conical Equal Area Projection
        a. Used the Clip tool to show only wetlands within Broward
        b. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.

  i. DEM – Labins
    i. N26w081 and n27w081
      1. Coordinate System GCS_North_American_1983
        a. Used Project Raster to project to the Albers Conical Equal Area projection
        b. Confirmed the spatial reference by verifying the Data Frame Properties/Coordinate System tab Layers and Projection, and inserted a topographic base map to confirm alignment.
        c. Used raster Clip to clip to the limits of Broward County
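The project-then-clip pattern above repeats for nearly every dataset, so it lends itself to a short arcpy batch. Here is a minimal sketch, assuming hypothetical paths and a .prj file exported from one of the already-projected Albers layers:

```python
import arcpy

# Hypothetical paths - substitute your own data and output locations.
albers = arcpy.SpatialReference(r"C:\GIS\Midterm\fgdl_albers.prj")
broward = r"C:\GIS\Midterm\broward_boundary.shp"

# Vector pattern: project to Albers, then clip to the county boundary.
arcpy.Project_management(r"C:\GIS\Midterm\waterbodies_sp.shp",
                         r"C:\GIS\Midterm\waterbodies_albers.shp", albers)
arcpy.Clip_analysis(r"C:\GIS\Midterm\waterbodies_albers.shp", broward,
                    r"C:\GIS\Midterm\waterbodies_broward.shp")

# Raster pattern: Project Raster, then clip using the boundary geometry.
arcpy.ProjectRaster_management(r"C:\GIS\Midterm\n26w081",
                               r"C:\GIS\Midterm\dem_albers", albers)
arcpy.Clip_management(r"C:\GIS\Midterm\dem_albers", "#",
                      r"C:\GIS\Midterm\dem_broward", broward, "#",
                      "ClippingGeometry")
```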