Thursday, December 8, 2016

Corridor Analysis & Feature Extraction

Goals and Background

The goal of the lab is to introduce me to the skills required to analyze corridor lidar point cloud data. I will be introduced to the steps required to project a LAS dataset and then gain practical experience utilizing a terrestrial lidar scan. In the second half of the lab I will be extracting building footprints from a different dataset and extracting LOMA information from the results.

Methods

Projecting an unprojected point cloud

The following methods were performed in ArcMap 10.4.1. A backup copy of the data was made before performing any of the following steps.

I was provided a terrestrial lidar scan which was not projected. The first step was to add LP360 Tools to My Toolbox within ArcMap (Fig. 1).

(Fig. 1) LP360 tools added to My Toolboxes in ArcMap.
The dataset did not have any coordinate information assigned to the file. I consulted the metadata to determine what coordinate system the data was collected in. Then, with the Define LAS File Projection tool, I input the LAS file and set the coordinate system per the metadata (Fig. 2).

(Fig. 2) Define LAS File Projection tool used to define the projection of the LAS file.
After defining the projection, I used the Reproject LAS Files tool to permanently project the LAS data (Fig. 2). The input LAS file was the defined file from the previous step. The coordinate system was again set per the metadata in both the incoming and outgoing parameters.
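LP360's projection tools are dialog-driven, but the same define-then-reproject idea can be sketched in Python with the open-source laspy (2.x) and pyproj libraries. This is a minimal sketch, not the lab's workflow; the file names and EPSG codes are placeholders, not the values from the Algoma metadata:

```python
import laspy
from pyproj import Transformer

las = laspy.read("algoma_scan.las")  # unprojected input (placeholder name)

# The source CRS comes from the metadata, exactly as in the manual workflow;
# EPSG:4326 -> EPSG:32616 (UTM zone 16N) is only an illustrative pair.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32616", always_xy=True)

x, y = transformer.transform(las.x, las.y)  # reproject every point
las.x, las.y = x, y  # laspy rescales to the integer point records; large
                     # coordinate shifts may also require new header offsets

las.write("algoma_scan_projected.las")
```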


Transport and transmission corridor asset management

The following methods were performed in LP360 for Windows. I used the projected file of the terrestrial lidar data from the previous step.

I opened the LAS file in LP360 (Fig. 3). A terrestrial dataset does not look any different from conventional airborne lidar data in the traditional top-down view. To see what sets terrestrial lidar apart you have to examine the data through the 3D viewer (Fig. 4).

(Fig. 3) Terrestrial dataset opened in LP360 displayed by elevation.

(Fig. 4) Terrestrial dataset displayed in the 3D viewer in LP360.
(Fig. 5) Terrestrial dataset displayed in the 3D viewer in LP360, zoomed in to a bridge within the study area. Notice the street lamps and road sign.
I used the measurement tool in LP360 to measure the bridge and sidewalk widths in the study area. Additionally, I panned around the 3D window to examine areas where utility lines had the potential to be damaged by trees in the event of a storm (Fig. 6).

(Fig. 6) Examination of trees encroaching on utility lines in the study area.

Building Feature extraction

I used LP360 for Windows again in this section of the lab. I used my classified lidar dataset from Lab 3.

To extract the building footprints I created a new point cloud task. The parameters were set as shown in Fig. 7.

(Fig. 7) Point cloud task window with the parameters set to extract the building outlines from the study area.
The task produces two shapefiles: a building footprint and a building footprint square (Fig. 8 & 9). The footprint follows the exact classification. The footprint square generalizes the footprint based on the classification and conventional building shape by creating 90 degree corners.

(Fig. 8) Footprint and footprint square displayed in LP360.

(Fig. 9) Footprint and footprint square displayed in LP360. The image is zoomed in to display the difference between the footprint (yellow) and the footprint square (blue).
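LP360's squaring algorithm is proprietary, but a crude stand-in for the idea, assuming the shapely library and a made-up footprint, is to replace the ragged outline with its minimum rotated bounding rectangle, which forces the 90 degree corners:

```python
from shapely.geometry import Polygon

# Ragged footprint traced from classified points (coordinates are made up)
ragged = Polygon([(0, 0), (10.2, 0.3), (10, 9.8), (0.4, 10)])

# Smallest rotated rectangle enclosing the footprint: right angles by
# construction, loosely analogous to the footprint square output
squared = ragged.minimum_rotated_rectangle
print(squared.wkt)
```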

Extracting building height and LOMA information

The following methods were performed in LP360. The purpose of this section is to determine which buildings can apply to be excluded from the LOMA floodplain map since the height requirement was changed from 810 feet ASL to 800 feet ASL.

The first step in extracting building heights was to create a new point cloud task. I used a Conflation task with the parameters set as per Fig. 10.

(Fig. 10) Parameters used in the conflation task to extract the minimum and maximum height values using the building square footprint.
I then created another new conflation task in the point cloud tasks. The parameters were set as displayed in Fig. 11.

(Fig. 11) Parameters used in the second conflation task.
Finally I created one last conflation task in the point cloud task window. The parameters were set as displayed in Fig. 12.

(Fig. 12) Parameters used in the last conflation task to calculate the minimum z value for the buildings in the study area.
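Once the conflation tasks have written a minimum-ground-elevation attribute onto each footprint, the final LOMA screening is a simple attribute query. A hedged arcpy sketch of that last step; the shapefile path and the MinZ field name are assumptions, not the lab's actual names:

```python
import arcpy

footprints = r"C:\lab\building_footprint_square.shp"  # hypothetical path

layer = arcpy.management.MakeFeatureLayer(footprints, "bldg_lyr")

# Buildings whose lowest adjacent ground sits below the revised 800 ft ASL
# threshold are candidates for exclusion from the floodplain map
arcpy.management.SelectLayerByAttribute(layer, "NEW_SELECTION", '"MinZ" < 800')
arcpy.management.CopyFeatures(layer, r"C:\lab\loma_eligible_800ft.shp")
```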

Results


(Fig. 13) Map displaying buildings which are below the 810 foot ASL LOMA requirement.


(Fig. 14) Map displaying buildings which are below the revised 800 foot ASL requirement.
Sources

Terrestrial LAS for Algoma, WI, project boundary KMZ, and metadata are from Ayres Associates.

LAS, tile index, and metadata for Lake County are from Illinois Geospatial Data Clearinghouse. NAIP imagery is from United States Department of Agriculture Geospatial Data Gateway. Breaklines are from Chicago Metropolitan Agency for Planning.

Wednesday, November 30, 2016

Lab 8: Vegetation Metrics Modeling

Goals and Background

The purpose of the lab is to provide practice extracting various forest metrics from LiDAR data. During this lab I will utilize land cover data to determine tree species distribution and extract the various metrics for each tree species. At the conclusion of the lab I will make a recommendation to the U.S. Forest Service about the carbon sink potential of the forest in the study area.

Methods

All of the following operations were performed in ArcMap 10.4.1 except for the DTM creation, which was completed in LP360 for Windows.

Canopy Height Modeling

I created two Digital Terrain Models (DTMs) from the Eau Claire County LiDAR data I was provided. The first was created from the first returns of the data to produce an accurate model of the vegetation height (strictly speaking, a surface model). The second was created from the last returns of the data to produce an accurate model of the ground surface.

I utilized Raster Calculator to subtract the ground DTM from the vegetation DTM, which produced a raster image of the various vegetation heights. The produced raster does not contain an attribute table, so I used the Copy Raster tool to change the pixel type to 32_BIT_SIGNED so the height information could be calculated.
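The same canopy height calculation can be scripted. A minimal arcpy sketch, assuming hypothetical paths for the two models:

```python
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")

first_return = Raster(r"C:\lab8\first_return_dtm.tif")  # vegetation surface
ground = Raster(r"C:\lab8\ground_dtm.tif")              # bare earth

chm = first_return - ground          # canopy height model
chm.save(r"C:\lab8\canopy_height.tif")

# Integer pixel type so the raster can carry an attribute table
arcpy.management.CopyRaster(r"C:\lab8\canopy_height.tif",
                            r"C:\lab8\canopy_height_int.tif",
                            pixel_type="32_BIT_SIGNED")
arcpy.management.BuildRasterAttributeTable(r"C:\lab8\canopy_height_int.tif")
```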

Above Ground Biomass (AGB) Modeling

Estimating the AGB requires the canopy height model created in the previous step. Additionally, I needed land use/land cover data; the data for the study area was provided to me by my professor. The first step was to select various tree species from the land use/land cover data. I reclassified the data into the following 5 separate classes:

  1. Hardwood (Northern Hardwoods + Central Hardwoods + Swamp Hardwoods + Bottomland Hardwoods)
  2. Red Maple
  3. Pine
  4. Oak
  5. Aspen (Aspen/Paper Birch + Aspen Forested Wetlands)
All other species were eliminated from the study area.

Using an algorithm developed by Tabacchi et al. (2011), I was able to utilize ModelBuilder in ArcMap to calculate the AGB for each species in the study area. The equation used was AGB = a + b*(dbh)^2*H. The parameters are as follows:

  • AGB= Above ground biomass
  • a=gain
  • b=offset
  • dbh= Diameter at breast height
  • H=height
All of the parameters I used were provided in a chart derived from Jenkins et al. (2003) (Fig. 1), except the height. The height parameter was taken from the canopy height model I created above.

(Fig. 1) Chart of algorithm parameters from Jenkins et. al. (2003).
The next step was to create a feature class for each of the 5 species groups. I again used ModelBuilder to separate the classes and apply the algorithm to each class to calculate the AGB (Fig. 2). I used a conditional statement (Con) to select each tree species and then applied the algorithm to the height model with the raster calculator (Fig. 3). The singled-out tree species was applied as a mask to the calculation. I repeated this process for the remaining 4 species groups. When all of the feature classes were created I used the Mosaic to New Raster tool to merge all 5 species back into one complete raster of the study area.

(Fig. 2) Display in model builder of the work flow calculating the AGB for pine in the study area. 

(Fig. 3) Raster calculator displaying the equation used to calculate the AGB. 
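The per-species branch of the model reduces to a Con mask plus map algebra. A sketch of the pine branch only, with placeholder class values and coefficients standing in for the Fig. 1 chart:

```python
import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")

height = Raster(r"C:\lab8\canopy_height.tif")       # H from the CHM
landcover = Raster(r"C:\lab8\species_classes.tif")  # reclassified cover

# Placeholder values; the real a, b, and dbh come from the Fig. 1 chart
PINE_CLASS, a, b, dbh = 3, 0.0, 0.02, 10.0

pine_mask = Con(landcover == PINE_CLASS, 1)          # 1 inside pine, NoData elsewhere
pine_agb = (a + b * dbh ** 2 * height) * pine_mask   # AGB = a + b*(dbh)^2*H
pine_agb.save(r"C:\lab8\agb_pine.tif")

# After repeating for the other four groups, merge them back together
arcpy.management.MosaicToNewRaster(
    [r"C:\lab8\agb_pine.tif", r"C:\lab8\agb_oak.tif"],  # plus the rest
    r"C:\lab8", "agb_all.tif", number_of_bands=1)
```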
Calculation of additional metrics

During this section of the lab I calculated the following:

  • Stem Biomass
  • Branch Biomass
  • Foliage Biomass 

To calculate each variation of the biomass I utilized an algorithm from Jenkins et al. (2003) (Fig. 4). The DBH was taken from the previous chart (Fig. 1) and the remaining parameters were taken from Fig. 5 depending on the species type.


(Fig. 4) Equation to calculate stem, branch, and foliage biomass.
(Fig. 5) Parameters chart used in the calculation of stem, branch, and foliage biomass.
I completed the calculation in Excel to determine the coefficient, which I multiplied against the original biomass calculation to determine each subset biomass. I used ModelBuilder again to apply the algorithm to each species class created in the AGB portion of the lab (Fig. 6). Once all of the feature classes were created I built a new model which combined each species and each individual biomass into one complete raster (Fig. 7).

(Fig. 6) Model displayed in model builder used to calculate stem, branch, and foliage biomass.
(Fig. 7) Model displayed in model builder combining each species and the type of biomass in to one complete raster.
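For reference, the Jenkins et al. (2003) component ratios take the form ratio = exp(b0 + b1/dbh). A tiny worked sketch of the Excel step with made-up coefficients (the real b0 and b1 come from the Fig. 5 chart):

```python
import math

def component_ratio(b0, b1, dbh):
    # Jenkins et al. (2003): component biomass / total biomass
    return math.exp(b0 + b1 / dbh)

dbh = 10.0                                     # from the Fig. 1 chart (placeholder)
stem_ratio = component_ratio(-0.3, -1.0, dbh)  # hypothetical coefficients
total_agb = 120.0                              # example total AGB for one cell
stem_biomass = total_agb * stem_ratio          # same multiplication done in Excel
```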
Results


(Fig. 8) Canopy Height map broken into 5 different categories: -12-0 (non-vegetation), .1-1.6 (above ground, not vegetation), 1.7-6.6 (low vegetation), 6.7-19.7 (medium vegetation), 19.8-328 (high vegetation).
(Fig. 9) Above ground biomass map for the study area.


(Fig. 10) Tree density per acre by height class.
(Fig. 11) Stem above ground biomass map.
(Fig. 12) Branch above ground biomass map.
(Fig. 13) Foliage above ground biomass map.
Discussion

The three maps displaying the stem, branch, and foliage biomass are derived from the total biomass by a simple multiplication of the values. The scale at which these maps are displayed here does not show the variation between the types of biomass. There seems to be a slight correlation between canopy height and AGB, especially where the trees are the tallest in the study area.

Acknowledgements

I would like to thank Peter Sawall for his assistance throughout various steps of this lab. Check out Peter's blog.

Sources

Jenkins, J. C., Chojnacky, D. C., Heath, L. S., & Birdsey, R. A. (2003). National-scale biomass estimators for United States tree species. Forest Science, 49(1), 12-35.

Jenkins, J. C., Chojnacky, D. C., Heath, L. S., & Birdsey, R. A. (2004). Comprehensive database of diameter-based biomass regressions for North American tree species. Gen. Tech. Rep. NE-319. Newtown Square, PA: U.S. Department of Agriculture, Forest Service, Northeastern Research Station. 45 p.



Tuesday, November 22, 2016

Lab 7: Flood Inundation Modeling

Goals and Background

The purpose of this lab is to utilize the LiDAR model created in the previous labs to model flood inundation in the study area. The lab will provide me with hands-on experience iteratively modeling flood inundation based on various water heights in the study area.

Methods

Development of a simple flood inundation model

Before creating the inundation model I first had to create a water feature class to use in ArcScene. I used ArcMap to create a polygon feature class with approximately the same spatial extent as the study area (Fig. 1).

(Fig. 1) Display of the created water feature in ArcMap.
Next, I opened the DTM I created in Lab 5 in ArcScene. After bringing in the DTM I set the Base Heights to float on a custom surface, set the Factor to 2.0, and changed the coloration of the symbology (Fig. 2).

(Fig. 2) Display of the DTM with adjusted base heights and changed symbology.
Next I opened the created water feature in ArcScene and set the base heights like the DTM, but with the Factor set to 1 so it sat below the DTM. Then I used the Animation Manager to create a flood animation. I created a new animation Keyframe and set the Translation:Z to 10 feet below the DTM. I added 19 more Keyframes, increasing the elevation in 5 foot increments. I had a few issues with my DTM, so I had a hard time getting the exact elevation to be 10 feet below the surface. Then I was able to run the animation.

The animation wasn't perfect, as water would come up in the middle of the high ground. To attempt to remedy the issue I edited the water feature in ArcMap and reshaped the feature class so there would be no water in the problem areas (Fig. 3).

(Fig. 3) Display of the edited water feature class in ArcMap.
Next I reset and ran the animation again.
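The animation is essentially a "bathtub" model: everything below the current water level is wet. A raster analogue of the same 20-step sequence, sketched with arcpy Spatial Analyst (the DTM path and starting stage are assumptions):

```python
import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")
dtm = Raster(r"C:\lab7\chippewa_dtm.tif")  # hypothetical path

base_stage = 700  # assumed starting elevation, ~10 ft below the valley floor
for step in range(20):                     # 20 keyframes, 5 ft apart
    stage = base_stage + 5 * step
    flooded = Con(dtm <= stage, 1)         # 1 = inundated, NoData = dry
    flooded.save(r"C:\lab7\flood_{0}ft.tif".format(stage))
```

A plain bathtub model like this floods every enclosed low spot regardless of connectivity to the river, which is exactly the artifact the reshaped water feature worked around.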

Results







Sources

Lidar LAS data for a portion of the Chippewa Valley is from the City of Eau Claire, WI. NAIP imagery is from United States Department of Agriculture Geospatial Data Gateway.

Tuesday, November 15, 2016

Lab 6: Topo-bathy Applications

Goals and Background

The main purpose of the lab is to provide me with hands-on experience working with topo-bathy lidar. The lab will demonstrate how to perform basic QA/QC, generate and conflate shoreline breaklines, and enforce shoreline behavior when generating topo-bathy lidar Digital Terrain Model (DTM) and hillshade products.

Production of extensive topo-bathy derivatives such as flood models is extremely dependent on high computing power. We do not have access to such computers here, so the products of this lab are just the tip of the iceberg of what can be done with the data.

Methods

Before starting, a backup copy of the data was created.

QA/QC

I will be using the Windows version of LP360 for the QA/QC portion of the lab.

I first opened the LAS dataset and NAIP imagery in LP360 (Fig. 1). After opening the LAS file and the NAIP imagery I inspected the data for errors made during the classification. The data was provided with the ground and water classified. Numerous errors were noticed immediately upon inspection. In some areas there were multiple elevation levels classified as ground (Fig. 2). The water was classified as Class 31 (reserved) for some unknown reason. The largest issue for our lab purposes was that many ground points were left unclassified (Fig. 3).

(Fig. 1) LAS files and NAIP imagery of study area opened in LP360.

(Fig. 2) Ground classification error displayed in the profile window. Note the various elevations of ground classified points.
(Fig. 3) Unclassified ground points displayed in the 3D viewer of LP360. Examine the window on the right and notice all of the gray points (unclassified) intermixed with classified ground points.
A fellow classmate, Max, came up with an effective plan to help fill in the unclassified ground points. We utilized a height filter to classify the remaining points lying from water level (0 elevation) to a minute bit higher (0.1 elevation) as ground (Fig. 4).

(Fig. 4) Height filter parameters set in the point cloud task window.
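The same height-filter fix can be expressed outside LP360. A hedged laspy (2.x) sketch, taking the lab's 0 to 0.1 band at face value as elevations (in LP360 the filter may be defined differently); the file name is a placeholder:

```python
import laspy
import numpy as np

las = laspy.read("topo_bathy_tile.las")  # placeholder name

cls = np.array(las.classification)
# Unclassified points (class 1) in the 0 - 0.1 elevation band become ground (2)
band = (las.z >= 0.0) & (las.z <= 0.1) & (cls == 1)
cls[band] = 2
las.classification = cls

las.write("topo_bathy_tile_fixed.las")
```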

Breakline creation

The next step of the lab was to create breaklines for both the shoreline edge and the land area. I utilized the same methods as Lab 5 to create 2 separate shapefiles (Fig. 5). The created shapefiles would be used in the next step of creating the DTM and the hillshade. I was able to conflate the shoreline shapefile as I digitized it in the LP360 extension for ArcMap. The land area was not as kind during the digitization process; I ended up digitizing the land area as a 2D shapefile and conflating it in the Windows version of LP360 following the same process as Lab 5.

(Fig. 5) Image displaying the shapefiles created for the shoreline and the land area.

Generation of topo-bathy seamless DTM

The final step of the lab was to produce a DTM and hillshade for the study area. Utilizing the Export Wizard, I set the parameters very similarly to Lab 5. The parameters were set to the following:

  • Export Type: Surface
  • Source Points: 2 Ground, 11 Road surface, 31 Reserved (Water Points)
  • Surface Method: Triangulation (TIN)
  • Cell Edge Length: 2
  • Surface Attribute(s) to Export: Elevation and Hillshade
  • Export Format: Binary Raster
  • Breakline enforcement was set to utilize the 2 shapefiles created with the Elevation set to shape.
  • Perform-On-the-Fly Topology Corrections was checked.
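The export boils down to a breakline-constrained TIN that is then rasterized. A rough arcpy 3D Analyst analogue of that idea (paths, field names, and surface-feature arguments are assumptions, not the wizard's internals):

```python
import arcpy

arcpy.CheckOutExtension("3D")

# Constrained TIN from ground points plus hard shoreline breaklines
arcpy.ddd.CreateTin(
    r"C:\lab6\topo_bathy_tin",
    in_features=r"C:\lab6\ground_pts.shp Shape.Z Mass_Points <None>; "
                r"C:\lab6\shoreline.shp Shape.Z Hard_Line <None>")

# Rasterize the TIN at the 2-unit cell size used in the wizard
arcpy.ddd.TinRaster(r"C:\lab6\topo_bathy_tin", r"C:\lab6\topo_bathy_dtm.tif",
                    data_type="FLOAT", method="LINEAR",
                    sample_distance="CELLSIZE 2")
```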

Results

(Fig. 6) DTM result displayed in LP360. The extent is zoomed in to see the detail.



(Fig. 7) DTM result displayed in LP360 showing a few errors which resulted.

Discussion

Examining Fig. 7 you can see an error in the DTM. The breaklines which were created should have excluded this area, but for some reason they did not. Upon closer examination there are a few other areas with similar issues. All of these areas are outside of the breakline. Further investigation is required to determine the source of the errors.

Sources

LAS, tile index, and metadata for Hiawatha National Forest coastal area in Delta County, MI are from NOAA Office for Coastal Management. NAIP imagery is from United States Department of Agriculture Geospatial Data Gateway.


Monday, November 7, 2016

Lab 5: Breakline Creation, Conflation & Enforcement

Goals and Background

The purpose of the lab is to provide me with the hands-on skills required to create and enforce hard breaklines in Lidar data to produce high quality derivatives for various applications. During this lab I will create my own breaklines and examine them for topological issues. I will also be conflating breaklines while inspecting the integrity of the breakline elevation. The final step of the lab will have me enforcing the breakline constraints while creating a digital terrain model (DTM) along with contours.

Methods

QA/QC of breaklines

I started this lab by performing QA/QC on the breaklines used in Lab 1 to classify the water. I opened the breaklines in LP360 with the NAIP imagery and inspected all of them. There were a few location issues, but since the classification had already been completed I was instructed not to correct them (Fig. 1).

(Fig. 1) Image displaying breakline error. You can see the breakline (blue polygon feature) extends to the road surface.


Conflation of breaklines

The breaklines I was provided did not have a Z value (elevation) applied. Conflation determines the Z value for the breakline features. I opened the LAS, NAIP, and breakline files from the previous labs in LP360 and created a new Point Cloud Task. The task type was set to Conflation. The Tool Geometry was set to SHP Layer and the input was the breakline shapefile from the previous labs. Source points were set to Ground and Water. Under Data Types, the Field was set to WaterType (a field in the attribute table of the breakline shapefile).

First I set the parameters for Pond or Lake, with the Conflation Method set to Summarize Z, which computes one or more Z values for the input geometry as a whole. I selected Mean Z, Minimum Z, and Maximum Z. A distance of 5.0 map units was set to compute Z values. I unchecked Classify Points within Closed Lines.

Second, I set the parameters for Island. The Conflation Method was set to Drape and the description was set to Pure Drape. Again I unchecked Classify Points within Closed Lines.

Third, I set the parameters for River, exactly the same as for Island.


(Fig. 2) Point cloud task window displaying the parameters for the conflation of the ponds, lakes, rivers, and islands.

I executed the point cloud task, which created a new shapefile with the Z values applied.
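In plain terms, Summarize Z gathers the source points within the 5.0 map-unit distance of the breakline and reduces them to single statistics for the whole feature. A toy illustration of that idea (geometry and points are made up), not LP360's implementation:

```python
import numpy as np
from shapely.geometry import Point, Polygon

pond = Polygon([(0, 0), (50, 0), (50, 40), (0, 40)])   # breakline geometry
points = np.array([[10, 10, 231.2],                     # x, y, z ground/water
                   [30, 25, 231.4],
                   [48, 38, 231.1],
                   [80, 60, 240.0]])                    # too far away, ignored

search = pond.buffer(5.0)                               # 5.0 map-unit distance
zs = [z for x, y, z in points if search.contains(Point(x, y))]
mean_z, min_z, max_z = np.mean(zs), np.min(zs), np.max(zs)
```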

Z conflation of breaklines for wide rivers

The following section was performed in the LP360 Extension for ArcMap.

Creating two separate shapefiles was the first step in this section of the lab: one for the riverbank outline and one for the centerline of the river. Next, I started an editing session in ArcMap and created two Custom Conflation Tasks through the Conflate Task Manager on the Digitized Breaklines toolbar. The conflation method for the river centerline was set to Downstream Constraint to make sure the river elevation decreased as it went downstream. The conflation method for the river outline was set to Drape. Digitizing the outline of the river was the next task; I had to make sure I was close to the river edge but not in the water. I was able to activate the profile window and display by classification to make sure I was selecting ground points. I then digitized the river centerline after I completed the outline. When finished, I examined the shape of the digitization at the full extent of the viewer (Fig. 3).

(Fig. 3) Image displaying the digitized outline and centerline of the river.
Hydro-flattening of pond and lakes

This section of the lab had me applying the conflated features to hydro-flatten the water features in the study area from the first 4 labs.

This section of the lab was performed in LP360.

After clicking the Export Lidar Data command I set the following parameters in Step 1.

  • Export type: Surface
  • Source Points: 2 Ground
  • Surface Method: Triangulation (TIN)
  • Cell Edge Length: 3
  • Surface Attributes to Export: Elevation and Hillshade
  • Export format: Binary Raster
Within the Breakline Enforcement window I set the following parameters:
  • Check Use Breakline Enforcement box
  • Type Field: WaterType
  • Set Island, Ponds or Lakes, and River Elevation field to Shape
  • Checked the box to Perform-On-the-Fly Topology Corrections
  • Set Buffer Classes: 9 Water
I created a file name and generated the DTM image.

Extraction of contours

The next step of the lab was to create contours for a section of the study area from the first 4 labs. Creating contours for the entire study area was computationally demanding, so we only did a small area.

Opening the Export Wizard again, I set the same parameters in the first window as in the Hydro-flattening section, except I selected Contours from the Surface Attributes to Export instead of Elevation and Hillshade. Once Contours was checked I selected the Contour tab and set the parameters in the General tab as displayed in Fig. 4. The parameters I used in the Annotation tab are displayed in Fig. 5. In Step 2 I selected Draw Window in Map and used the Draw Window Tool to select an area in which to generate the contours.

(Fig. 4) General tab parameters under the Contour tab in the Export Wizard.

(Fig. 5) Annotation tab parameters under the Contour tab in the Export Wizard.
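For comparison, contours can also be cut from an exported DTM with ordinary Spatial Analyst tools. A sketch assuming hypothetical paths and a 2 ft interval (the lab set its interval in the LP360 dialog instead):

```python
import arcpy

arcpy.CheckOutExtension("Spatial")

# Contour the hydro-flattened DTM; interval and paths are assumptions
arcpy.sa.Contour(r"C:\lab5\dtm_hydroflat.tif",
                 r"C:\lab5\contours.shp",
                 contour_interval=2)
```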

River hydro-flattening and downstream constraint

The following methods were performed in the LP360 Extension for ArcMap.

The purpose of this section is to use the river shapefiles created in the previous segment to hydro-flatten the river feature.

The process was the same as hydro-flattening the ponds and lakes. Using the Export Wizard I set the parameters the same way but utilized the conflated shapefiles I created.

Results



(Fig. 6) Display of properly hydro-flattened lake in the DTM/Hillshade combination.


(Fig. 7) Display of a lake which was not correctly hydro-flattened due to incorrect island digitization and/or labeling.

(Fig. 8) Display of a portion of the hydro-flattened river in the DTM/Hillshade file.

Discussion

In Fig. 7 and Fig. 8 you can see some imperfections in the results. The errors displayed in Fig. 7 are the result of an island in the middle of the water feature. The islands have to be correctly digitized and labeled for an effective hydro-flattened image; as displayed, the program is trying to flatten the island instead of excluding it. In Fig. 8 you can see some horizontal lines on the river. After talking with my professor, I learned the cause is related to my digitization: when I digitized the shoreline of the river I selected areas too high on the river bank, and their higher elevations caused errors in the hydro-flattening. You can see on the point-bar side of the river that the hydro-flattening was very effective, as the shoreline was very flat with a low slope.

Sources

LAS, tile index, and metadata for Lake County are from Illinois Geospatial Data Clearinghouse. NAIP imagery is from United States Department of Agriculture Geospatial Data Gateway. Breaklines are from Chicago Metropolitan Agency for Planning. Lidar LAS data for a portion of the Chippewa Valley is from the City of Eau Claire, WI.


Tuesday, October 25, 2016

Lab 4: Quality Assurance & Quality Control (QA/QC)

Goals and Background

The purpose of this lab is to gain hands-on experience evaluating processed Lidar data across various types of accuracy. Lidar data is utilized for highly sensitive projects where accuracy is critical to meeting the expectations of the project needs. Throughout the lab I will be evaluating the vertical, horizontal, and classification accuracy of the data from Labs 1 through 3.

Methods

Unless stated otherwise, all of the following procedures were performed in LP360.

The data I will be assessing the accuracy of is the same data which I classified in Labs 1-3.

As always, a backup copy of the data was made before starting any of the following processes.
Point Cloud Density Analysis

During this part of the lab I will assess the density of points across the study area to determine if there are any areas which do not meet the proper requirements. Should any areas fail to meet the requirements, they should be noted and labeled as Low Confidence.

I utilized the Stamp Tool to extract statistics for small areas to look at the variation in point density. The Stamp Tool, however, only covers a small area at a time and would take a large amount of time to cover the entire study area.

Instead I used the Export Lidar Data button to open the LP360 Export Wizard to create a display of the point density across the entire study area. With the Export Wizard open I utilized information from the statistics I calculated in Lab 1 and parameters defined by my professor to extract the point density for the study area.

In Step 1 of the Export Wizard I set the following parameters (Fig. 1-3):

  • Export Type to Surface
  • Surface Method to Point Insertion (PI)
  • Cell Edge Length set to 5.88 (2 * NPS, where the NPS of 2.94 was taken from the Lab 1 statistics)
  • Surface Attributes to Export checked Density box
  • Source Points were set to First Returns

(Fig. 1) Export wizard with Export Type, Surface Method, Cell Edge Length, Surface Attributes to Export, and Sources Points set to proper parameters.






  • Scan Angle set to a minimum of -13.5 to a maximum of 13.5 (Fig. 2)

(Fig. 2) Scan Angle set in the Export Wizard window.

  • Interval set to 4
  • Point Density set to .11 (taken from statistics)
  • Units set to feet

(Fig. 3) Export Wizard with Interval, Point Density, and Units set to the correct parameters.
Point Cloud Spatial Distribution Analysis

During this section of the lab I will be determining if there is at least one Lidar point per cell. I will be using ModelBuilder within ArcCatalog to perform the analysis.

Utilizing the result from the Point Density Analysis above, I used Raster Calculator to subtract the three bands from one another to create my output image (Fig. 4).

(Fig. 4) ModelBuilder in ArcCatalog displaying the calculation used to create the spatial distribution of points.
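The band arithmetic itself is one line of map algebra. A sketch with an assumed three-band density raster (which return type lands in which band depends on the export settings):

```python
import arcpy
from arcpy.sa import Raster

arcpy.CheckOutExtension("Spatial")

b1 = Raster(r"C:\lab4\density.tif\Band_1")  # hypothetical multiband density raster
b2 = Raster(r"C:\lab4\density.tif\Band_2")
b3 = Raster(r"C:\lab4\density.tif\Band_3")

# Cells where the bands disagree flag gaps in the point distribution
distribution = b1 - b2 - b3
distribution.save(r"C:\lab4\spatial_distribution.tif")
```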

Relative Accuracy Assessment

Creating and analyzing a DZ ortho image

A swath to swath analysis of the Lidar data is one accuracy assessment which needs to be completed. The swath to swath analysis inspects the alignment of the Lidar data where the flight lines overlap.

The first step is to create a DZ ortho image to perform the analysis on. I activated the QA/QC Toolbar in LP360 and then added the LAS and NAIP imagery from the previous labs. I set the Point Filter to display First Return values only. Next I set the Legend Type to Display by Elevation Difference.

Now I used the Export Lidar Data icon to open the Export Wizard. I set the following parameters per the guidance of my professor (Fig. 5-6):


  • Export Type to Surface
  • Source Points to All Returns
  • Surface Method to Point Insertion (PI)
  • Cell Edge Length to 10
  • Surface Attributes selected dZ Images

(Fig. 5) Export Wizard window with parameters set in the Surface tab.

  • Intervals set to 5
  • Interval Size set to .04

(Fig. 6) Export Wizard window with parameters set in the Dz tab.
Swath to Swath Analysis

The following method was performed in LP360 for ArcGIS, as the Windows version of LP360 does not have the capability.

I will be performing the analysis in non-vegetated areas which are flat.

I opened the resulting Dz image in ArcMap along with the NAIP imagery. Next I created a polyline feature class with the same coordinate system as the imagery. Then I used the profile tool in the Windows version of LP360 to double check areas which I presumed were flat. Then I created a line on both sides of the swath line (Fig. 7). I continued this process throughout all of the swath lines where there were non-vegetated flat areas.

(Fig. 7) Polyline feature class created on the edge of the swath line in ArcMap.

The next step was to use the Seamline Analysis tool on the LP360 QA/QC toolbar to perform the analysis. The parameters were set to the following:

  • Sample Distance set to 5
  • Search Radius set to 1
  • Check the box to Omit no-datas from Outputs
  • Modify point filter to Use class 2 Ground

Then I displayed the results in ArcMap using Graduated Symbols (Fig. 8). Using the legend created, I looked for any symbols which are red or blue, as those fall outside of the acceptable range. I only had two green circles, which are still in the acceptable range (Fig. 9).

(Fig. 8) Settings for displaying the Graduated Symbols for my Swath to Swath Analysis.

(Fig. 9) Green graduated symbol displayed on the edge of a swath line.

Absolute Accuracy Assessment

Non-vegetated vertical accuracy

The first step in checking the vertical accuracy is to create a shapefile from the GPS locations provided to me. I utilized ArcMap to perform the task.

Then I added the shapefile to LP360, which displayed the points in my study area (Fig. 10).

(Fig. 10) Vertical GCP shapefile displayed by light blue dots in LP360.

Next I set the Control Points to the vertical GCP shapefile I created in the previous step. Then I set the Elevation Field to Shape. After opening the Control Points Report Dialog I set the Source Points to Ground class, the Interpolation Method to Triangulation (TIN), and the Z Probe Location to Control XY (Fig. 11). Then I clicked the Calculate DZ button to run the calculation.

(Fig. 11) Vertical accuracy calculation in LP360.
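Under the hood the Control Points Report is simple statistics on dz = lidar surface minus GCP elevation. A worked sketch with made-up residuals, using the ASPRS convention that non-vegetated vertical accuracy at 95% confidence is 1.96 * RMSEz:

```python
import math

dz = [0.05, -0.03, 0.08, -0.11, 0.02]   # made-up per-GCP residuals (feet)

rmse_z = math.sqrt(sum(d * d for d in dz) / len(dz))
nva_95 = 1.96 * rmse_z                   # 95% non-vegetated vertical accuracy

print(round(rmse_z, 3), round(nva_95, 3))
```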

Horizontal Accuracy

I created another shapefile based on the Horizontal Accuracy GPS points I was provided, then opened it in LP360. I changed the Control Points to the Horizontal GCP shapefile I created and calculated the DZ with the same method as the Vertical Accuracy above, except the Source Points were set to All Points. There is no automatic horizontal accuracy measure because there is no measured point to compare to. In real world cases there would be visual X's on the NAIP imagery to compare locations against, but for our class purposes we do not have any. I selected the center of the point for each GCP location (Fig. 12). Once enough points are selected, the accuracy tables in the bottom are completed and updated as you add points.

(Fig. 12) Measuring for horizontal accuracy in LP360. The green + symbol is the selected location for the Horizontal GCP based on the location of the original point.

Manual QA/QC of Classification Errors

Identifying classification errors

The first step in manual QA/QC is to identify errors within the classification. Utilizing the Profile and 3D views in LP360 along with the NAIP imagery, I examined the classification I completed in the previous labs for errors. Once I located an error I created a QA/QC shapefile which allowed me to digitize around the area and label it with a description (Fig. 13). There are more errors in the classification than time will allow to identify, so I was assigned to identify 20 error locations. I used the same method to digitize the remaining 19 error locations as I located them.

(Fig. 13) Digitizing an error location in LP360 with descriptions displayed in the Attribute Editor.

Fixing classification errors

To correct the identified errors I used the same method from Lab 3 when I performed the manual cleanup. Additionally, I used the Classify by Paint Brush tools in the Classification Tool Bar. These tools work the same way but from the aerial view instead of the profile view. You have to set the Destination Class and the Source Points the same way.

(Fig. 14) Inspecting an error location where a few building points have been classified as vegetation.

(Fig. 15) Display of a fixed classification error. The center building has been cleaned up of the erroneous vegetation classification.


Results

The point cloud density result displays the highest density areas in bright green (Fig. 16). The areas in orange have lower point density. The numerous areas in black are water, which have few to no points due to water absorption of the Lidar pulse.

(Fig. 16) Result of the Point Cloud Density calculation.

The point cloud spatial distribution result is displayed below (Fig. 17). The dark gray areas meet the specification for the distribution. The areas in green (not the bright green lake areas) are areas which do not meet the specification for the spatial distribution.

(Fig. 17) Point cloud spatial distribution result.

Below is the Dz ortho image I created to perform the swath to swath analysis (Fig. 18). The image displays the areas of the study area where the flight lines overlapped.

(Fig. 18) Dz Ortho image used in the swath to swath analysis.
Sources

LAS, tile index, and metadata for Lake County are from Illinois Geospatial Data Clearinghouse. NAIP imagery is from United States Department of Agriculture Geospatial Data Gateway. Breaklines are from Chicago Metropolitan Agency for Planning.