EAS 648 Lab Assignment 2

Links:

Course Repository: https://github.com/HumblePasty/EAS648

Lab02 Webpage: https://humblepasty.github.io/EAS648/Lab/02/

Lab02 Repository: https://github.com/HumblePasty/EAS648/tree/master/Lab/02

Task Description:

Develop a research methodology guided by a LiDAR dataset (the one provided or data found from the list below) aimed at exploring its potential applications and insights. Reflect on the perspectives and revelations drawn from your visualization of this data.

Critically analyze the advantages and complexities inherent in the utilization of 3D geovisualization. Additionally, provide an assessment of the trajectory of 3D technology, including immersive environments, within the academic and professional landscape. (Maximum 4 paragraphs)

Part I: LiDAR Research Methodology

Task:

Develop a research methodology guided by a LiDAR dataset (the one provided or data found from the list below) aimed at exploring its potential applications and insights. Reflect on the perspectives and revelations drawn from your visualization of this data.

1. Research Question

Description:

In this section I develop a research methodology, as required, for change detection using LiDAR data, implemented as an example that compares the 2020 and 2022 acquisitions over Washington, DC.

2. Data Source

The LiDAR data used for this research methodology example come from NOAA Digital Coast (https://coast.noaa.gov/digitalcoast/data/).

The implemented change detection example below is based on LiDAR data collected over Washington, DC in 2020 and 2022. The two datasets used are:

The georeferencing in the files refers to generic NAD83 (EPSG 4269), but all files are actually in a more recent realization of NAD83, such as NAD83(CORS96), NAD83(HARN), NAD83(NSRS2007), or NAD83(2011).
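
To confirm which realization a given tile actually carries, the LAS header can be inspected before the two acquisitions are combined. Below is a minimal sketch, assuming lidR ≥ 4.0 (loaded in the next section) so that sf::st_crs() accepts a LAS header; the file path reuses one of the tiles loaded later in this lab.

# Minimal sketch: read only the LAS header and print its CRS / vertical datum.
library(lidR)
hdr <- readLASheader("Data/AnotherTile/20200630_1914.laz")
sf::st_crs(hdr)   # reports the compound CRS, e.g. "NAD83(2011); NAVD88 height"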

3. Implementation

Dependencies

# Loading the required libraries
library(sf)
## Linking to GEOS 3.11.2, GDAL 3.7.2, PROJ 9.3.0; sf_use_s2() is TRUE
library(raster)
## Loading required package: sp
library(terra)
## terra 1.7.55
library(tmap)
## Breaking News: tmap 3.x is retiring. Please test v4, e.g. with
## remotes::install_github('r-tmap/tmap')
library(tmaptools)
library(lidR)
## 
## Attaching package: 'lidR'
## The following object is masked from 'package:terra':
## 
##     is.empty
## The following objects are masked from 'package:raster':
## 
##     projection, projection<-
## The following object is masked from 'package:sf':
## 
##     st_concave_hull
#library(RStoolbox)
library(ForestTools)
library(ggplot2)
library(gstat)

library(leaflet)


3.1 Showing the Research Area

# loading the tile index for 2020
lidar_index_2020 <- st_read("Data/tileindex_2020/dc2020_dc_index.shp")
## Reading layer `dc2020_dc_index' from data source 
##   `D:\UMich\Fall23\EAS648\Repo\Lab\02\Data\tileindex_2020\dc2020_dc_index.shp' 
##   using driver `ESRI Shapefile'
## Simple feature collection with 328 features and 3 fields
## Geometry type: POLYGON
## Dimension:     XY
## Bounding box:  xmin: -77.12227 ymin: 38.78554 xmax: -76.90087 ymax: 39.00174
## Geodetic CRS:  NAD83(NSRS2007)
lidar_index_2020 <- st_transform(lidar_index_2020, 4326)

# loading the tile index for 2022
lidar_index_2022 <- st_read("Data/tileindex_2022/dc2022_wash_dc_index.shp")
## Reading layer `dc2022_wash_dc_index' from data source 
##   `D:\UMich\Fall23\EAS648\Repo\Lab\02\Data\tileindex_2022\dc2022_wash_dc_index.shp' 
##   using driver `ESRI Shapefile'
## Simple feature collection with 328 features and 3 fields
## Geometry type: POLYGON
## Dimension:     XY
## Bounding box:  xmin: -77.12227 ymin: 38.78554 xmax: -76.90087 ymax: 39.00174
## Geodetic CRS:  NAD83(NSRS2007)
lidar_index_2022 <- st_transform(lidar_index_2022, 4326)

# showing the index layer for 2020
# (note: par(mfrow = c(1, 2)) has no effect on leaflet htmlwidgets, so the two maps render separately)
leaflet(lidar_index_2020) %>%
  addTiles() %>%
  addPolygons(color = "#444444", weight = 1, smoothFactor = 0.5,
    opacity = 1.0, fillOpacity = 0.05,
    fillColor =NA,
    highlightOptions = highlightOptions(color = "blue", weight = 2,
      bringToFront = TRUE),
            popup = paste("LiDAR Index: ", lidar_index_2020$Index))
# showing the index layer for 2022
leaflet(lidar_index_2022) %>%
  addTiles() %>%
  addPolygons(color = "#444444", weight = 1, smoothFactor = 0.5,
    opacity = 1.0, fillOpacity = 0.05,
    fillColor =NA,
    highlightOptions = highlightOptions(color = "blue", weight = 2,
      bringToFront = TRUE),
            popup = paste("LiDAR Index: ", lidar_index_2022$Index))

The Research Tile

las_index_tile = lidar_index_2020[lidar_index_2020$Index == 89, ]
leaflet(las_index_tile)%>%
  addTiles() %>%
  addPolygons(color = "#444444", weight = 1, smoothFactor = 0.5,
    opacity = 1.0, fillOpacity = 0.3,
    fillColor ="green",
    highlightOptions = highlightOptions(color = "blue", weight = 2,
      bringToFront = TRUE),
            popup = paste("Research Tile Index: ", las_index_tile$Index))

3.2 Loading and Adjusting the LAZ Data

# load laz file
# las_2020 = readLAS("Data/20200630_1219.laz")
# las_2022 = readLAS("Data/20220124_1219.copc.laz")

# Note: the header summaries below suggest the year labels may be swapped
# (the "2020" object reads a file created in 2022 and vice versa); worth verifying.
las_2020_new = readLAS("Data/AnotherTile/20220124_1914.copc.laz")
## Warning: Point-clouds with lon/lat coordinates are not well supported.
las_2022_new = readLAS("Data/AnotherTile/20200630_1914.laz")
## Warning: Point-clouds with lon/lat coordinates are not well supported.
# No need to set coordinate reference systems (CRS)
# EPSG code is provided in the metadata: 4269
# CRS("+init=epsg:4269")
# epsg(las_2020) = 4269
summary(las_2020_new)
## class        : LAS (v1.4 format 6)
## memory       : 124.9 Mb 
## extent       : -77.04841, -77.03918, 38.87922, 38.88643 (xmin, xmax, ymin, ymax)
## coord. ref.  : NAD83(2011); NAVD88 height 
## area         : 0.62 kunits²
## points       : 1.72 million points
## density      : 2.8 points/units²
## density      : 2.54 pulses/units²
## File signature:           LASF 
## File source ID:           0 
## Global encoding:
##  - GPS Time Type: Standard GPS Time 
##  - Synthetic Return Numbers: no 
##  - Well Know Text: CRS is WKT 
##  - Aggregate Model: false 
## Project ID - GUID:        00000000-0000-0000-0000-000000000000 
## Version:                  1.4
## System identifier:        NOAA OCM 
## Generating software:      datum_shift (Version: v1.0-2743 
## File creation d/y:        130/2022
## header size:              375 
## Offset to point data:     1211 
## Num. var. length record:  2 
## Point data format:        6 
## Point data record length: 30 
## Num. of point records:    1723705 
## Num. of points by return: 1565243 149279 9027 155 1 0 0 0 0 0 0 0 0 0 0 
## Scale factor X Y Z:       1e-07 1e-07 0.01 
## Offset X Y Z:             -77 38 0 
## min X Y Z:                -77.04841 38.87922 0.1 
## max X Y Z:                -77.03918 38.88643 32.36 
## Variable Length Records (VLR):
##    Variable Length Record 1 of 2 
##        Description: COPC info VLR 
##    Variable Length Record 2 of 2 
##        Description:  
##        WKT OGC COORDINATE SYSTEM: COMPD_CS["NAD83(2011); NAVD88 height",GEOGCS["NAD83(2011)",DATUM["NAD8 [...] (truncated)
## Extended Variable Length Records (EVLR):
##    Extended Variable length record 1 of 2 
##        Description: EPT Hierarchy 
##    Extended Variable length record 2 of 2 
##        Description:
summary(las_2022_new)
## class        : LAS (v1.4 format 6)
## memory       : 256.1 Mb 
## extent       : -77.04841, -77.03918, 38.87922, 38.88643 (xmin, xmax, ymin, ymax)
## coord. ref.  : NAD83(2011); NAVD88 height 
## area         : 0.64 kunits²
## points       : 3.36 million points
## density      : 5.25 points/units²
## density      : 4.75 pulses/units²
## File signature:           LASF 
## File source ID:           0 
## Global encoding:
##  - GPS Time Type: Standard GPS Time 
##  - Synthetic Return Numbers: no 
##  - Well Know Text: CRS is WKT 
##  - Aggregate Model: false 
## Project ID - GUID:        00000000-0000-0000-0000-000000000000 
## Version:                  1.4
## System identifier:        NOAA OCM 
## Generating software:      datum_shift (Version: v1.0-328- 
## File creation d/y:        280/2020
## header size:              375 
## Offset to point data:     5673 
## Num. var. length record:  2 
## Point data format:        6 
## Point data record length: 30 
## Num. of point records:    3356610 
## Num. of points by return: 3040163 274150 40388 1875 34 0 0 0 0 0 0 0 0 0 0 
## Scale factor X Y Z:       1e-07 1e-07 0.01 
## Offset X Y Z:             -77 38 0 
## min X Y Z:                -77.04841 38.87922 -0.41 
## max X Y Z:                -77.03918 38.88643 32.56 
## Variable Length Records (VLR):
##    Variable Length Record 1 of 2 
##        Description: Classification Lookup 
##    Variable Length Record 2 of 2 
##        Description: by LAStools of rapidlasso GmbH 
##        WKT OGC COORDINATE SYSTEM: COMPD_CS["NAD83(2011); NAVD88 height",GEOGCS["NAD83(2011)",DATUM["NAD8 [...] (truncated)
## Extended Variable Length Records (EVLR):  void

Visualize

# plot(las_2020)
# plot(las_2022)

The original LAZ data are stored in longitude and latitude, so the point coordinates need to be projected before further processing.

# Here I use UTM projection in zone 17
# utm_crs <- st_crs(paste0("+proj=utm +zone=17 +datum=WGS84 +units=m +no_defs"))

# project the las data
epsg_nad83_utm17n <- 26917
temp_las = las_2020_new@data
temp_coords <- as.data.frame(temp_las)
temp_coords <- st_as_sf(temp_coords, coords = c("X", "Y"), crs = 4269)

# Transform
coords_utm <- st_transform(temp_coords, crs = epsg_nad83_utm17n)
# coords_utm <- as.data.frame(coords_utm)
xy_coords <- st_coordinates(coords_utm)
head(xy_coords)
##             X       Y
## [1,] 842785.1 4311523
## [2,] 842785.8 4311521
## [3,] 842791.8 4311525
## [4,] 842786.1 4311520
## [5,] 842793.1 4311524
## [6,] 842792.6 4311524
# overwrite the x and y column in las data
las_2020_new@data$X <- xy_coords[,"X"]
las_2020_new@data$Y <- xy_coords[,"Y"]
head(las_2020_new@data)
##           X       Y     Z   gpstime Intensity ReturnNumber NumberOfReturns
## 1: 842785.1 4311523 21.15 327089562      1674            1               2
## 2: 842785.8 4311521 20.36 327089562      3999            1               2
## 3: 842791.8 4311525 20.35 327089562      2511            1               2
## 4: 842786.1 4311520 21.81 327089562      3162            1               1
## 5: 842793.1 4311524 21.52 327089562      2418            1               2
## 6: 842792.6 4311524 21.23 327089562      3348            1               2
##    ScanDirectionFlag EdgeOfFlightline Classification ScannerChannel
## 1:                 1                0              5              0
## 2:                 1                0              5              0
## 3:                 0                0              5              0
## 4:                 1                0              5              0
## 5:                 0                0              5              0
## 6:                 0                0              5              0
##    Synthetic_flag Keypoint_flag Withheld_flag Overlap_flag ScanAngle UserData
## 1:          FALSE         FALSE         FALSE        FALSE     0.996        1
## 2:          FALSE         FALSE         FALSE        FALSE     0.996        1
## 3:          FALSE         FALSE         FALSE        FALSE     0.996        1
## 4:          FALSE         FALSE         FALSE        FALSE     0.996        1
## 5:          FALSE         FALSE         FALSE        FALSE     0.996        1
## 6:          FALSE         FALSE         FALSE        FALSE     0.996        1
##    PointSourceID
## 1:            11
## 2:            11
## 3:            11
## 4:            11
## 5:            11
## 6:            11
# plot(las_2020_new, color = "Z")
### Repeat for the 2022 data
temp_las = las_2022_new@data
temp_coords <- as.data.frame(temp_las)
temp_coords <- st_as_sf(temp_coords, coords = c("X", "Y"), crs = 4269)

# Transform
coords_utm <- st_transform(temp_coords, crs = epsg_nad83_utm17n)
# coords_utm <- as.data.frame(coords_utm)
xy_coords <- st_coordinates(coords_utm)
head(xy_coords)
##             X       Y
## [1,] 843332.6 4311625
## [2,] 843332.1 4311624
## [3,] 843333.0 4311625
## [4,] 843335.4 4311625
## [5,] 843334.7 4311624
## [6,] 843334.8 4311625
# overwrite the x and y column in las data
las_2022_new@data$X <- xy_coords[,"X"]
las_2022_new@data$Y <- xy_coords[,"Y"]
head(las_2022_new@data)
##           X       Y    Z   gpstime Intensity ReturnNumber NumberOfReturns
## 1: 843332.6 4311625 8.57 277566029     15840            1               2
## 2: 843332.1 4311624 1.09 277566029      5445            2               2
## 3: 843333.0 4311625 1.33 277566029      3267            3               3
## 4: 843335.4 4311625 1.41 277566029      4752            2               2
## 5: 843334.7 4311624 1.24 277566029      3762            2               2
## 6: 843334.8 4311625 8.45 277566029     17028            1               2
##    ScanDirectionFlag EdgeOfFlightline Classification ScannerChannel
## 1:                 1                0              5              0
## 2:                 1                0              2              0
## 3:                 1                0              2              0
## 4:                 0                0              2              0
## 5:                 0                0              2              0
## 6:                 0                0              5              0
##    Synthetic_flag Keypoint_flag Withheld_flag Overlap_flag ScanAngle UserData
## 1:          FALSE         FALSE         FALSE         TRUE         9        1
## 2:          FALSE         FALSE         FALSE         TRUE         9        1
## 3:          FALSE         FALSE         FALSE         TRUE         9        1
## 4:          FALSE         FALSE         FALSE         TRUE         9        1
## 5:          FALSE         FALSE         FALSE         TRUE         9        1
## 6:          FALSE         FALSE         FALSE         TRUE         9        1
##    PointSourceID
## 1:            14
## 2:            14
## 3:            14
## 4:            14
## 5:            14
## 6:            14
# plot(las_2022_new, color = "Z")
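
As an alternative to overwriting the X and Y columns by hand, lidR (assuming version ≥ 4.0, which provides an st_transform() method for LAS objects) can reproject the whole point cloud in one call, updating the coordinates, the header bounding box, and the CRS together. The sketch below uses hypothetical object names (las_2020_raw, las_2020_utm); note that Washington, DC at about 77°W falls in UTM zone 18N, so EPSG:26918 may be a more natural target than zone 17N.

# Hedged sketch: reproject the freshly read point cloud with sf::st_transform(),
# which keeps the LAS header consistent with the new coordinates.
# EPSG:26918 = NAD83 / UTM zone 18N (DC lies in zone 18).
las_2020_raw <- readLAS("Data/AnotherTile/20220124_1914.copc.laz")
las_2020_utm <- sf::st_transform(las_2020_raw, sf::st_crs(26918))
sf::st_crs(las_2020_utm)$epsg   # expect 26918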

Selecting the water points (Classification 9) from the data

# water_2020 = filter_poi(las_2020, Classification == 9L)
# water_2022 = filter_poi(las_2022, Classification == 9L)

water_2020 = filter_poi(las_2020_new, Classification == 9L)
## Warning: Point-clouds with lon/lat coordinates are not well supported.
water_2022 = filter_poi(las_2022_new, Classification == 9L)
## Warning: Point-clouds with lon/lat coordinates are not well supported.
head(water_2020@data)
##           X       Y    Z   gpstime Intensity ReturnNumber NumberOfReturns
## 1: 843569.8 4311406 0.34 327089037      2232            1               1
## 2: 843355.4 4311614 0.34 327089041      1953            1               1
## 3: 843344.5 4311615 0.35 327089041      2139            1               1
## 4: 842787.9 4311421 0.28 327089563      3720            1               1
## 5: 842787.6 4311420 0.27 327089563      3255            1               1
## 6: 842788.6 4311420 0.24 327089563     13392            1               1
##    ScanDirectionFlag EdgeOfFlightline Classification ScannerChannel
## 1:                 0                0              9              0
## 2:                 1                0              9              0
## 3:                 0                0              9              0
## 4:                 0                0              9              0
## 5:                 1                0              9              0
## 6:                 0                0              9              0
##    Synthetic_flag Keypoint_flag Withheld_flag Overlap_flag ScanAngle UserData
## 1:          FALSE         FALSE         FALSE        FALSE   -13.998        1
## 2:          FALSE         FALSE         FALSE        FALSE   -13.998        1
## 3:          FALSE         FALSE         FALSE        FALSE   -13.998        1
## 4:          FALSE         FALSE         FALSE        FALSE     3.000        1
## 5:          FALSE         FALSE         FALSE        FALSE     3.000        1
## 6:          FALSE         FALSE         FALSE        FALSE     3.000        1
##    PointSourceID
## 1:            10
## 2:            10
## 3:            10
## 4:            11
## 5:            11
## 6:            11
# remove the unused variables
# rm(las_2020_new, las_2022_new, temp_coords, temp_las, xy_coords, coords_utm)
ls()
##  [1] "coords_utm"        "epsg_nad83_utm17n" "las_2020_new"     
##  [4] "las_2022_new"      "las_index_tile"    "lidar_index_2020" 
##  [7] "lidar_index_2022"  "temp_coords"       "temp_las"         
## [10] "water_2020"        "water_2022"        "xy_coords"

3.3 Constructing the DTM/DSM Models

Unresolved error: the DTM / DSM model cannot be created; the call fails with the following error:

! Loop 0 is not valid: Edge 29 crosses edge 31

Note: I debugged this for a whole day and could not solve the problem. The error possibly comes from the projection: the original data are stored in latitude and longitude, and after projection the structure of the points may be distorted, causing the edge-crossing error. I am still looking for a solution.

I continued to implement the rest of the method assuming a normal result. To export the HTML correctly, I commented out the rest of the code.

# Construct DTM with KNN, TIN or kriging, errors
# dtm_2020 = grid_terrain(las_2020_new, res = 1, knnidw(k = 6, p = 2), keep_lowest = FALSE)
# dtm_2020 <- grid_terrain(las_2020_new, res = 3, algorithm = tin())
# dtm_2020 <- grid_terrain(las_2020_new, res = 16.4042, algorithm = kriging(k = 40))

# Constructing DSM
# plot(las_2020_new)
# water_dsm_2020 <- grid_canopy(las_2020_new, res = 3, pitfree(c(0,2,5,10,15)))
# water_dsm_2022 <- grid_canopy(water_2022, res = 5, pitfree(c(0,5,5,10,15)))
# 
# ## plot 2d
# dtm_2020_df = as.data.frame(dtm_2020)
# ggplot(dtm_2020_df) +
#     geom_histogram(aes(Z)) +
#     xlab("DTM Elevation Value (m)") +
#     ggtitle("Distribution of DTM Values")
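
For reference, below is a minimal sketch of how this step could look once the point clouds are in a projected CRS, reusing the hypothetical las_2020_utm / las_2022_utm objects from the reprojection sketch in Section 3.2. Dropping duplicated points with filter_duplicates() is a common precaution against degenerate triangulations; this is a sketch under those assumptions, not a verified fix for the error above.

# Sketch only (not run): DTM from the ground returns and DSM from all returns,
# both at 1 m resolution, after removing duplicated points.
las_2020_utm <- filter_duplicates(las_2020_utm)
las_2022_utm <- filter_duplicates(las_2022_utm)
dtm_2020 <- grid_terrain(las_2020_utm, res = 1, algorithm = tin())
dtm_2022 <- grid_terrain(las_2022_utm, res = 1, algorithm = tin())
dsm_2020 <- grid_canopy(las_2020_utm, res = 1, pitfree(c(0, 2, 5, 10, 15)))
dsm_2022 <- grid_canopy(las_2022_utm, res = 1, pitfree(c(0, 2, 5, 10, 15)))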

3.5 Comparing the Two Results

# change_detection <- raster::overlay(dtm_2020, dtm_2022, fun = function(x, y) { return(y - x) })
# 
# Visualize the changes
# plot(change_detection)
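
Under the same assumptions, the comparison itself reduces to raster subtraction (which works for both raster and terra layers, so raster::overlay() is not strictly required). A sketch using the DTM/DSM objects from the sketch above:

# Sketch only (not run): elevation change between the two acquisitions.
# If the two rasters differ slightly in extent or resolution, resample one
# onto the other before subtracting.
change_dsm <- dsm_2022 - dsm_2020
plot(change_dsm, main = "DSM change, 2020 to 2022")
summary(values(change_dsm))   # magnitude and direction of change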

4. Discussion

4.1 Analysis of the Results

Due to the unresolved bug, I was not able to see the change detection result directly. Here I insert two screenshots of the water LiDAR points in 2020 and 2022 for a direct visual comparison.

Water points, 2020 (screenshot)

Water points, 2022 (screenshot)

4.2 Possible Applications

The above method provides a simple example of how LiDAR-based change detection can be applied to water bodies. At a larger scale, LiDAR-based change detection can also be used for:

  1. Measuring Elevation Changes: It can detect subtle changes in elevation and topography around water bodies, which is crucial for understanding erosion, sediment deposition, and the overall geomorphology of a watershed.
  2. Volume and Depth Analysis: By creating accurate 3D models of a water body’s basin, LiDAR data can be used to calculate volume and depth, which are vital for water resource management and modeling water flow (a minimal sketch follows this list).
  3. Coastline Management: LiDAR assists in monitoring coastline changes due to natural processes like erosion or human activities, which is crucial for coastal management and mitigation planning.
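
As a small illustration of points 1 and 2, the already-filtered water returns from Section 3.2 can give a first-order estimate of water-surface elevation change between the two acquisitions. This is only a sketch: it ignores tide stage and acquisition timing, and simply compares the mean elevations of the Classification 9 points.

# First-order water-level comparison from the water (class 9) returns.
mean_ws_2020 <- mean(water_2020@data$Z, na.rm = TRUE)
mean_ws_2022 <- mean(water_2022@data$Z, na.rm = TRUE)
mean_ws_2022 - mean_ws_2020   # positive = apparent rise in the water surface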

Part II: Descriptive Questions

1. Advantages and Complexities of 3D Geovisualization

Question:

Critically analyze the advantages and complexities inherent in the utilization of 3D geovisualization.

Answer:

Advantages:

  1. Enhanced Spatial Understanding:
    • 3D geovisualization provides a more intuitive way to interpret and navigate spatial data. It allows for a realistic representation of geographical areas, which can be crucial for urban planning, environmental management, and navigation.
  2. Improved Data Integration:
    • This technology enables the integration of various types of data into a single model. This can include topographical, infrastructural, and demographic data, improving the ability to make comprehensive assessments.
  3. Interactive Analysis:
    • Users can interact with the 3D models, exploring different layers and aspects of the data. This facilitates a deeper analysis and can lead to better decision-making.
  4. Effective Communication Tool:
    • 3D visualizations are highly effective for communicating complex spatial information to a non-technical audience, making it easier to visualize potential changes in the landscape or infrastructure projects.
  5. Simulation and Forecasting:
    • It allows for the simulation of environmental and urban changes, aiding in forecasting and planning for future developments or natural disasters.

Complexities:

  1. Technical Requirements:
    • 3D geovisualization software often requires high-performance computing resources and specialized knowledge to operate effectively, which restricts its use largely to well-funded organizations and individuals with technical expertise.
  2. Data Accuracy and Quality:
    • The accuracy of a 3D geovisualization is heavily dependent on the quality of the input data. Inaccurate or low-resolution data can lead to misleading representations.
  3. Overwhelming Information:
    • The amount of data that can be displayed in a 3D model can be overwhelming, potentially leading to confusion or misinterpretation if not well managed.
  4. Interoperability Issues:
    • There can be challenges in integrating data from different sources or software, which may use different standards and formats.
  5. Legal and Ethical Concerns:
    • Privacy issues arise when dealing with high-resolution geospatial data, especially in urban areas where individuals may be identifiable.
  6. Maintenance and Updates:
    • Keeping the 3D models updated with the latest data can be labor-intensive and costly.
  7. Cognitive Load:
    • Users may experience a high cognitive load as they need to understand complex interfaces and interpret dense spatial information.

2. Assessment of the Trajectory of 3D Technology

Question:

Provide an assessment of the trajectory of 3D technology, including immersive environments, within the academic and professional landscape. (Maximum 4 paragraphs)

Answer:

In the GIS and geospatial sectors, 3D technology is on a trajectory toward becoming a standard tool for visualization and analysis. In academia, its use is deepening the understanding of geographic phenomena, allowing for simulations of environmental changes over time and providing students with a tangible view of theoretical concepts. The trajectory in education points to a more immersive learning experience where 3D and virtual reality (VR) technologies enable students to explore complex geospatial scenarios in an interactive manner.

In the professional world, 3D GIS is increasingly critical for urban and regional planning, natural resources management, and in the execution of large-scale infrastructure projects. The technology allows for a comprehensive view of geographic data, including the z-axis (elevation), which is crucial for applications like flood risk assessment, line-of-sight analysis, and landscape modeling. Professionals are also using 3D geospatial data in conjunction with VR to create immersive simulations for better stakeholder engagement and decision-making support.

The trajectory has seen a shift from static 3D models to dynamic systems that can handle real-time data streaming and user interaction. This evolution is partly due to advancements in hardware that can render complex models and software that can process large datasets more efficiently. These developments are converging with trends in big data and machine learning, providing predictive analytics and smarter modeling tools.

Challenges such as data interoperability, accuracy, and the management of large datasets persist. However, the future trajectory seems focused on overcoming these through better standards and more powerful analytical tools. The integration of 3D geospatial data with emerging technologies such as the Internet of Things (IoT) and AI indicates a trend toward more automated, real-time geospatial analysis environments. This progress is expected to further embed 3D geovisualization into the fabric of both academic research and professional practice, leading to more sophisticated and accessible geospatial analysis tools.