Processing Drone LiDAR from YellowScan’s Surveyor, a Velodyne Puck-based System

Point clouds from UAVs have become a common sight. Cheap consumer drones equipped with cameras produce points from images whose quality keeps increasing as photogrammetry software improves. But vegetation has always been a show-stopper for point clouds generated from imagery. Only an active sensing technique such as laser scanning can penetrate the vegetation and generate points on the ground under the canopy of a forested area. Advances in UAV technology and the miniaturization of LiDAR systems have allowed laser-scanning solutions for drones to enter the market.

Last summer we attended the LiDAR for Drone 2017 Conference by YellowScan and processed some data sets flown with their Surveyor system that is built around the Velodyne VLP-16 Puck LiDAR scanner and the Applanix APX15 single-board GNSS-inertial solution. One common challenge observed in LiDAR data generated by the Velodyne Puck is that surfaces are not as “crisp” as those generated by other laser scanners. Flat and open terrain surfaces are described by a layer of points with a “thickness” of a few centimeters as you can see in the images below. This visualization uses a 10 meter by 5 meter cut-out from this data set with the coordinate range [774280,774290) in x and [6279463,6279468) in y. Standard ground classification routines will “latch onto” the lowermost envelope of these thick point layers and therefore produce a sub-optimal Digital Terrain Model (DTM).
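To make the notion of “thickness” concrete, here is a small Python sketch (our own illustration, not part of LAStools) that measures the vertical spread of the points falling into each 2D grid cell; for a flat surface scanned by the Puck this spread is typically a few centimeters:

```python
from collections import defaultdict

def layer_thickness(points, cell=0.25):
    """Vertical spread (max z - min z) of the points that fall into
    the same (cell x cell) 2D bin -- our measure of "thickness"."""
    bins = defaultdict(list)
    for x, y, z in points:
        bins[(int(x // cell), int(y // cell))].append(z)
    return {key: max(zs) - min(zs) for key, zs in bins.items()}

# a flat patch scanned with ~4 cm of vertical scatter (made-up values)
pts = [(0.05, 0.05, 100.00), (0.10, 0.12, 100.04),
       (0.20, 0.08, 100.02), (0.20, 0.20, 100.01)]
print(layer_thickness(pts))  # one cell with ~0.04 m of "thickness"
```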

In part this “thickness” can be reduced by using fewer flightlines, as the “thickness” of each flightline by itself is lower but is compounded when all flightlines are merged together. However, deciding which (subset of) flightlines to use for which part of the scene to generate the best possible ground model is not an obvious task either, and even per flightline there will be a remaining “thickness” to deal with, as can be seen in the following set of images.

In the following we show how to deal with “thickness” in a layer of points describing a ground surface. We first produce a “lowest ground”, which we then widen into a “thick ground”, from which we finally derive “median ground” points that create a plausible terrain representation when interpolated by a Delaunay triangulation and rasterized onto a DTM. Step by step we process this example data set captured in a “live demo” during the LiDAR for Drone 2017 Conference – the beautiful Château de Flaugergues in Montpellier, France where the event took place. You can download this data via this link if you would like to repeat these processing steps:

Once you decompress the RAR file (e.g. with the UnRar.exe freeware) you will find six raw flight strips in LAS format and the trajectory of the UAV in ASCII text format as it was provided by YellowScan.

E:\LAStools\bin>dir Flaugergues
06/27/2017 08:03 PM 146,503,985 Flaugergues_test_demo_ppk_L1.las
06/27/2017 08:02 PM  91,503,103 Flaugergues_test_demo_ppk_L2.las
06/27/2017 08:03 PM 131,917,917 Flaugergues_test_demo_ppk_L3.las
06/27/2017 08:03 PM 219,736,585 Flaugergues_test_demo_ppk_L4.las
06/27/2017 08:02 PM 107,705,667 Flaugergues_test_demo_ppk_L5.las
06/27/2017 08:02 PM  74,373,053 Flaugergues_test_demo_ppk_L6.las
06/27/2017 08:03 PM   7,263,670 Flaugergues_test_demo_ppk_traj.txt

As usual we start with quality checking by visual inspection with lasview and by creating a textual report with lasinfo.

E:\LAStools\bin>lasview Flaugergues_test_demo_ppk_L1.las

The raw LAS file “Flaugergues_test_demo_ppk_L1.las” colored by elevation.

E:\LAStools\bin>lasinfo Flaugergues_test_demo_ppk_L1.las
lasinfo (171011) report for Flaugergues_test_demo_ppk_L1.las
reporting all LAS header entries:
 file signature: 'LASF'
 file source ID: 1
 global_encoding: 1
 project ID GUID data 1-4: 00000000-0000-0000-0000-000000000000
 version major.minor: 1.2
 system identifier: 'YellowScan Surveyor'
 generating software: 'YellowReader by YellowScan'
 file creation day/year: 178/2017
 header size: 227
 offset to point data: 297
 number var. length records: 1
 point data format: 3
 point data record length: 34
 number of point records: 4308932
 number of points by return: 4142444 166488 0 0 0
 scale factor x y z: 0.001 0.001 0.001
 offset x y z: 774282 6279505 92
 min x y z: 774152.637 6279377.623 82.673
 max x y z: 774408.344 6279541.646 116.656
variable length header record 1 of 1:
 reserved 0
 user ID 'LASF_Projection'
 record ID 34735
 length after header 16
 description ''
 GeoKeyDirectoryTag version 1.1.0 number of keys 1
 key 3072 tiff_tag_location 0 count 1 value_offset 2154 - ProjectedCSTypeGeoKey: RGF93 / Lambert-93
reporting minimum and maximum for all LAS point record entries ...
 X -129363 126344
 Y -127377 36646
 Z -9327 24656
 intensity 0 65278
 return_number 1 2
 number_of_returns 1 2
 edge_of_flight_line 0 0
 scan_direction_flag 0 0
 classification 0 0
 scan_angle_rank -120 120
 user_data 75 105
 point_source_ID 1 1
 gps_time 219873.160527 219908.550379
 Color R 0 0
 G 0 0
 B 0 0
number of first returns: 4142444
number of intermediate returns: 0
number of last returns: 4142444
number of single returns: 3975956
overview over number of returns of given pulse: 3975956 332976 0 0 0 0 0
histogram of classification of points:
 4308932 never classified (0)

Nicely visible are the circular scanning patterns of the Velodyne VLP-16 Puck. We also notice that the trajectory of the UAV can be seen in the lasview visualization because the Puck was scanning the drone’s own landing gear. The lasinfo report tells us that point coordinates are stored with too much resolution (mm) and that points do not need to be stored using point type 3 (with RGB colors) because all RGB values are zero. We fix this with an initial run of las2las and also compress the raw strips to the LAZ format on 4 CPUs in parallel.

las2las -i Flaugergues\*.las ^
        -rescale 0.01 0.01 0.01 ^
        -auto_reoffset ^
        -set_point_type 1 ^
        -odir Flaugergues\strips_raw -olaz ^
        -cores 4

Next we do the usual check for flightline alignment with lasoverlap (README), which we consider to be by far the most important quality check. We compare the lowest elevation from different flightlines per 25 cm by 25 cm cell in all overlap areas. We consider a vertical difference of up to 5 cm as acceptable (color-coded as white) and mark differences of over 30 cm (color-coded as saturated red or blue).

lasoverlap -i Flaugergues\strips_raw\*.laz -faf ^
           -step 0.25 ^
           -min_diff 0.05 -max_diff 0.3 ^
           -odir Flaugergues\quality -o overlap.png
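The idea behind this check can be sketched in a few lines of Python (a conceptual illustration only, not the lasoverlap implementation): grid each flightline separately, keep the lowest elevation per cell, and difference the per-line grids wherever two or more flightlines overlap:

```python
def lowest_per_cell(points, cell=0.25):
    """Lowest elevation per (cell x cell) 2D bin for one flightline."""
    lowest = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in lowest or z < lowest[key]:
            lowest[key] = z
    return lowest

def vertical_differences(flightlines, cell=0.25):
    """Per cell covered by two or more flightlines: difference between
    the highest and the lowest of the per-line lowest elevations."""
    grids = [lowest_per_cell(pts, cell) for pts in flightlines]
    diffs = {}
    for key in set().union(*grids):
        zs = [g[key] for g in grids if key in g]
        if len(zs) >= 2:
            diffs[key] = max(zs) - min(zs)
    return diffs

line_a = [(0.1, 0.1, 50.02), (0.3, 0.1, 50.00)]
line_b = [(0.1, 0.1, 50.08)]  # overlaps line_a in the first cell only
print(vertical_differences([line_a, line_b]))  # ~6 cm misalignment there
```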

The vertical difference in open areas between the flightlines is slightly above 5 cm, which we consider acceptable in this example. Depending on the application we recommend investigating further where these differences come from and what consequences they may have for post-processing. We also create a color-coded visualization of the last-return density per 25 cm by 25 cm cell using lasgrid (README), with blue meaning less than 100 returns per square meter and red meaning more than 4000 returns per square meter.

lasgrid -i Flaugergues\strips_raw\*.laz -merged ^
        -keep_last ^
        -step 0.25 ^
        -point_density ^
        -false -set_min_max 100 4000 ^
        -odir Flaugergues\quality -o density_100_4000.png

Color-coded density of last returns per square meter for each 25 cm by 25 cm cell. Blue means 100 or fewer last returns per square meter. Red means 4000 or more last returns per square meter.

As usual we start the LiDAR processing by reorganizing the flightlines into square tiles. Because of the variability in density that is evident in the visualization above, we use lastile (README) to create an adaptive tiling that starts with 200 m by 200 m tiles and then iteratively refines all tiles containing over 10 million points down to smaller 25 m by 25 m tiles.

lastile -i Flaugergues\strips_raw\*.laz ^
        -apply_file_source_ID ^
        -tile_size 200 -buffer 8 -flag_as_withheld ^
        -refine_tiling 10000000 ^
        -odir Flaugergues\tiles_raw -o flauge.laz

lastile -i Flaugergues\tiles_raw\flauge*_200.laz ^
        -refine_tiles 10000000 ^
        -olaz ^
        -cores 4

lastile -i Flaugergues\tiles_raw\flauge*_100.laz ^
        -refine_tiles 10000000 ^
        -olaz ^
        -cores 4

lastile -i Flaugergues\tiles_raw\flauge*_50.laz ^
        -refine_tiles 10000000 ^
        -olaz ^
        -cores 4

Subsequent processing is faster when the points have a spatially coherent order. Therefore we rearrange the points into a standard space-filling z-order with a call to lassort (README). We run this in parallel on as many cores as makes sense (i.e. not using more cores than there are physical CPUs).

lassort -i Flaugergues\tiles_raw\flauge*.laz ^
        -odir Flaugergues\tiles_sorted -olaz ^
        -cores 4
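The exact ordering that lassort produces is internal to the tool, but the underlying idea of a space-filling z-order can be illustrated with a few lines of Python that interleave the bits of two grid coordinates into a Morton key; sorting points by this key places points that are close in 2D close together in the file:

```python
def morton_key(ix, iy, bits=16):
    """Interleave the bits of two integer grid coordinates into a
    z-order (Morton) key; sorting by this key yields a spatially
    coherent order similar in spirit to lassort's output."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (2 * b)        # x bits -> even positions
        key |= ((iy >> b) & 1) << (2 * b + 1)    # y bits -> odd positions
    return key

# the four cells of a 2x2 block get consecutive keys:
print([morton_key(x, y) for x, y in [(0, 0), (1, 0), (0, 1), (1, 1)]])
# -> [0, 1, 2, 3]
```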

Next we use lasnoise to classify as noise those points that are isolated on a 3D grid of 1 meter cell size. See the README file of lasnoise for a description of the exact manner in which isolated points are classified. We do this to eliminate low noise points that would otherwise cause trouble in the subsequent processing.

lasnoise -i Flaugergues\tiles_sorted\flauge*.laz ^
         -step 1 -isolated 5 ^
         -odir Flaugergues\tiles_denoised -olaz ^
         -cores 4
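Conceptually, the isolation test works on occupancy counts of a 3D grid. The sketch below (our own simplification, not the exact lasnoise logic) flags a point as noise when its own 1 m cell plus the 26 surrounding cells together contain fewer than 5 points:

```python
from collections import Counter
from itertools import product

def isolated_points(points, step=1.0, isolated=5):
    """Simplified sketch of an isolation test: count points in each
    point's own 3D cell plus the 26 neighboring cells; return the
    indices of points whose neighborhood holds fewer than `isolated`
    points (candidates for noise classification code 7)."""
    counts = Counter((int(x // step), int(y // step), int(z // step))
                     for x, y, z in points)
    noise = []
    for i, (x, y, z) in enumerate(points):
        cx, cy, cz = int(x // step), int(y // step), int(z // step)
        in_neighborhood = sum(counts[(cx + dx, cy + dy, cz + dz)]
                              for dx, dy, dz in product((-1, 0, 1), repeat=3))
        if in_neighborhood < isolated:
            noise.append(i)
    return noise

dense = [(0.1 * i, 0.2 * i, 100.0) for i in range(10)]  # tight cluster
stray = [(50.0, 50.0, 140.0)]                           # lone point
print(isolated_points(dense + stray))  # -> [10]
```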

Next we mark the lowest point per 10 cm by 10 cm cell with classification code 8 using lasthin (README), ignoring the points that were marked as noise (classification code 7) in the previous step.

lasthin -i Flaugergues\tiles_denoised\flauge*.laz ^
        -ignore_class 7 ^
        -step 0.1 -lowest ^
        -classify_as 8 ^
        -odir Flaugergues\tiles_lowest -olaz ^
        -cores 4
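The thinning step itself is simple to picture: per 10 cm by 10 cm cell, only the lowest point is selected. A minimal Python sketch of this selection (our own illustration, ignoring the classification details of the real lasthin):

```python
def lowest_per_cell_indices(points, step=0.1, ignore=()):
    """Per (step x step) 2D cell, remember the index of the lowest
    point; mirrors the idea of `-step 0.1 -lowest -classify_as 8`.
    `ignore` holds indices to skip (e.g. points classified as noise)."""
    lowest = {}
    for i, (x, y, z) in enumerate(points):
        if i in ignore:
            continue
        key = (int(x // step), int(y // step))
        if key not in lowest or z < points[lowest[key]][2]:
            lowest[key] = i
    return sorted(lowest.values())

pts = [(0.02, 0.03, 100.10), (0.05, 0.04, 100.02),  # same cell
       (0.15, 0.02, 100.07)]                        # neighboring cell
print(lowest_per_cell_indices(pts))  # -> [1, 2]
```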

Considering only the resulting points marked with classification 8 we then create a temporary ground classification that we refer to as the “lowest ground”. For this we run lasground (README) with a set of suitable parameters that were found by experimentation on two of the most complex tiles from the center of the survey.

lasground -i Flaugergues\tiles_lowest\flauge*.laz ^
          -ignore_class 0 7 ^
          -step 5 -hyper_fine -bulge 1.5 -spike 0.5 ^
          -odir Flaugergues\tiles_lowest_ground -olaz ^
          -cores 4

We then “thicken” this “lowest ground” by classifying all points between 2 cm below and 15 cm above the lowest ground with a temporary classification code 6 using the lasheight (README) tool. Depending on the spread of points in your data set you may want to tighten this range accordingly, for example when processing the flightlines acquired by the Velodyne Puck individually. We picked our range based on visual experiments with “drop lines” and “rise lines” in the lasview viewer that are shown in the images above.

lasheight -i Flaugergues\tiles_lowest_ground\flauge*.laz ^
          -do_not_store_in_user_data ^
          -classify_between -0.02 0.15 6 ^
          -odir Flaugergues\tiles_thick_ground -olaz ^
          -cores 4
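The real lasheight measures each point's height above a TIN of the current ground points; the sketch below simplifies this to one precomputed ground elevation per point, but shows the same classification rule for the [-2 cm, +15 cm] band:

```python
def classify_thick_ground(zs, ground_zs, low=-0.02, high=0.15):
    """Sketch of the "thickening" rule: a point whose height above the
    ground surface falls into [low, high] receives the temporary
    classification code 6; everything else stays unclassified (1).
    (lasheight measures heights against a TIN of the ground points;
    here we simplify to one precomputed ground elevation per point.)"""
    return [6 if low <= z - g <= high else 1
            for z, g in zip(zs, ground_zs)]

zs        = [100.00, 100.10, 100.40, 99.90]   # point elevations
ground_zs = [100.00, 100.00, 100.00, 100.00]  # "lowest ground" below them
print(classify_thick_ground(zs, ground_zs))  # -> [6, 6, 1, 1]
```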

The final ground classification is obtained by creating the “median ground” from the “thick ground”. This uses a brand-new option in the lasthin (README) tool of LAStools. The new ‘-percentile 50 10’ option selects the point closest to the 50th percentile of all point elevations within a grid cell of the specified size, given there are at least 10 points in that cell. The selected point either survives the thinning operation or gets marked with a specified classification code or flag.
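A small Python sketch of this selection rule (our own illustration of the idea, not rapidlasso's implementation): per cell, if at least 10 elevations are present, pick the point closest to the 50th percentile:

```python
def closest_to_percentile(zs, percentile=50):
    """Index of the elevation closest to the given percentile."""
    target = sorted(zs)[int(len(zs) * percentile / 100)]
    return min(range(len(zs)), key=lambda i: abs(zs[i] - target))

def median_ground(cells, minimum=10, percentile=50):
    """Per cell: with at least `minimum` elevations present, select
    the one closest to the percentile (else the cell yields nothing)."""
    return {key: closest_to_percentile(zs, percentile)
            for key, zs in cells.items() if len(zs) >= minimum}

# one cell with ten elevations: the point nearest the median is selected;
# the low outlier at 99.80 and the high outlier at 100.30 are passed over
cell = {(0, 0): [99.80, 100.01, 100.02, 100.03, 100.04,
                 100.05, 100.06, 100.07, 100.08, 100.30]}
print(median_ground(cell))  # -> {(0, 0): 5}
```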

lasthin -i Flaugergues\tiles_thick_ground\flauge*.laz ^
        -ignore_class 0 1 7 ^
        -step 0.1 -percentile 50 10 ^
        -classify_as 8 ^
        -odir Flaugergues\tiles_median_ground_10_10cm -olaz ^
        -cores 4

lasthin -i Flaugergues\tiles_median_ground_10_10cm\flauge*.laz ^
        -ignore_class 0 1 7 ^
        -step 0.2 -percentile 50 10 ^
        -classify_as 8 ^
        -odir Flaugergues\tiles_median_ground_10_20cm -olaz ^
        -cores 4

lasthin -i Flaugergues\tiles_median_ground_10_20cm\flauge*.laz ^
        -ignore_class 0 1 7 ^
        -step 0.4 -percentile 50 10 ^
        -classify_as 8 ^
        -odir Flaugergues\tiles_median_ground_10_40cm -olaz ^
        -cores 4

lasthin -i Flaugergues\tiles_median_ground_10_40cm\flauge*.laz ^
        -ignore_class 0 1 7 ^
        -step 0.8 -percentile 50 10 ^
        -classify_as 8 ^
        -odir Flaugergues\tiles_median_ground_10_80cm -olaz ^
        -cores 4

We now compare a triangulation of the median ground points with a triangulation of the highest and the lowest points per 10 cm by 10 cm cell to demonstrate that – at least in open areas – we really have computed a median ground surface.

Finally we rasterize the tiles with the las2dem (README) tool onto binary elevation grids in BIL format. Here we make the resolution dependent on the tile size, giving the 25 meter and 50 meter tiles the highest resolution of 10 cm and rasterizing the 100 meter and 200 meter tiles at 20 cm and 40 cm respectively.

las2dem -i Flaugergues\tiles_median_ground_10_80cm\*_25.laz ^
        -i Flaugergues\tiles_median_ground_10_80cm\*_50.laz ^
        -keep_class 8 ^
        -step 0.1 -use_tile_bb ^
        -odir Flaugergues\tiles_dtm -obil ^
        -cores 4

las2dem -i Flaugergues\tiles_median_ground_10_80cm\*_100.laz ^
        -keep_class 8 ^
        -step 0.2 -use_tile_bb ^
        -odir Flaugergues\tiles_dtm -obil ^
        -cores 4

las2dem -i Flaugergues\tiles_median_ground_10_80cm\*_200.laz ^
        -keep_class 8 ^
        -step 0.4 -use_tile_bb ^
        -odir Flaugergues\tiles_dtm -obil ^
        -cores 4

Because all LAStools can read BIL files via on-the-fly conversion from rasters to points, we can visually inspect the resulting elevation rasters with the lasview (README) tool. By adding the ‘-faf’ or ‘-files_are_flightlines’ argument we treat the BIL files as if they were different flightlines, which allows us to assign different colors to points from different files to better inspect the transitions between tiles. The ‘-points 10000000’ argument instructs lasview to load up to 10 million points into memory instead of the default 5 million.

lasview -i Flaugergues\tiles_dtm\*.bil -faf ^
        -points 10000000

Final raster tiles in BIL format of three different sizes form a seamless DTM.

For visual comparison we also produce a DSM and create hillshades. Note that the workflow for DSM creation shown below produces a “highest DSM” that will always be a few centimeters above the “median DTM”. This will be noticeable only in open areas of the terrain where the DSM and the DTM should coincide and their elevations should be identical.

lasthin -i Flaugergues\tiles_denoised\flauge*.laz ^
        -keep_z_above 110 ^
        -filtered_transform ^
        -set_classification 18 ^
        -ignore_class 7 18 ^
        -step 0.1 -highest ^
        -classify_as 5 ^
        -odir Flaugergues\tiles_highest -olaz ^
        -cores 4

las2dem -i Flaugergues\tiles_highest\*_25.laz ^
        -i Flaugergues\tiles_highest\*_50.laz ^
        -keep_class 5 ^
        -step 0.1 -use_tile_bb ^
        -odir Flaugergues\tiles_dsm -obil ^
        -cores 4

las2dem -i Flaugergues\tiles_highest\*_100.laz ^
        -keep_class 5 ^
        -step 0.2 -use_tile_bb ^
        -odir Flaugergues\tiles_dsm -obil ^
        -cores 4

las2dem -i Flaugergues\tiles_highest\*_200.laz ^
        -keep_class 5 ^
        -step 0.4 -use_tile_bb ^
        -odir Flaugergues\tiles_dsm -obil ^
        -cores 4

We thank YellowScan for challenging us to process their drone LiDAR with LAStools in order to present results at their LiDAR for Drone 2017 Conference and for sharing several example data sets with us, including the one used here.

LAStools Win Big at INTERGEO, Taking Home Two Innovation Awards

PRESS RELEASE (for immediate release)
October 2, 2017
rapidlasso GmbH, Gilching, Germany

At INTERGEO 2017 in Berlin, rapidlasso GmbH – the makers of the popular LiDAR processing software LAStools – were awarded top honors in both of the categories they had been nominated for: most innovative software and most innovative startup. The third award, for most innovative hardware, went to Leica Geosystems for the BLK360 terrestrial scanner. The annual Wichman Innovation Awards have been part of INTERGEO for six years now. Already at the inaugural event in 2012, the open source LiDAR compressor LASzip of rapidlasso GmbH had been nominated, finishing as runner-up in second place.

Dr. Martin Isenburg, the founder and CEO of rapidlasso GmbH, receives the two innovation awards at the ceremony during INTERGEO 2017 in Berlin.

After receiving the two awards, Dr. Martin Isenburg, the founder and CEO of rapidlasso GmbH, was quick to thank the “fun, active, and dedicated user community” of the LAStools software for their “incredible support in the online voting”. He pointed out that it is its users who make LAStools more than just efficient software for processing point clouds. Since 2011, the community surrounding LAStools has constantly grown to several thousand users who help and motivate each other in designing workflows and in solving format issues and processing challenges. They are an integral part of what makes these tools so valuable, said Dr. Isenburg.

About rapidlasso GmbH:
Technology powerhouse rapidlasso GmbH specializes in efficient LiDAR processing tools that are widely known for their high productivity. They combine robust algorithms with efficient I/O and clever memory management to achieve high throughput for data sets containing billions of points. The company’s flagship product – the LAStools software suite – has deep market penetration and is heavily used in industry, government agencies, research labs, and educational institutions. Visit http://rapidlasso.com for more information.

LASmoons: Huaibo Mu

Huaibo Mu (recipient of three LASmoons)
Environmental Mapping, Department of Geography
University College London (UCL), UK

Background:
This study is part of the EU-funded Metrology for Earth Observation and Climate project (MetEOC-2). It aims to combine terrestrial and airborne LiDAR data to estimate biomass and allometry for woodland trees in the UK. Airborne LiDAR can capture large amounts of data over large areas, while terrestrial LiDAR can provide much more detail of high quality in specific areas. The biomass and allometry for individual tree species in 1 ha of Wytham Woods, located about 5 km northwest of the University of Oxford, UK, are estimated by combining both airborne and terrestrial LiDAR. Then the bias is evaluated when estimations are applied at different levels: terrestrial LiDAR level, tree level, and area level. The goal is better insights and a controllable error range in the bias of biomass and allometry estimates for woodland trees based on airborne LiDAR.

Goal:
The study aims to find suitable parameters of allometric equations for different species and to establish the relationship between tree height, stem diameter, and crown diameter in order to estimate the above-ground biomass using airborne LiDAR. The biomass estimates at the different levels are then compared to evaluate the bias, and for the total 6 ha of Wytham Woods for calibration and validation. Finally the results are to be applied to other woodlands in the UK. The LiDAR processing tasks for which LAStools are used mainly center around the creation of a suitable Canopy Height Model (CHM) from the airborne LiDAR.

Data:
+ Airborne LiDAR data produced by Professor David Coomes (University of Cambridge) with Airborne Research and Survey Facility (ARSF) project code RG13_08 in June 2014. The average point density is about 5.886 per m^2.
+ Terrestrial LiDAR data collected by a UCL team led by Dr. Mat Disney and Dr. Kim Calders in order to develop very detailed 3D models of the trees.
+ Fieldwork from the project “Initial Results from Establishment of a Long-term Broadleaf Monitoring Plot at Wytham Woods, Oxford, UK” by Butt et al. (2009).

LAStools processing:
1) check LiDAR quality as described in these videos and articles [lasinfo, lasvalidate, lasoverlap, lasgrid, las2dem]
2) classify into ground and non-ground points using tile-based processing  [lastile, lasground]
3) generate a Digital Terrain Model (DTM) [las2dem]
4) compute height of points and delete points higher than maximum tree height obtained from terrestrial LiDAR [lasheight]
5) convert points into disks with 10 cm diameter to conservatively account for laser beam width [lasthin]
6) generate spike-free Digital Surface Model (DSM) based on algorithm by Khosravipour et al. (2016) [las2dem]
7) create Canopy Height Model (CHM) by subtracting DTM from spike-free DSM [lasheight].
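Step 7 above boils down to a per-cell raster subtraction. A minimal Python sketch (with tiny hypothetical grids standing in for the DTM and DSM rasters that las2dem would produce):

```python
def canopy_height_model(dsm, dtm):
    """CHM = DSM - DTM per cell, clamped at zero so that small
    DSM/DTM inconsistencies never yield negative canopy heights."""
    return [[max(s - t, 0.0) for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

# hypothetical 2x3 grids standing in for the las2dem DTM/DSM rasters
dtm = [[10.0, 10.1, 10.2], [10.0, 10.0, 10.1]]
dsm = [[10.0, 14.1, 18.2], [10.0, 16.0, 17.1]]
print(canopy_height_model(dsm, dtm))  # canopy heights in meters
```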

References:
Butt, N., Campbell, G., Malhi, Y., Morecroft, M., Fenn, K., & Thomas, M. (2009). Initial results from establishment of a long-term broadleaf monitoring plot at Wytham Woods, Oxford, UK. University Oxford, Oxford, UK, Rep.
Khosravipour, A., Skidmore, A.K., Isenburg, M., Wang, T.J., Hussin, Y.A., (2014). Generating pit-free Canopy Height Models from Airborne LiDAR. PE&RS = Photogrammetric Engineering and Remote Sensing 80, 863-872.
Khosravipour, A., Skidmore, A.K., Isenburg, M. and Wang, T.J. (2015) Development of an algorithm to generate pit-free Digital Surface Models from LiDAR, Proceedings of SilviLaser 2015, pp. 247-249, September 2015.
Khosravipour, A., Skidmore, A.K., Isenburg, M (2016) Generating spike-free Digital Surface Models using raw LiDAR point clouds: a new approach for forestry applications, (journal manuscript under review).

LASmoons: Marzena Wicht

Marzena Wicht (recipient of three LASmoons)
Department of Photogrammetry, Remote Sensing and GIS
Warsaw University of Technology, Poland.

Background:
More than half of the human population (Heilig 2012) suffers from the many negative effects of living in cities: increased air pollution, limited access to green areas, the Urban Heat Island (UHI) effect, and many more. To mitigate some of these effects, many ideas have come up over the years: reducing the surface albedo, the idea of the Garden City, green belts, and so on. Increasing horizontal wind speed can actually improve both air pollution dispersion and thermal comfort in urban areas (Gál & Unger 2009). Areas of low roughness promote air flow – discharging the city of warm, polluted air and supplying it with cool and fresh air – if they share specific parameters, are connected, and penetrate the inner city with a country breeze. That is why mapping low-roughness urban areas is important for better understanding the urban climate.

Goal:
The goal of this study is to derive buildings (outlines and height) and high vegetation using LAStools and to use that data in mapping urban ventilation corridors for our case study area in Warsaw. There are many ways to map these; however, using ALS data has certain advantages (Suder & Szymanowski 2014) in this case: DSMs can be easily derived, the tree canopy (incl. height) can be included in the analysis, and buildings can be easily extracted. The outputs are then used as a basis for morphological analysis, like calculating the frontal area index. LAStools has the considerable advantage of processing large quantities of data (~500 GB) efficiently.

Frontal area index calculation based on 3D building database

Data:
+ LiDAR provided by Central Documentation Center of Geodesy and Cartography
+ average pulse density 12 p/m^2
+ covers 517 km^2 (whole Warsaw)

LAStools processing:
1) quality checking of the data as described in several videos and blog posts [lasinfo, lasvalidate, lasoverlap, lasgrid, lasduplicate, lasreturn, las2dem]
2) reorganize data into sufficiently small tiles with buffers to avoid edge artifacts [lastile]
3) classify point clouds into vegetation and buildings [lasground, lasclassify]
4) normalize LiDAR heights [lasheight]
5) create triangulated, rasterized derivatives: DSM / DTM / nDSM / CHM [las2dem, blast2dem]
6) compute height-based metrics (e.g. ‘-avg’, ‘-std’, and ‘-p 50’) [lascanopy]
7) generate subsets during the workflow [lasclip]
8) generate building footprints [lasboundary]

References:
Heilig, G. K. (2012). World urbanization prospects: the 2011 revision. United Nations, Department of Economic and Social Affairs (DESA), Population Division, Population Estimates and Projections Section, New York.
Gal, T., & Unger, J. (2009). Detection of ventilation paths using high-resolution roughness parameter mapping in a large urban area. Building and Environment, 44(1), 198-206.
Suder, A., & Szymanowski, M. (2014). Determination of ventilation channels in urban area: A case study of Wroclaw (Poland). Pure and Applied Geophysics, 171(6), 965-975.

LASmoons: Gudrun Norstedt

Gudrun Norstedt (recipient of three LASmoons)
Forest History, Department of Forest Ecology and Management
Swedish University of Agricultural Sciences, Umeå, Sweden

Background:
Until the end of the 17th century, the vast boreal forests of the interior of northern Sweden were exclusively populated by the indigenous Sami. When settlers of Swedish and Finnish ethnicity started to move into the area, colonization was fast. Although there is still a prospering reindeer-herding Sami culture in northern Sweden, the old Sami culture that dominated the boreal forest for centuries or even millennia is to a large extent forgotten.
Since each forest Sami family formerly had a number of seasonal settlements, the density of settlements must have been high. However, only very few remains are known today. In the field, old Sami settlements can be recognized through the presence of, for example, stone hearths, storage caches, pits for roasting pine bark, foundations of certain types of huts, reindeer pens, and fences. Researchers of the Forest History section of the Department of Forest Ecology and Management have long been surveying such remains on foot. This, however, is extremely time-consuming and can only be done in limited areas. Also, the use of aerial photographs is usually difficult due to dense vegetation. Data from airborne laser scanning should be the best way to find remains of the old forest Sami culture. Previous research has shown the possibilities of using airborne laser scanning data for detecting cultural remains in the boreal forest (Jansson et al., 2009; Koivisto & Laulamaa, 2012; Risbøl et al., 2013), but no studies have aimed at detecting remains of the forest Sami culture. I want to test the possibilities of ALS in this respect.

DTM from the Krycklan catchment, showing a row of hunting pits and (larger) a tar pit.

Goal:
The goal of my study is to test the potential of using LiDAR data for detecting cultural and archaeological remains on the ground in a forest area where Sami are known to have dwelled during historical times. Since the whole of Sweden is currently being scanned by the National Land Survey, this data will be included. However, the average point density of the national data is only 0.5–1 pulses/m^2. Therefore, the study will be done in an established research area, the Krycklan catchment, where a denser scanning was performed in 2015. The Krycklan data set lacks ground point classification, so I will have to perform such a classification before I can proceed to the creation of a DTM. Having tested various kinds of software, I have found that LAStools seems to be the most efficient way to do the job. This, in turn, has made me aware of the importance of choosing the right methods and parameters for a classification that is suitable for archaeological purposes.

Data:
+ The data was acquired with a multispectral airborne LiDAR sensor (Optech Titan) and a Micro IRS IMU operated on an aircraft flying at a height of about 1000 m; positioning was post-processed with the TerraPos software for higher accuracy.
+ The average pulse density is 20 pulses/m^2.
+ About 7,000 hectares were covered by the scanning. The data is stored in 489 tiles.

LAStools processing:
1) run a series of classifications of a few selected tiles with both lasground and lasground_new with various parameters [lasground and lasground_new]
2) test the outcomes by comparing them to known terrain to find out the optimal parameters for classifying this particular LiDAR point cloud for archaeological purposes
3) extract the bare-earth of all tiles (using buffers!!!) with the best parameters [lasground or lasground_new]
4) create bare-earth terrain rasters (DTMs) and analyze the area [las2dem]
5) reclassify the airborne LiDAR data collected by the National Land Survey using various parameters to see whether it can be made more suitable for revealing Sami cultural remains in a boreal forest landscape [lasground or lasground_new]

References:
Jansson, J., Alexander, B. & Söderman, U. 2009. Laserskanning från flyg och fornlämningar i skog. Länsstyrelsen Dalarna (PDF).
Koivisto, S. & Laulamaa, V. 2012. Pistepilvessä – Metsien arkeologiset kohteet LiDAR-ilmalaserkeilausaineistoissa. Arkeologipäivät 2012 (PDF).
Risbøl, O., Bollandsås, O.M., Nesbakken, A., Ørka, H.O., Næsset, E., Gobakken, T. 2013. Interpreting cultural remains in airborne laser scanning generated digital terrain models: effects of size and shape on detection success rates. Journal of Archaeological Science 40:4688–4700.

LASmoons: Muriel Lavy

Muriel Lavy (recipient of three LASmoons)
RED (Risk Evaluation Dashboard) project
ISE-Net s.r.l, Aosta, ITALY.

Background:
The Aosta Valley Region is a mountainous area in the heart of the Alps. This region is regularly affected by hazardous natural phenomena connected with the terrain geomorphometry and climate change: snow avalanches, rockfalls, and landslides.
In July 2016 a research program funded by the European Program for Regional Development was launched that aims to create a cloud dashboard for the monitoring, control, and analysis of several parameters and data derived from advanced sensors: multiparametric probes, aerial and oblique photogrammetry, and laser scanning. This tool will help the territory management agencies improve their risk mitigation and management system.

The RIEGL VZ-4000 scanning the Aosta Valley Region in Italy.

Goal:
This study aims to classify the point clouds derived from aerial imagery integrated with laser scanning data in order to generate accurate DTMs, DSMs, and Digital Snow Models. The photogrammetry data set was acquired with a Nikon D810 camera from a helicopter survey. The aim of further analysis is to detect changes caused by natural dynamic phenomena via volume analysis and mass balance evaluation.

Data:
+ The photogrammetry data set was acquired with an RGB camera (Nikon D810) with a focal length equivalent of 50 mm from a helicopter survey: 1060 JPG images
+ The laser scanner data set was acquired using a Terrestrial Laser Scanner (RIEGL VZ-4000) combined with a Leica GNSS device (GS25) to georeference the project. The TLS data set was then used as the base reference to properly align and georeference the photogrammetry point cloud.

LAStools processing:
1) check the reference system and the point cloud density [lasinfo, lasvalidate]
2) remove isolated noise points [lasnoise]
3) classify point into ground and non-ground [lasground]
4) classify point clouds into vegetation and other [lasclassify]
5) create DTM and DSM  [las2dem, lasgrid, blast2dem]
6) produce 3D visualizations to facilitate communication and interaction [lasview]

Plots to Stands: Producing LiDAR Vegetation Metrics for Imputation Calculations

Some professionals in remote sensing find LAStools useful for extracting statistical metrics from LiDAR that are used to make estimations about a larger area of land from a small set of sample plots. Common applications are the prediction of timber volume or above-ground biomass for entire forests based on a number of representative plots where exact measurements were obtained through field work. The same technique can also be used to make estimations about animal habitat or coconut yield, or to classify the type of vegetation that covers the land. In this tutorial we describe the typical workflow for computing common metrics for smaller plots and larger areas using LAStools.

Download these six LiDAR tiles (1, 2, 3, 4, 5, 6) from a Eucalyptus plantation in Brazil to follow along with the step-by-step instructions of this tutorial. This data is courtesy of Suzano Pulp and Paper. Please also download the two shapefiles that delineate the plots where field measurements were taken and the stands for which predictions are to be made. You should download version 170327 (or higher) of LAStools due to some recent bug fixes.

Quality Checking

Before processing newly received LiDAR data we always perform a quality check first. This ranges from visual inspection with lasview, to printing textual content reports and attribute histograms with lasinfo, to flight-line alignment checks with lasoverlap, pulse density and pulse spacing checks with lasgrid and las2dem, and completeness-of-returns check with lassort followed by lasreturn.

lasinfo -i tiles_raw\CODL0003-C0006.laz ^
        -odir quality -odix _info -otxt

The lasinfo report tells us that there is no projection information. However, we remember that this Brazilian data was in the common SIRGAS 2000 projection, so we try a few likely UTM zones and check whether the hillshaded DSM produced by las2dem falls onto the right spot in Google Earth.

las2dem -i tiles_raw\CODL0003-C0006.laz ^
        -keep_first -thin_with_grid 1 ^
        -hillshade -epsg 31983 ^
        -o epsg_check.png

Hillshaded DSM and Google Earth imagery align for EPSG code 31983

The lasinfo report also tells us that the xyz coordinates are stored with millimeter resolution, which is a bit of overkill. For higher and faster LASzip compression we will later lower this to a more appropriate centimeter resolution. It further tells us that the returns are stored using point type 0, which is a bit unfortunate. This (older) point type does not have a GPS time stamp, so some quality checks (e.g. “completeness of returns” with lasreturn) and operations (e.g. “resorting of returns into acquisition order” with lassort) will not be possible. Fortunately the min-max range of the ‘point source ID’ suggests that this point attribute is correctly populated with flightline numbers, so we can check the overlap and alignment of the different flightlines that contribute to the LiDAR in each tile.
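As an aside, the planned rescaling works because LAS stores each coordinate as a 32-bit integer together with a scale factor and an offset. A sketch in Python of how lowering the scale from 0.001 (millimeter) to 0.01 (centimeter) quantizes a coordinate (the coordinate and offset values here are made up):

```python
def quantize(x, offset, scale):
    # LAS stores each coordinate as round((x - offset) / scale) in a 32-bit int
    return round((x - offset) / scale)

def dequantize(i, offset, scale):
    return i * scale + offset

x = 774283.4567  # made-up coordinate
mm = dequantize(quantize(x, 774000.0, 0.001), 774000.0, 0.001)  # keeps millimeters
cm = dequantize(quantize(x, 774000.0, 0.01), 774000.0, 0.01)    # rounds to centimeters
print(mm, cm)
```

With the coarser scale the stored integers are smaller and more coherent, which is why LASzip compresses centimeter-resolution files better and faster.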

lasoverlap -i tiles_raw\*.laz ^
           -min_diff 0.2 -max_diff 0.4 ^
           -epsg 31983 ^
           -odir quality -opng ^
           -cores 3

We run lasoverlap to visualize the amount of overlap between flightlines and the vertical differences between them. The produced images (see below) color code the number of flightlines and the maximum vertical difference between any two flightlines. Most of the area is cyan (2 flightlines) except in the bottom left where the pilot was sloppy and left some gaps in the yellow seams (3 flightlines) so that some spots are only blue (1 flightline). We also see that two tiles in the upper left are partly covered by a diagonal flightline. We will drop that flightline later to create a more uniform density across the tiles. The blue areas in the difference image align more with features in the landscape than with the flightline pattern. Closer inspection shows that these vertical differences occur mainly in the dense old-growth forests with species of different heights that are much harder to penetrate by the laser than the uniform and short-lived Eucalyptus plantation, which is more of a “dead forest” with little undergrowth or animal habitat.

Interesting observation: The vertical difference of the lowest returns from different flightlines computed per 2 meter by 2 meter grid cell could maybe be used as a new forestry metric to help distinguish monocultures from natural forests.
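Such a metric is not a built-in LAStools output, but the idea can be sketched in Python: bin the returns into 2 meter by 2 meter cells, track the lowest return per flightline, and report the spread wherever more than one flightline contributes (the point tuples below are made up):

```python
from collections import defaultdict

def lowest_return_diff(points, step=2.0):
    # points: iterable of (x, y, z, flightline_id) tuples
    # returns per-cell spread between the lowest returns of different flightlines
    lowest = defaultdict(dict)  # cell -> {flightline_id: lowest z seen}
    for x, y, z, fl in points:
        cell = (int(x // step), int(y // step))
        if fl not in lowest[cell] or z < lowest[cell][fl]:
            lowest[cell][fl] = z
    return {cell: max(zs.values()) - min(zs.values())
            for cell, zs in lowest.items() if len(zs) > 1}

pts = [(0.5, 0.5, 100.2, 1), (1.5, 1.2, 100.0, 1),   # flightline 1
       (0.8, 1.8, 100.9, 2), (1.1, 0.3, 100.6, 2)]   # flightline 2
print(lowest_return_diff(pts))  # one cell with a spread of about 0.6
```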

lasgrid -i tiles_raw\*.laz ^
        -keep_last ^
        -step 2 -point_density ^
        -false -set_min_max 10 20 ^
        -epsg 31983 ^
        -odir quality -odix _d_2m_10_20 -opng ^
        -cores 3

lasgrid -i tiles_raw\*.laz ^
        -keep_last ^
        -step 5 -point_density ^
        -false -set_min_max 10 20 ^
        -epsg 31983 ^
        -odir quality -odix _d_5m_10_20 -opng ^
        -cores 3

We run lasgrid to visualize the pulse density per 2 by 2 meter cell and per 5 by 5 meter cell. The produced images (see below) color code the number of last returns per square meter. The impact of the tall Eucalyptus trees on the density-per-cell computation is evident for the smaller 2 meter cell size in the form of a noisy blue/red diagonal in the top right as well as a noisy blue/red area in the bottom left. Both of those turn to a more consistent yellow for the density computation with larger 5 meter cells. Immediately evident is the higher density (red) for those parts of the two tiles in the upper left that are covered by the additional diagonal flightline. The blueish area to the left of the center of the image suggests a consistently lower pulse density whose cause remains to be investigated: Lower reflectivity? Missing last returns? Cloud cover?
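The density computation itself is simple: count the last returns falling into each cell and divide by the cell area. A Python sketch with synthetic points:

```python
from collections import Counter

def density_per_cell(points, step):
    # last-return density (points per square meter) per step-by-step cell
    counts = Counter((int(x // step), int(y // step)) for x, y, _ in points)
    return {cell: n / (step * step) for cell, n in counts.items()}

# 50 synthetic last returns spread over a 5 m by 5 m patch -> 2 pts/m^2
pts = [((i * 0.5) % 5.0, (i * 0.7) % 5.0, 0.0) for i in range(50)]
print(density_per_cell(pts, step=5.0))  # {(0, 0): 2.0}
```

Re-running with step=2.0 yields noisier per-cell values from the same points, which mirrors the difference between the two lasgrid runs above.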

The lasinfo report suggests that the tiles are already classified. We could either use the ground classification provided by the vendor or re-classify the data ourselves (using lastile, lasnoise, and lasground). We check the quality of the ground classification by visually inspecting a hillshaded DTM created with las2dem from the ground returns. We buffer the tiles on-the-fly for a seamless hillshade without artifacts along tile boundaries by adding ‘-buffered 25’ and ‘-use_orig_bb’ to the command line. To speed up reading the 25 meter buffers from neighboring tiles we first create a spatial index with lasindex.

lasindex -i tiles_raw\*.laz ^
         -cores 3

las2dem -i tiles_raw\*.laz ^
        -buffered 25 ^
        -keep_class 2 -thin_with_grid 0.5 ^
        -use_orig_bb ^
        -hillshade -epsg 31983 ^
        -odir quality -odix _dtm -opng ^
        -cores 3

hillshaded DTM tiles generated with las2dem and on-the-fly buffering

The resulting hillshaded DTM shows a few minor issues in the ground classification but also a big bump (above the mouse cursor). Closer inspection of this area (you can cut it from the larger tile using the command line below) shows that there is a group of misclassified points about 1200 meters below the terrain. Hence, we will start from scratch to prepare the data for the extraction of forestry metrics.

las2las -i tiles_raw\CODL0004-C0006.laz ^
        -inside_tile 207900 7358350 100 ^
        -o bump.laz

lasview -i bump.laz

bump in hillshaded DTM caused by misclassified low noise

Data Preparation

Using lastile we first tile the data into smaller 500 meter by 500 meter tiles with a 25 meter buffer while flagging all points in the buffer as ‘withheld’. In the same step we lower the resolution to centimeters and put a nicer coordinate offset in the LAS header. We also remove the existing classification and classify all points that are much lower than the target terrain as class 7 (aka noise). We also add CRS information and give each tile the base name ‘suzana.laz’.

lastile -i tiles_raw\*.laz ^
        -rescale 0.01 0.01 0.01 ^
        -auto_reoffset ^
        -set_classification 0 ^
        -classify_z_below_as 500.0 7 ^
        -tile_size 500 ^
        -buffer 25 -flag_as_withheld ^
        -epsg 31983 ^
        -odir tiles_buffered -o suzana.laz

With lasnoise we mark the many isolated points that are high above or far below the terrain as class 7 (aka noise). Using two tiles we played around with the ‘step’ parameters until we found good settings. See the README of lasnoise for the exact meaning and the choice of parameters for noise classification. We look at one of the resulting tiles with lasview.

lasnoise -i tiles_buffered\*.laz ^
         -step_xy 4 -step_z 2 ^
         -odir tiles_denoised -olaz ^
         -cores 3

lasview -i tiles_denoised\suzana_206000_7357000.laz ^
        -color_by_classification ^
        -win 1024 192

noise points shown in pink: all points (top), only noise points (bottom)
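The idea behind this kind of noise classification can be sketched as follows: bin the points into step_xy by step_xy by step_z cells and flag points whose 3 by 3 by 3 cell neighborhood contains hardly any other points. This Python sketch illustrates the principle only, not the exact lasnoise algorithm (the points are made up):

```python
from collections import Counter
from itertools import product

def find_isolated(points, step_xy=4.0, step_z=2.0, max_neighbors=1):
    # flag points whose 3x3x3 cell neighborhood holds few other points
    cell = lambda p: (int(p[0] // step_xy), int(p[1] // step_xy), int(p[2] // step_z))
    counts = Counter(cell(p) for p in points)
    def neighborhood(c):
        return sum(counts.get((c[0] + dx, c[1] + dy, c[2] + dz), 0)
                   for dx, dy, dz in product((-1, 0, 1), repeat=3))
    # subtract 1 so a point does not count itself
    return [p for p in points if neighborhood(cell(p)) - 1 <= max_neighbors]

pts = [(1.0, 1.0, 0.1), (1.5, 1.2, 0.3), (2.0, 0.8, 0.2),  # cluster near the ground
       (1.2, 1.1, 60.0)]                                    # lone high return (bird?)
print(find_isolated(pts))  # [(1.2, 1.1, 60.0)]
```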

Next we use lasground to classify the last returns into ground (2) and non-ground (1). It is important to ignore the noise points with classification 7 to avoid the kind of bump we saw in the vendor-delivered classification. We again check the quality of the computed ground classification by producing a hillshaded DTM with las2dem. Here the las2dem command line is slightly different: instead of buffering on-the-fly we use the buffers stored with each tile.

lasground -i tiles_denoised\*.laz ^
          -ignore_class 7 ^
          -nature -extra_fine ^
          -odir tiles_ground -olaz ^
          -cores 3

las2dem -i tiles_ground\*.laz ^
        -keep_class 2 -thin_with_grid 0.5 ^
        -hillshade ^
        -use_tile_bb ^
        -odir quality -odix _dtm_new -opng ^
        -cores 3
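As background, hillshading a DTM is a standard computation: estimate slope and aspect from the elevation gradients and light the surface from a fixed sun position. A Horn-style sketch in Python with NumPy (not the exact las2dem implementation; the ramp DEM is made up):

```python
import math
import numpy as np

def hillshade(dem, cell=1.0, azimuth=315.0, altitude=45.0):
    # Horn-style hillshade: illumination value 0..255 per DEM cell
    az, alt = math.radians(azimuth), math.radians(altitude)
    gy, gx = np.gradient(np.asarray(dem, dtype=float), cell)
    slope = np.arctan(np.hypot(gx, gy))
    aspect = np.arctan2(-gx, gy)
    shade = (math.sin(alt) * np.cos(slope) +
             math.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shade * 255, 0, 255).astype(np.uint8)

dem = np.outer(np.arange(4, dtype=float), np.ones(4))  # a uniform ramp
print(hillshade(dem))
```

A flat DEM yields a uniform gray (sin of the sun altitude times 255), while slopes facing the sun brighten and slopes facing away darken.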

Finally, with lasheight we compute how high each return is above the triangulated surface of all ground returns and store this height value in place of the elevation in the z coordinate using the ‘-replace_z’ switch. This height-normalizes the LiDAR in the sense that all ground returns are set to an elevation of 0 while all other returns get an elevation relative to the ground. The result is height-normalized LiDAR tiles that are ready for producing the desired forestry metrics.

lasheight -i tiles_ground\*.laz ^
          -replace_z ^
          -odir tiles_normalized -olaz ^
          -cores 3
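Conceptually, height normalization subtracts the local ground elevation from each return. lasheight interpolates a TIN of the ground returns; the Python sketch below substitutes a simple nearest-ground lookup to convey the idea (all coordinates are made up):

```python
def normalize_heights(points, ground):
    # replace each point's z by its height above the nearest ground return;
    # lasheight uses TIN interpolation instead of this nearest-neighbor shortcut
    def nearest_ground_z(x, y):
        return min(ground, key=lambda g: (g[0] - x) ** 2 + (g[1] - y) ** 2)[2]
    return [(x, y, z - nearest_ground_z(x, y)) for x, y, z in points]

ground = [(0.0, 0.0, 100.0), (10.0, 0.0, 102.0)]   # ground returns (class 2)
pts = [(1.0, 0.0, 112.5), (9.0, 0.0, 102.0)]        # canopy return, ground return
print(normalize_heights(pts, ground))  # [(1.0, 0.0, 12.5), (9.0, 0.0, 0.0)]
```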

Metric Production

The tool for computing the metrics for the entire area as well as for the individual field plots is lascanopy. Which metrics are well suited for your particular imputation calculation is your job to determine. Maybe first compute a large number of them and then eliminate the redundant ones. Do not use any point from the tile buffers for these calculations. We had flagged them as ‘withheld’ during the lastile operation, so they are easy to drop. We also want to drop the noise points that were classified as 7. And we were planning to drop that additional diagonal flightline we noticed during quality checking. We generated two lasinfo reports with the ‘-histo point_source 1’ option for the two tiles it was covering. From the two histograms it was easy to see that the diagonal flightline has the number 31.

First we run lascanopy on the 11 plots that you can download here. When running on plots it makes sense to first create a spatial indexing with lasindex for faster querying so that only those tiny parts of the LAZ file need to be loaded that actually cover the plots.

lasindex -i tiles_normalized\*.laz ^
         -cores 3

lascanopy -i tiles_normalized\*.laz -merged ^
          -drop_withheld ^
          -drop_class 7 ^
          -drop_point_source 31 ^
          -lop WKS_PLOTS.shp ^
          -cover_cutoff 3.0 ^
          -cov -dns ^
          -height_cutoff 2.0 ^
          -c 2.0 999.0 ^
          -max -avg -std -kur ^
          -p 25 50 75 95 ^
          -b 30 50 80 ^
          -d 2.0 5.0 10.0 50.0 ^
          -o plots.csv

The resulting ‘plots.csv’ file can easily be processed in other software packages. It contains one line for each polygonal plot listed in the shapefile with its bounding box followed by all the requested metrics. But why is there a zero maximum height (marked in orange) for plots 6 through 10? All height metrics are computed solely from returns that are higher than the ‘height_cutoff’ of 2 meters. We added the ‘-c 2.0 999.0’ absolute count metric to keep track of the number of returns used in these calculations. Apparently in plots 6 through 10 there was not a single return above 2 meters, as the count (also marked in orange) is zero for all of these plots. It turns out this Eucalyptus stand had recently been harvested and the new seedlings are still shorter than 2 meters.

more plots.csv
index,min_x,min_y,max_x,max_y,max,avg,std,kur,p25,p50,p75,p95,b30,b50,b80,c00,d00,d01,d02,cov,dns
0,206260.500,7358289.909,206283.068,7358312.477,11.23,6.22,1.91,2.26,4.71,6.01,7.67,9.5,26.3,59.7,94.2,5359,18.9,41.3,1.5,76.3,60.0
1,206422.500,7357972.909,206445.068,7357995.477,13.54,7.5,2.54,1.97,5.32,7.34,9.65,11.62,26.9,54.6,92.2,7030,12.3,36.6,13.3,77.0,61.0
2,206579.501,7358125.909,206602.068,7358148.477,12.22,5.72,2.15,2.5,4.11,5.32,7.26,9.76,46.0,73.7,97.4,4901,24.8,28.7,2.0,66.8,51.2
3,206578.500,7358452.910,206601.068,7358475.477,12.21,5.68,2.23,2.64,4.01,5.14,7.18,10.04,48.3,74.1,95.5,4861,25.7,26.2,2.9,68.0,50.2
4,206734.501,7358604.910,206757.068,7358627.478,15.98,10.3,2.18,2.64,8.85,10.46,11.9,13.65,3.3,27.0,91.0,4946,0.6,32.5,44.5,91.0,77.5
5,207043.501,7358761.910,207066.068,7358784.478,15.76,10.78,2.32,3.43,9.27,11.03,12.49,14.11,3.2,20.7,83.3,4819,1.5,24.7,51.0,91.1,76.8
6,207677.192,7359630.526,207699.760,7359653.094,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.0,0.0,0.0,0,0.0,0.0,0.0,0.0,0.0
7,207519.291,7359372.366,207541.859,7359394.934,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.0,0.0,0.0,0,0.0,0.0,0.0,0.0,0.0
8,207786.742,7359255.850,207809.309,7359278.417,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.0,0.0,0.0,0,0.0,0.0,0.0,0.0,0.0
9,208159.017,7358997.344,208181.584,7359019.911,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.0,0.0,0.0,0,0.0,0.0,0.0,0.0,0.0
10,208370.909,7358602.565,208393.477,7358625.133,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.00,0.0,0.0,0.0,0,0.0,0.0,0.0,0.0,0.0
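A file like this is easy to post-process in any scripting language. For example, flagging the harvested plots by their zero return count in Python (using a shortened stand-in for the actual CSV):

```python
import csv
import io

# shortened stand-in for the plots.csv produced by lascanopy above
data = """index,max,avg,c00,cov,dns
0,11.23,6.22,5359,76.3,60.0
6,0.00,0.00,0,0.0,0.0
"""
rows = list(csv.DictReader(io.StringIO(data)))
# plots where no return was above the 2 m height cutoff
empty = [r["index"] for r in rows if int(r["c00"]) == 0]
print(empty)  # ['6']
```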

Then we run lascanopy on the entire area and produce one raster per tile for each metric. Here we remove the buffered points with the ‘-use_tile_bb’ switch, which also ensures that the produced rasters have exactly the extent of the tiles without buffers. Again, you need version 170327 (or higher) of LAStools for this to work correctly.

lascanopy -version
LAStools (by martin@rapidlasso.com) version 170327 (academic)

lascanopy -i tiles_normalized\*.laz ^
          -use_tile_bb ^
          -drop_class 7 ^
          -drop_point_source 31 ^
          -step 10 ^
          -cover_cutoff 3.0 ^
          -cov -dns ^
          -height_cutoff 2.0 ^
          -c 2.0 999.0 ^
          -max -avg -std -kur ^
          -p 25 50 75 95 ^
          -b 30 50 80 ^
          -d 2.0 5.0 10.0 50.0 ^
          -odir tile_metrics -oasc ^
          -cores 3

The resulting rasters in ASC format can easily be previewed with lasview for some “sanity checking” that our metrics make sense and to get a quick overview of what these metrics look like.

lasview -i tile_metrics\suzana_*max.asc
lasview -i tile_metrics\suzana_*p95.asc
lasview -i tile_metrics\suzana_*p50.asc
lasview -i tile_metrics\suzana_*p25.asc
lasview -i tile_metrics\suzana_*cov.asc
lasview -i tile_metrics\suzana_*d00.asc
lasview -i tile_metrics\suzana_*d01.asc
lasview -i tile_metrics\suzana_*b30.asc
lasview -i tile_metrics\suzana_*b80.asc

The maximum height rasters are worth inspecting more closely as they will immediately tell us if any high noise point slipped through the cracks. And indeed it happened, as we see a maximum of 388.55 meters for one of the 10 by 10 meter cells. As we know the expected height of the trees we could have added ‘-drop_z_above 70’ to the lascanopy command line. Be careful, however, when computing forestry metrics in strongly sloped terrain, as the terrain slope can significantly lift up returns to heights much higher than that of the tree. This is guaranteed to happen for LiDAR returns from branches that extend horizontally far over the down-sloped part of the terrain as shown in this paper here.
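The effect of such a drop filter on the per-cell maximum can be sketched in Python (hypothetical points; the 388.55 value mimics the high noise return mentioned above):

```python
def max_per_cell(points, step=10.0, drop_z_above=None):
    # per-cell maximum height, optionally dropping high outliers
    # the way a '-drop_z_above 70' filter would
    cells = {}
    for x, y, z in points:
        if drop_z_above is not None and z > drop_z_above:
            continue
        c = (int(x // step), int(y // step))
        cells[c] = max(cells.get(c, z), z)
    return cells

pts = [(3.0, 4.0, 18.5), (6.0, 2.0, 21.0),
       (5.0, 5.0, 388.55)]  # high noise return that slipped through
print(max_per_cell(pts, drop_z_above=70.0))  # {(0, 0): 21.0}
```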

We did not use the shapefile for the stands in this exercise. We could have clipped the normalized LiDAR points to these stands using lasclip as shown in the GUI below before generating the raster metrics. This would have saved space and computation time as many of the LiDAR points lie outside of the stands. However, it might be better to do that clipping step on the rasters in whichever GIS software or statistics package you are using for the imputation computation, to properly account for partly covered raster cells along the stand boundary. This could be the subject of another blog article … (-:

not all LiDAR was needed to compute metrics for