We are pleased to share this recently published article in LiDAR Magazine: "LAStools Rises from the Ashes", by Dr. A. Stewart Walker.
LAStools@INTERGEO
Development and perspective
Besides maintenance and customer support, LAStools will certainly need updates and further development to remain competitive. We are currently working on the following tasks.
New GUI
LAStools is a fundamental tool set for LiDAR data processing, but its GUI is outdated. Young professionals expect a more intuitive operation, and visualizing the point clouds helps users understand both the data and the processing. We have started to develop a new GUI wrapper. It is challenging, but certain parts are already running.

Linux
Compiling the open-source part of LAStools is quite easy and done by many users. The commercial part is more difficult to port due to various dependencies and the complexity of the code. Nevertheless, most of it is done. The binaries are too big to pack into our regular releases. If you are interested in running the commandline version of LAStools in a native Linux environment, please contact us to join our Linux beta tester program.
Manuals
LAStools are basically commandline tools optimized for batch processing. The "-help" output and the *_README.txt files of each tool provide basic information. In addition, a lot of information (forums, videos) exists that is not easy to find, especially for new users. Our aim is to collect this scattered information and write a user manual for each tool. Together with our tutorials, this will result in a complete documentation of LAStools.
We look forward to your feedback and are always open to requests.
New “LidarBC” Portal adds plenty of Precise Points to Open LiDAR in Canada, eh?
Around 86,000 square kilometres of the province of British Columbia have been mapped with LiDAR and these high-tech point clouds are now released as open data. More about this big news can be found here. We decided to take the portal for a quick spin and try to find the Computer Science building of UBC Vancouver, where we studied from 1996 to 1999. The experience was pleasant. Either go directly to the portal or visit the government page linking to it.
Then follow the steps shown in the images below to get to the data.
Use the "Layer List" menu to disable all overlays but LiDAR and crank down the transparency. Zoom in on an area of interest and make the layer more transparent again. Zoom in further until the map gives you enough detail to decide on the tile of interest. Click on that tile and an information window with a download link pops up. Click to download.
A few moments later the LiDAR tile in compressed LAZ format is yours to play with as long as you adhere to the rules of the Open Government License – British Columbia. After I downloaded the tile of UBC campus that contains the Computer Science building, I ran a quick lasinfo report whose output is shown below.
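For those following along, the report was generated with a plain lasinfo call on the downloaded tile, something along these lines:
lasinfo ^
-i bc_092g024_4_2_1_xyes_8_utm10_20170714.laz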

Everything looks fine. Unfortunately, the time stamps are in "GPS week time" instead of "Adjusted GPS Standard time", so the per-point information about which week of which year a point was collected is missing. A minor issue is the "coordinate resolution fluff" in the z coordinate: although the z scale factor is set to 0.001 for storing millimeter-precise elevations, the actual z coordinates stored in the LAZ file were rounded to full centimeters, so a scale factor of 0.01 would be more appropriate.
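If one wanted the stored resolution to match the actual centimeter precision of the data, one possible fix is las2las with its '-rescale' option. A minimal sketch (the '_cm' output suffix is just an example name):
las2las ^
-i bc_092g024_4_2_1_xyes_8_utm10_20170714.laz ^
-rescale 0.01 0.01 0.01 ^
-odix _cm -olaz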
Using the GUI of las2las I then clipped out a 500 meter by 300 meter part containing the Computer Science building,
las2las ^
-i bc_092g024_4_2_1_xyes_8_utm10_20170714.laz ^
-inside 481800 5456375 482300 5456675 ^
-odix _cut -olaz
and visualized the resulting LAZ file using lasview with the option '-spike_free 1.0', which can then be used for spike-free TIN generation by pressing the hotkey <SHIFT>+<y>.
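The corresponding lasview call could look like this (the '_cut' suffix comes from the '-odix _cut' of the previous command):
lasview ^
-i bc_092g024_4_2_1_xyes_8_utm10_20170714_cut.laz ^
-spike_free 1.0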

On the left is the Computer Science building. By hovering over two points on either end of the building and pressing <SHIFT>+<i> each time, I can measure that it is about 110 meters long. On the right we see the much more famous UBC Forestry building. Below, the same data is shown as a point cloud colored by return type, where yellow points are single returns, red points are first of many, blue points are last of many, and green points are intermediate returns.

Previously, access to LiDAR in British Columbia had been prohibitively expensive for many users. At the same time this precious data had often been sitting idle for years on government computers, never mind that it was collected with tax dollars. Now it is available to small and big businesses, non-profit organizations, recreational associations, scientists, academics and hobbyists alike, allowing maximal exploration and exploitation for many different purposes, even some that are currently unforeseen. Kudos to the Government of British Columbia for finally taking this step.
LASmoons: Olumese Efeovbokhan
Olumese Efeovbokhan (recipient of another three LASmoons)
Geosciences, School of Geography
University of Nottingham, UK
Background:
Hydrological models require various input data for flood vulnerability mapping. One important input is the DTM over which flow is routed. DTMs are generated using cartography, ground surveying, digital aerial photogrammetry, interferometric SAR (InSAR), and LiDAR, amongst other means. The accuracy of high-resolution DTMs minimizes errors that may emanate from input data when conducting hydrological modelling, especially in small built-up catchment areas. This research applies digital aerial photogrammetry to generate point clouds that can subsequently be utilized for flood vulnerability mapping.
Goal:
To build on previous gains in using LAStools to generate DTMs required for flood vulnerability mapping. The suitability of these DTMs for flood vulnerability analysis will subsequently be validated. The results will be compared with other DTMs in order to determine the uncertainty associated with the use of such DTMs for flood vulnerability mapping.
Data:
+ high-resolution photogrammetry point cloud and DSM for Lagos Island, Ikorodu and Ajah Nigeria
– – – imagery obtained with an Ebee Sensefly drone flight
– – – photogrammetry point cloud generated with Photoscan by AgiSoft
+ rainfall data
+ classified LiDAR point cloud with a resolution of 1 pulse per square meter obtained for the study area from the Lagos State Government
LAStools processing:
1) tile large photogrammetry point cloud into tiles with buffer [lastile]
2) mark set of points whose z coordinate is a certain percentile of that of their neighbors [lasthin]
3) remove isolated low points from the set of marked points [lasnoise]
4) classify marked points into ground and non-ground [lasground]
5) pull in points close above and below the ground [lasheight]
6) create Digital Terrain Model (DTM) from ground points [las2dem]
7) merge and hillshade individual raster DTMs [blast2dem]
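A rough command-line sketch of such a pipeline is given below. All directory names, step sizes and percentiles are placeholders for illustration only, not the actual parameters of this project.
rem 1) tile the photogrammetry points into buffered tiles
lastile ^
-i photogrammetry\*.laz ^
-tile_size 500 -buffer 30 ^
-odir 01_tiles -olaz
rem 2) mark points close to a low elevation percentile as class 8, all others as class 1
lasthin ^
-i 01_tiles\*.laz ^
-set_classification 1 ^
-step 1.0 -percentile 5 -classify_as 8 ^
-odir 02_thinned -olaz
rem 3) flag isolated low points among the marked points as noise (class 7)
lasnoise ^
-i 02_thinned\*.laz ^
-ignore_class 1 ^
-isolated 3 ^
-odir 03_denoised -olaz
rem 4) classify the marked points into ground (class 2) and non-ground
lasground ^
-i 03_denoised\*.laz ^
-ignore_class 1 7 ^
-town -extra_fine ^
-odir 04_ground -olaz
rem 5) pull points just below and above the ground into class 2
lasheight ^
-i 04_ground\*.laz ^
-classify_between -0.1 0.1 2 ^
-odir 05_ground_thick -olaz
rem 6) raster the ground points of each tile into a DTM
las2dem ^
-i 05_ground_thick\*.laz ^
-keep_class 2 ^
-step 0.5 -use_tile_bb ^
-odir 06_dtm_tiles -olaz
rem 7) merge and hillshade the DTM tiles
blast2dem ^
-i 06_dtm_tiles\*.laz -merged ^
-step 0.5 -hillshade ^
-o dtm_hillshade.png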
Generating DTM for “fluffy” Livox MID-40 LiDAR via “median ground” points
In the last article about the Livox MID-40 LiDAR scan of Samara we quality checked the data, aligned the flight lines and cleaned the remaining spurious scan lines. In this article we will process this data into the standard products. A focus will be on generating a smooth “median ground” surface from the “fluffy” scanner data. You can get the flight lines here and follow along with the processing after downloading LAStools. Below is one result of this work.

The first processing step will be to tile the strips into smaller tiles for faster and also parallel processing. But first, one quick "flat terrain" trick. Often there are spurious points far above or below the terrain. For a relatively flat area these can easily be identified by computing a histogram of the elevation values with lasinfo and then eliminated with simple drop filters on the z coordinate.
lasinfo ^
-i Samara\Livox\03_strips_cleaned\*.laz ^
-merged ^
-histo z 1 ^
-odir Samara\Livox\01_quality ^
-o strips_cleaned_merged_info.txt
The relevant excerpts of the output of the lasinfo report are shown below:
[…]
z coordinate histogram with bin size 1.000000
bin -104 has 1
bin 5 has 1
bin 11 has 273762
bin 12 has 1387999
bin 13 has 5598767
bin 14 has 36100225
bin 15 has 53521371
[…]
bin 59 has 60308
bin 60 has 26313
bin 61 has 284
bin 65 has 10
bin 66 has 31
bin 67 has 12
bin 68 has 1
bin 83 has 3
bin 84 has 4
bin 93 has 31
bin 94 has 93
bin 95 has 17
[…]
The few points below 11 meters and above 61 meters in elevation can be considered spurious. In the initial tiling step with lastile we add simple elevation filters to drop those points on-the-fly from the buffered tiles. The importance of buffers when processing LiDAR in tiles is discussed in this article. With lastile we create tiles of size 125 meters with a buffer of 20 meters, while removing the points identified as spurious with the appropriate filters. Because the input strips have their “file source ID” in the LAS header correctly set, we use ‘-apply_file_source_ID’ to set the “point source ID” of every point to this value. This preserves the information of which point comes from which flight line.
lastile ^
-i Samara\Livox\03_strips_cleaned\*.laz ^
-drop_z_below 11 -drop_z_above 61 ^
-apply_file_source_ID ^
-tile_size 125 -buffer 20 ^
-odir Samara\Livox\05_tiles_buffered -o desman.laz
This produces 49 buffered tiles that will now be processed similarly to the workflow outlined for another lower-priced system that generates similarly "fluffy" point clouds on hard surfaces, the Velodyne HDL-32E, described here and here. What do we mean by "fluffy"? We cut out a 1 meter slice across the road with the new '-keep_profile' filter of las2las and inspect it with lasview.
las2las ^
-i Samara\Livox\05_tiles_buffered\desman_331750_1093000.laz ^
-keep_profile 331790 1093071 331799 1093062 1 ^
-o slice.laz
lasview ^
-i slice.laz ^
-color_by_flightline ^
-kamera 0 274.922 43.7695 0.00195313 -0.0247396 1.94022 ^
-point_size 9
In the view below we pressed the hot key ']' twice to exaggerate the z scale. The "fuzziness" is the thickness of the point cloud in the middle of this flat tar road. It is around 20 to 25 centimeters and is equally evident in both flight lines. What is the correct ground surface through this 20 to 25 centimeter "thick" road? We will compute a "median ground" that roughly falls into the middle of this "fluffy" surface.
The next three lasthin runs mark a subset of low candidate points for our lasground filtering. In every 25 cm by 25 cm, every 33 cm by 33 cm, and every 50 cm by 50 cm area we reclassify the point closest to the 10th percentile as class 8. In the first call to lasthin we put all other points into class 1.
lasthin ^
-i Samara\Livox\05_tiles_buffered\desman_*.laz ^
-set_classification 1 ^
-step 0.25 -percentile 10 20 -classify_as 8 ^
-odir Samara\Livox\06_tiles_thinned_01 -olaz ^
-cores 4
lasthin ^
-i Samara\Livox\06_tiles_thinned_01\desman_*.laz ^
-step 0.3333 -percentile 10 20 -classify_as 8 ^
-odir Samara\Livox\06_tiles_thinned_02 -olaz ^
-cores 4
lasthin ^
-i Samara\Livox\06_tiles_thinned_02\desman_*.laz ^
-step 0.5 -percentile 10 20 -classify_as 8 ^
-odir Samara\Livox\06_tiles_thinned_03 -olaz ^
-cores 4
Below you can see the resulting points of the 10th percentile classified as class 8 in red.


Operating only on the points classified as 8 (i.e. ignoring those classified as 1), we then run a ground classification with lasground using the following command line, which creates a "low ground" classification.
lasground ^
-i Samara\Livox\06_tiles_thinned_03\desman_*.laz ^
-ignore_class 1 ^
-town -ultra_fine ^
-ground_class 2 ^
-odir Samara\Livox\07_tiles_ground_low -olaz ^
-cores 4
Since this is an open road this classifies most of the red points as ground points.


Using lasheight we then create a "thick ground" by pulling into the ground surface all those points that are between 5 centimeters below and 17 centimeters above the "low ground". For visualization purposes we temporarily use class 6 to capture this thickened ground.
lasheight ^
-i Samara\Livox\07_tiles_ground_low\desman_*.laz ^
-classify_between -0.05 0.17 6 ^
-odir Samara\Livox\07_tiles_ground_thick -olaz ^
-cores 4
The “thick ground” is shown below in orange.

We go back to lasthin and reclassify in every 50 cm by 50 cm area the point closest to the 50th percentile as class 8. This is what we call the “median ground”.
lasthin ^
-i Samara\Livox\07_tiles_ground_thick\desman_*.laz ^
-ignore_class 1 ^
-step 0.5 -percentile 50 -classify_as 8 ^
-odir Samara\Livox\07_tiles_ground_median -olaz ^
-cores 4
The final “median ground” points are shown in red below. These are the points we will use to eventually compute the DTM.

We complete the fully automated classification available in LAStools by running lasclassify with the following options. See the README file for what these options mean. Note that we move the “thick ground” from the temporary class 6 to the proper class 2. The “median ground” continues to be in class 8.
lasclassify ^
-i Samara\Livox\07_tiles_ground_median\desman_*.laz ^
-change_classification_from_to 6 2 ^
-rugged 0.3 -ground_offset 1.5 ^
-odir Samara\Livox\08_tiles_classified -olaz ^
-cores 4
Before the resulting tiles are published or shared with others we should remove the temporary buffers, which is done with lastile – the same tool that created the buffers.
lastile ^
-i Samara\Livox\08_tiles_classified\desman_*.laz ^
-remove_buffer ^
-odir Samara\Livox\09_tiles_final -olaz ^
-cores 4
And then we can publish the points via a Potree 3D Webportal using laspublish.
laspublish ^
-i Samara\Livox\09_tiles_final\desman_*.laz ^
-elevation ^
-title "Samara Mangroves" ^
-odir Samara\Livox\99_portal -o SamaraMangroves.html -olaz ^
-overwrite
Below is a screenshot of the resulting Potree 3D web portal rendered with Potree Desktop. Inspecting the classification reveals a number of errors that could be tweaked manually with lasview. How the point colors were generated is not described here, but I used Google satellite imagery and mapped it onto the points with lascolor. The elevation colors are mapped from 14 meters to 25 meters. The intensity image may help us understand why the black tar road on the left-hand side that runs from the "Las Palmeras Condos" to the beach in "Cangrejal" has no samples. The intensity seems lower on this side, which indicates that the drone may have flown higher here – too high for the road to reflect enough photons. The yellow view of return type indicates that despite its multi-return capability, the Livox MID-40 LiDAR is mostly collecting single returns.
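For those wondering, the coloring could be done with lascolor along these lines, assuming a geo-referenced ortho-image of the area (the file name samara_ortho.tif is made up for illustration):
lascolor ^
-i Samara\Livox\09_tiles_final\desman_*.laz ^
-image samara_ortho.tif ^
-odix _rgb -olaz ^
-cores 4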
The penetration capability of the Livox MID-40 LiDAR was not as good as we had hoped. Below thick vegetation there are too few points on the ground to give us a good digital terrain model. In the visualization below you can see that below the dense vegetation there are large black areas which are completely void of points.
Now we produce the standard DTM and DSM products at a resolution of 50 cm. Because the total area is not that big, we generate temporary tiles in "raster LAZ" with las2dem and merge them into a single GeoTIFF with blast2dem. For the DSM, we first use lasthin to keep in every 50 cm by 50 cm cell the point closest to the 95th elevation percentile and then rasterize and merge those highest points in the same manner.
las2dem ^
-i Samara\Livox\07_tiles_ground_median\desman_*.laz ^
-keep_class 8 ^
-step 0.5 -use_tile_bb ^
-odir Samara\Livox\10_tiles_dtm_50cm -olaz ^
-cores 4
blast2dem ^
-i Samara\Livox\10_tiles_dtm_50cm\*.laz -merged ^
-step 0.5 -hillshade ^
-o Samara\Livox\dtm_50cm.png
blast2dem ^
-i Samara\Livox\10_tiles_dtm_50cm\*.laz -merged ^
-step 0.5 ^
-o Samara\Livox\dtm_50cm.tif
lasthin ^
-i Samara\Livox\07_tiles_ground_median\desman_*.laz ^
-step 0.5 -percentile 95 ^
-odir Samara\Livox\11_tiles_highest -olaz ^
-cores 4
las2dem ^
-i Samara\Livox\11_tiles_highest\desman_*.laz ^
-step 0.5 -use_tile_bb ^
-odir Samara\Livox\12_tiles_dsm_50cm -olaz ^
-cores 4
blast2dem ^
-i Samara\Livox\12_tiles_dsm_50cm\*.laz -merged ^
-step 0.5 -hillshade ^
-o Samara\Livox\dsm_50cm.png
blast2dem ^
-i Samara\Livox\12_tiles_dsm_50cm\*.laz -merged ^
-step 0.5 ^
-o Samara\Livox\dsm_50cm.tif
A big “Thank You!” to Nelson Mattie from LiDAR Latinoamerica for bringing his fancy drone to Samara and to Andre Jalobeanu from Bayesmap for his help in aligning the data. You can download the flight lines here and do the above processing on your own after downloading LAStools.

Preparing Drone LiDAR from Snoopy by LidarUSA carrying a Velodyne HDL-32E
In March 2019 I was welcoming Nelson Mattie from LiDAR Latinoamerica to Samara who brought along his versatile Snoopy A-Series scanning system by LidarUSA that is based on the Velodyne HDL-32E scanner. We mounted it to his truck for a mobile scan of the core downtown block, Nelson carried it on his shoulder through “Samara Jungle” for a pedestrian scan and we strapped it onto a DJI Matrice 600 to scan this cute beach and surf town from above.
The scanning part was easy. Getting sensible data out of the ScanLookPC software proved to be quite an odyssey, as I neither had access to nor training with the ScanLookPC software. Hard surfaces such as the roof of the "New China" supermarket looked wobbly when looking at the points from individual beams of this 32-beam scanner and turned into complete fuzz when using all points from all beams.
I could only scrutinize the LAS files and found several unrelated errors, such as duplicate returns and non-unique GPS time stamps, but I was unable to fix the wobbles. Frustrated with the vendor support, I was ready to give up when out of nowhere I suddenly got an email from Luis Hernandez Perez, who also works with these systems. He sent me a link to properly exported flight strips and suggested that "it was a problem of poor GNSS signal and the solution for me was use the base station IND1 (of IGS) that was 3.2 kilometers away". After months of struggles I finally had LiDAR data from downtown Samara and look how crisp the roof of the supermarket suddenly was.

As always, my standard quality checks are running lasview, lasinfo and lasgrid, but most importantly lasoverlap, which reveals the typical flight line misalignments that we find in most airborne surveys.
lasoverlap ^
-i Samara\Drone\00_raw\*.laz -faf ^
-step 1 -min_diff 0.1 -max_diff 0.2 -no_over ^
-o Samara\Drone\01_quality\overlap_10_20_before.png
As usual, I contacted Andre Jalobeanu from Bayesmap and asked for help with the alignment. A few days later he returned a new and improved set of flight lines that he had run through his stripalign software. I then performed the same quality check with lasoverlap once again.
lasoverlap ^
-i Samara\Drone\00_raw_aligned\*.laz -faf ^
-step 1 -min_diff 0.1 -max_diff 0.2 -no_over ^
-o Samara\Drone\01_quality\overlap_10_20_after.png
Vertical differences of more than 20 centimeters are mapped to saturated blue and red, and the improvement in alignment through stripalign is impressive. Note that the road is not white because it was perfectly aligned but because there was no data. A few days before the scan, Samara got a new tar road from the municipality. As we were flying at 70 meters above ground – a little too high for this LiDAR scanner – we did not capture surfaces with low reflectivity and the fresh black tar did not reflect enough photons.
The Velodyne HDL-32E scanner rotates, shooting laser beams from 32 different heads, and the information about which return comes from which head was stored into the "user data" field and into the "point source ID" field of each point by the exporting software. The LAS format does not have a dedicated field for this information, as the supported maximum number of different laser beams is 4 in the new point types 6 through 10 of the latest LAS 1.4 specification (where this field is called the "scanner channel"). But when we use lastile to turn the flight lines into square tiles we will override the "point source ID" with the flight line number. Also, the "user data" field is a fragile place to store important information, as lasheight, for example, will store temporary data there. The "extra bytes" concept of the LAS format is perfect for storing such an additional attribute and the ASPRS LAS Working Group is currently discussing standardized "extra bytes" for exactly such laser beam IDs.
We at rapidlasso implemented this in our LAStools software a while ago. So before processing the data any further we copy the beam index, which ranges from 0 to 31, from the "user data" field into an unsigned 8-bit integer "extra byte" with these two las2las commands.
las2las ^
-i Samara\Drone\00_raw_aligned\*.laz ^
-add_attribute 1 "laser beam ID" "which beam ranged this return" ^
-odir Samara\Drone\00_raw_temp -olaz
las2las ^
-i Samara\Drone\00_raw_temp\*.laz ^
-copy_user_data_into_attribute 0 ^
-set_user_data 0 ^
-set_point_source 0 ^
-odir Samara\Drone\00_raw_ready -olaz
In a future article we will process these aligned and prepared flight lines into a number of products. We thank Nelson Mattie from LiDAR Latinoamerica, Luis Hernandez Perez, and Andre Jalobeanu from Bayesmap for helping me acquire and fix this data. Several others helped with experiments using their own software and data or contributed otherwise to the discussions in the LAStools user forum. Thanks, guys. This data will soon be available as open data, but a sample lasinfo report is already shown below.
lasinfo (210128) report for 'Samara\Drone\00_raw_ready\flightline_01.laz'
reporting all LAS header entries:
file signature: 'LASF'
file source ID: 1
global_encoding: 0
project ID GUID data 1-4: 00000000-0000-0000-0000-000000000000
version major.minor: 1.2
system identifier: 'LAStools (c) by rapidlasso GmbH'
generating software: 'las2las (version 210315)'
file creation day/year: 224/2019
header size: 227
offset to point data: 527
number var. length records: 2
point data format: 1
point data record length: 29
number of point records: 6576555
number of points by return: 6576555 0 0 0 0
scale factor x y z: 0.001 0.001 0.001
offset x y z: 661826 1092664 14
min x y z: 660986.622 1092595.013 11.816
max x y z: 661858.813 1092770.575 95.403
variable length header record 1 of 2:
reserved 0
user ID 'ScanLook'
record ID 25
length after header 0
description 'ScanLook Point Cloud'
variable length header record 2 of 2:
reserved 0
user ID 'LASF_Spec'
record ID 4
length after header 192
description 'by LAStools of rapidlasso GmbH'
Extra Byte Descriptions
data type: 1 (unsigned char), name "laser beam ID", description: "which beam ranged this return", scale: 1 (not set), offset: 0 (not set)
LASzip compression (version 3.4r3 c2 50000): POINT10 2 GPSTIME11 2 BYTE 2
reporting minimum and maximum for all LAS point record entries …
X -839378 32813
Y -68987 106575
Z -2184 81403
intensity 4 255
return_number 1 1
number_of_returns 1 1
edge_of_flight_line 0 0
scan_direction_flag 0 0
classification 1 1
scan_angle_rank -12 90
user_data 0 0
point_source_ID 0 0
gps_time 537764.192416 537854.547507
attribute0 0 31 ('laser beam ID')
number of first returns: 6576555
number of intermediate returns: 0
number of last returns: 6576555
number of single returns: 6576555
overview over number of returns of given pulse: 6576555 0 0 0 0 0 0
histogram of classification of points:
6576555 unclassified (1)
Strip Aligning of Drone LiDAR flown with Livox MID-40 over destroyed Mangrove
September 11th 2020 seemed like a fitting day to hunt down – with a powerful drone – those who destroy our common good. The latest DJI M300 RTK drone came to visit me in Samara, Guanacaste, Costa Rica and it was carrying the gAirHawk GS-MID40 UAV laser scanning system by Geosun featuring the light-weight Livox Mid 40 LiDAR. The drone is owned and operated by my friends at LiDAR Latinoamerica.
We flew a two-sortie mission covering a destroyed mangrove lagoon that was illegally poisoned, cut down and filled in some 25 years ago with the intention to construct a fancy resort in its place. For future environmental work I wanted to get a high-resolution baseline scan with detailed topography of the meadow and what nowadays remains of the mangroves that are part of the adjacent "Rio Lagarto" estuary. Recently the area was illegally treated with herbicides to eliminate the native herbs and the wild flowers and improve grazing conditions for cattle. Detailed topography can show how the heavy rains have washed these illegal substances into the ocean and further prove that the application of agro-chemicals in this meadow causes harm to marine life.
Here you can see a sequence of videos about the LiDAR system, the preparations and the survey flights. Shortly after the flight I obtained the LiDAR from Nelson Mattie, the CEO of LiDAR Latinoamerica, and ran the usual quality checks with LAStools.
lasinfo ^
-i Samara\Livox\00_raw_laz\*.laz ^
-histo intensity 16 ^
-histo gps_time 10 ^
-histo z 5 ^
-odir Samara\Livox\01_quality -odix _info -otxt ^
-cores 3
lasgrid ^
-i Samara\Livox\00_raw_laz\*.laz ^
-utm 16north ^
-merged ^
-keep_last ^
-step 0.5 ^
-density ^
-false -set_min_max 100 1000 ^
-odir Samara\Livox\01_quality ^
-o density_050cm_100_1000.png
For the density image, lasgrid counts how many last returns from all flight lines fall into each 50 cm by 50 cm area, computes the density per square meter, and maps this number to a color ramp that goes from blue via cyan, yellow and orange to red. The overall density of our survey is in the hundreds of laser pulses per square meter, with great variations where flight lines overlap and at the survey boundary. The start and landing area, as well as the place where the first flight ended and the second flight started, are the two red spots of maximum density that can easily be picked out.

lasoverlap ^
-i Samara\Livox\00_raw_laz\*.laz ^
-utm 16north ^
-merged -faf ^
-step 0.5 ^
-min_diff 0.10 -max_diff 0.25 ^
-elevation -lowest ^
-odir Samara\Livox\01_quality ^
-o overlap_050cm_10cm_25cm.png
For the overlap image, lasoverlap counts how many different flight lines overlap each 50 cm by 50 cm area and maps the counter to a color: 1 = blue, 2 = cyan, 3 = yellow, 4 = orange, and 5 or more = red. Here the result suggests that the 27 delivered LAS files do not actually correspond to the logical flight lines but that the files are chopped up in some other way. We will have Andre Jalobeanu from Bayesmap repair this for us later.

For the difference image, lasoverlap finds the maximal vertical difference between the lowest points from all flight lines that overlap each 50 cm by 50 cm area and maps it to a color. If this difference is less than 10 cm, the area is colored white. Differences of 25 cm or more are colored either red or blue. All open areas such as roads, meadows and rooftops should be white, but here there is way too much red and blue in areas that are wide open or on rooftops.
We inspect this in further detail by taking a closer look at some of these red and blue areas. For this we first spatially index the strips with lasindex so that area-of-interest queries are accelerated, then load the strips into the GUI of lasview and add the difference image into the background via the overlay option.
lasindex ^
-i Samara\Livox\00_raw_laz\*.laz ^
-tile_size 10 -maximum -100 ^
-cores 3
lasview ^
-i Samara\Livox\00_raw_laz\*.laz ^
-gui
This makes it easy to view with lasview or clip out (with las2las) those areas that look especially troublesome. We do this here for the large condominium "Las Palmeras", whose roofline and pool provide perfect features to illustrate the misalignment. As you can see in the image sequence below, there is a horizontal shift of up to 1 meter that can be nicely visualized with a cross section drawn perpendicular across the gable of the roof and – due to the inability to get returns from water – in the area without points where the pool is.
The misalignments between flight lines are too big for the data to be useful as is, so we do what we always do when we have this problem: We write an email to Andre Jalobeanu from Bayesmap and ask for help.
After receiving the LAZ files and the trajectory file Andre repaired the misalignment in two steps. The first call to his software stripalign in mode ‘-cut’ recovered a proper set of flight lines and removed most of the LiDAR points from the moments when the drone was turning. The second call to his software stripalign in mode ‘-align’ computed the amount of misalignment in this set of flight lines and produced a new set of flight lines with these errors corrected as much as possible. The results are fabulous.
lasmerge ^
-i Samara_MID40\*.laz ^
-o samaramid40.laz
stripalign ^
-uav -cut ^
-i samaramid40.laz ^
-po Samara_MID40\*.txt -po_parse ntxyzwpk ^
-G2 -cut_dist 50 ^
-O Samara_MID40\cut
stripalign ^
-uav -align ^
-i Samara_MID40\cut\*.laz ^
-po Samara_MID40\*.txt -po_parse ntxyzwpk ^
-A -G2 -full -smap -rmap -sub 2 ^
-O Samara_MID40\corr
As you can see above, the improvements are incredible. The data now seems sufficiently aligned to be useful for processing into a number of products. One last thing to do is the removal of spurious scan lines that seem to stem from an unusual movement of the drone, like the beginning or the end of a turn.
We use lasview with the option '-load_gps_time' to determine the GPS time stamps of these spurious scan lines and remove them manually using las2las with the option '-drop_gps_time_between t1 t2' or similar. As the points are ordered in acquisition order, we can simply replay the flight by pressing 'p' and step forward and backward with 's' and 'S'.
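A bare-bones inspection call could be as simple as this, using one of the aligned strips:
lasview ^
-i Samara\Livox\02_strips_aligned\samaramid40_c_13_i_13.laz ^
-load_gps_time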

Once we have determined a suitable set of GPS times to remove from a flight line, we first verify our findings once more visually using lasview before actually creating the final cut with las2las.
lasview ^
-i Samara\Livox\02_strips_aligned\samaramid40_c_13_i_13.laz ^
-drop_gps_time_below 283887060 ^
-drop_gps_time_above 283887123 ^
-filtered_transform ^
-set_classification 8 ^
-color_by_classification

las2las ^
-i Samara\Livox\02_strips_aligned\samaramid40_c_13_i_13.laz ^
-drop_gps_time_below 283887060 ^
-drop_gps_time_above 283887123 ^
-odix _cut -olaz
After spending several hours manually removing these spurious scan lines, as well as deciding to remove a few short scan lines in areas of excessive overlap, we have a sufficiently aligned and cleaned data set to start the actual post-processing.
A big “Thank You!” to Andre Jalobeanu from Bayesmap for his help in aligning the data and to Nelson Mattie from LiDAR Latinoamerica for bringing his fancy drone to Samara. You can download the data here.

Swiss add liberal “Open LiDAR” and break with conservative stereotypes like Bank Secrecy, Yodeling and Punctuality
My new favorite Swiss Miss, Viola Amherd, said today "Geodaten gehören heute zur Infrastruktur wie die Straßen und die Eisenbahn", which translates to "today, geodata are part of the infrastructure like roads and railways". The federal councilor of Switzerland announced that programmers and planners, whether private or professional, can now download the data free of charge and use it for their projects. There are almost no limits to innovation for information projects.
Hello Germany? Hello Bavaria? Hello Austria? Where is your Open-Data-Autobahn? Whatever happened to unlimited speeds and endless Fahrvergnügen … (-;
This means that large amounts of LiDAR in Switzerland are now available for download from this portal here. There is amazing data there, including high-resolution ortho-photos and land cover data. However, we went straight for the LAS files and got ourselves a few tiles near the Sazmartinshorn in the example below.
The tiles can be selected in a number of ways via an interactive map and come as individually zipped LAS files. We downloaded the nine tiles indicated above (2.16 GB), unzipped the bulky LAS files (4.78 GB) and compressed them to the compact LAZ format (638 MB) with laszip. Using LAZ instead of zipped LAS would lower storage size and transmission bandwidth by a factor of 3.5. Something the stereotypically frugal people of Switzerland may want to consider … (-;
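The conversion from bulky LAS to compact LAZ can be done with laszip, roughly like this (the input directory name is made up for illustration):
laszip ^
-i swisssurface3d_las\*.las ^
-odir swisssurface3d_laz -olaz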
Then we process the data with a few typical command lines with the result shown below. The first uses blast2dem to create a hill shaded 1 meter DTM from the points classified as ground.
blast2dem ^
-i swisssurface3d_laz\*.laz -merged ^
-keep_class 2 -thin_with_grid 0.5 ^
-step 1.0 -hillshade ^
-o swiss_dtm_1m_hillshade.jpg
We use lasgrid to visualize the varying last return density per 2 meter by 2 meter area across the surveyed area with a false coloring that maps 10 or fewer pulses per square meter to blue and 20 or more pulses per square meter to red.
lasgrid ^
-i swisssurface3d_laz\*.laz -merged ^
-keep_last ^
-step 2.0 ^
-density ^
-false -set_min_max 10 20 ^
-o swiss_density_2m_10_20.jpg
A lasinfo report reveals that the scanner used was a RIEGL and the returning pulse width was quantized in tenths of a nanosecond into the user data field. We use lasgrid to visualize the range of the pulse width between 4.0 and 6.0 nanoseconds with a false coloring. Make sure to drop the points for which no pulse width was recorded (i.e. user data is zero) to avoid artifacts in the visualization.
lasgrid ^
-i swisssurface3d_laz\*.laz -merged ^
-drop_user_data 0 ^
-keep_last ^
-step 2.0 ^
-user_data -lowest ^
-false -set_min_max 40 60 ^
-o swiss_pulsewidth_40_60.jpg

Finally we created a portal with laspublish to visualize the point cloud data interactively with Potree. The four screenshots below highlight only a few of the abilities for visualizing and measuring the point cloud.
laspublish ^
-i swisssurface3d_laz\*.laz ^
-elevation ^
-odir swisssurface3d_portal ^
-title Sazmartinshorn ^
-o sazmartinshorn.html ^
-olaz -overwrite
The open data license can be found here and we are hereby naming the source.

Scanning Samara with LiDAR and Images using Helicopter of Stereocarto
After weeks of planning, the helicopter finally came. Since we have been cooperating with Stereocarto's San Jose office for quite some time, we were finally able to lure their helicopter down to Samara. Taking off from Liberia, it flew about an hour back and forth across Samara and the adjacent beaches. Processing this data will take weeks (if not months) to complete and analyzing it even longer. One of the first applications will be the usual extraction of the bare-earth terrain topography, followed by the detection of large changes in comparison to our 2012 data. Such changes might be the result of permitted construction or earthworks, but are often illegal remodeling or removal of mountains to "create" building lots or the filling in of wetland areas with dirt and debris to "gain" land. Finding and quantifying such environmental damage and reporting about the process will be the primary objective of this project.

