Converting Rasters from inefficient ASCII XYZ to more compact LAZ or TIF Formats

The German state of Brandenburg has recently started to provide much of their basic geospatial data as open data, such as digital orthophotos in TIF and JPG formats, vertical and horizontal control points in gzipped XML format, LOD1 and LOD2 building models in zipped GML format, topographic maps from 1:10000 to 1:100000 in zipped TIF and PDF formats, cadastral data in zipped XML and TIF formats, as well as LiDAR-derived 1m DTM rasters and image-derived 1m DSM rasters, both in zipped ASCII XYZ format. All this data is provided under the user-friendly license called “Datenlizenz Deutschland Namensnennung 2.0”. In this article we show how to convert the 1m DTM rasters and the 1m DSM rasters from verbose ASCII XYZ to more compact LAZ or TIF rasters.

Four 2000 by 2000 meter tiles of the Brandenburg 1m DTM. 

One particularity about most official German and Austrian rasters (anywhere else?) is that they sample the elevations at the corners rather than at the center of each raster cell. Hence a one square kilometer raster tile of 1 meter resolution will have 1001 columns by 1001 rows instead of the more familiar 1000 by 1000 layout. While this corner-based representation does have some benefits, we convert these rasters into the more common area-based representation using new functionality recently added to lasgrid.
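
To make the difference between the two representations concrete, here is a minimal NumPy sketch (an illustration only, not what lasgrid does internally) that turns a corner-based tile into an area-based one by averaging the four corner samples of each cell – which is conceptually the result the lasgrid-based conversion further below produces:

import numpy as np

# A corner-based 1 km by 1 km tile at 1 meter resolution has 1001 by 1001
# samples on the full-meter grid (random stand-in values for illustration).
corners = np.random.rand(1001, 1001).astype(np.float32)

# The area-based equivalent averages the four corner samples of every cell,
# yielding 1000 by 1000 values associated with the cell centers.
centers = 0.25 * (corners[:-1, :-1] + corners[1:, :-1] +
                  corners[:-1, 1:] + corners[1:, 1:])

print(centers.shape)  # (1000, 1000)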

After downloading one sample DTM tile such as dgm_33250-5886.zip we find three files in the zip archive: two files with metadata and license information and the actual data file, which is a 2 km by 2 km corner-based raster tile called “dgm_33250-5886.xyz” with 2001 columns by 2001 rows. Here is how its 4004001 lines look:

more DGM_33250-5886.xyz
250000.0 5886000.0 15.284
250001.0 5886000.0 15.277
250002.0 5886000.0 15.273
250003.0 5886000.0 15.275
250004.0 5886000.0 15.289
250005.0 5886000.0 15.314
[...]
251994.0 5888000.0 13.565
251995.0 5888000.0 13.567
251996.0 5888000.0 13.565
251997.0 5888000.0 13.565
251998.0 5888000.0 13.564
251999.0 5888000.0 13.564
252000.0 5888000.0 13.565

The first step is to convert these XYZ rasters to LAZ format. We do this with txt2las as shown below. If the vertical datum is the “Deutsches Haupthoehennetz 2016” we should probably also add ‘-vertical_dhhn2016’, but we are not sure about that at the moment:

txt2las -i dgm\*.xyz ^
        -set_scale 1.0 1.0 0.001 ^
        -epsg 25833 ^
        -odir temp -olaz ^
        -cores 4

For 84 files this reduces the size by a factor of 31 or compresses it down to 3.2 percent of the original, namely from 8.45 GB for raw XYZ to 277 MB for LAZ. So far we have really just converted a list of x, y and z coordinates from verbose ASCII to more compact LAZ. We can easily go back to ASCII with las2txt whenever needed:

las2txt -i temp\*.laz ^
        -odir ascii -otxt ^
        -cores 4

Next we use lasgrid to convert from a corner-based raster to an area-based raster using the new option ‘-subsquare 0.25’, which replaces each input point by four points that are displaced by all combinations of adding +/- 0.25 in x and y. We then average the exactly four points that fall into each relevant raster cell with option ‘-average’ and clip the output to the meaningful 2000 columns by 2000 rows with ‘-use_tile_size 2000’. You need the most recent version of LAStools to have these options.

lasgrid -i temp\*.laz ^
        -subsquare 0.25 ^
        -step 1 -average ^
        -use_tile_size 2000 ^
        -odir dgm -olaz ^
        -cores 4

Instead of RasterLAZ you can also choose the TIF, BIL, IMG, or ASC format here. The final results are standard 1 meter elevation products with 2000 columns by 2000 rows, with the averaged elevation sample being associated with the center of each raster cell. The lasinfo report for a sample tile is shown at the end of this article.

You may proceed to optimize the RasterLAZ for area-of-interest queries by reordering the raster into a space-filling curve with lassort or lasoptimize and by computing a spatial index. You may also classify the RasterLAZ elevation samples, for example, into building, high, medium, and low vegetation, ground, and other common classifications with lasclip or lascolor. You may also add RGB or intensity values to the RasterLAZ elevation samples with lascolor, using the orthophotos that are also available as open data. These are some of the benefits of RasterLAZ beyond efficient storage and access.

We would like to acknowledge the LGB (Landesvermessung und Geobasisinformation Brandenburg) for providing state-wide coverage of their geospatial data holdings as easily downloadable open data with the user-friendly Datenlizenz Deutschland Namensnennung 2.0 license. But we would also like to ask that the raw LiDAR point clouds please be added to the open data portal. The storage savings in going from ASCII XYZ to LAZ for the DTM and DSM rasters should free up enough space to host the LiDAR … (-;

lasinfo (200112) report for 'dgm_33\DGM_33250-5886.laz'
reporting all LAS header entries:
  file signature:             'LASF'
  file source ID:             0
  global_encoding:            0
  project ID GUID data 1-4:   00000000-0000-0000-0000-000000000000
  version major.minor:        1.2
  system identifier:          'raster compressed as LAZ points'
  generating software:        'LAStools (c) by rapidlasso GmbH'
  file creation day/year:     13/20
  header size:                227
  offset to point data:       455
  number var. length records: 2
  point data format:          0
  point data record length:   20
  number of point records:    4000000
  number of points by return: 4000000 0 0 0 0
  scale factor x y z:         0.5 0.5 0.001
  offset x y z:               200000 5800000 0
  min x y z:                  250000.5 5886000.5 13.419
  max x y z:                  251999.5 5887999.5 33.848
variable length header record 1 of 2:
  reserved             0
  user ID              'Raster LAZ'
  record ID            7113
  length after header  80
  description          'by LAStools of rapidlasso GmbH'
    ncols   2000
    nrows   2000
    llx   250000
    lly   5886000
    stepx    1
    stepy    1
    sigmaxy <not set>
variable length header record 2 of 2:
  reserved             0
  user ID              'LASF_Projection'
  record ID            34735
  length after header  40
  description          'by LAStools of rapidlasso GmbH'
    GeoKeyDirectoryTag version 1.1.0 number of keys 4
      key 1024 tiff_tag_location 0 count 1 value_offset 1 - GTModelTypeGeoKey: ModelTypeProjected
      key 3072 tiff_tag_location 0 count 1 value_offset 25833 - ProjectedCSTypeGeoKey: ETRS89 / UTM 33N
      key 3076 tiff_tag_location 0 count 1 value_offset 9001 - ProjLinearUnitsGeoKey: Linear_Meter
      key 4099 tiff_tag_location 0 count 1 value_offset 9001 - VerticalUnitsGeoKey: Linear_Meter
LASzip compression (version 3.4r3 c2 50000): POINT10 2
reporting minimum and maximum for all LAS point record entries ...
  X              100001     103999
  Y              172001     175999
  Z               13419      33848
  intensity           0          0
  return_number       1          1
  number_of_returns   1          1
  edge_of_flight_line 0          0
  scan_direction_flag 0          0
  classification      0          0
  scan_angle_rank     0          0
  user_data           0          0
  point_source_ID     0          0
number of first returns:        4000000
number of intermediate returns: 0
number of last returns:         4000000
number of single returns:       4000000
overview over number of returns of given pulse: 4000000 0 0 0 0 0 0
histogram of classification of points:
         4000000  never classified (0)

Removing Noise from Single Photon LiDAR to Generate a Smooth DTM

A while back we had a first look at the Single Photon LiDAR from Leica’s SPL100 sensor (that eventually turned out just to be an SPL99 because one beamlet or one receiver in the 10 by 10 array was broken and did not produce any returns). Today we are taking a closer look at a strategy to remove the excessive noise in the raw Single Photon LiDAR data from a “proper” SPL100 sensor (where all of the 100 beamlets are firing) that was flown in 2017 in Navarra, Spain.

Profile through original points on top of generated DTM.

The data was provided as open data by the cartography section of Navarra’s Government and is available via a simple download FTP portal. We describe the LAStools processing steps that were used to eliminate the excessive noise and to generate a smooth DTM. In the following we are using the originally released version of the data, which we obtained shortly after the portal went online and which seems to be a bit more “raw” than the files currently available. A standard quality check with lasinfo was done with:

lasinfo -i 0_raw\*.laz ^
        -cd ^
        -histo intensity 1 ^
        -histo user_data 1 ^
        -histo point_source 1 ^
        -histo gps_time 10 ^
        -odir 1_quality -odix _info -otxt

Upon inspecting the lasinfo report we suggest a few changes in how to store this Single Photon LiDAR data for more efficient hosting via an online portal. We perform these changes here before starting the actual processing. First we use the las2las call shown below to fix an error in the global encoding bits, remove an irrelevant VLR, re-scale the coordinates from millimeters to centimeters, re-offset the coordinates to nice numbers, and – what is by far the most crucial change for better compression – remap the beamlet ID stored in the ‘user data’ field as described in an earlier article.

las2las -i 0_raw\*.laz ^
        -rescale 0.01 0.01 0.01 ^
        -auto_reoffset ^
        -set_global_encoding_gps_bit 1 ^
        -remove_vlr 1 ^
        -map_user_data beamlet_ID_map.txt ^
        -odir 2_fix_rescale_reoffset_remap -olaz ^
        -cores 3

Then we use two lassort calls, one to maximize compression and one to improve spatial coherence. One lassort call rearranges the points in increasing order first based on the GPS time stamps, then breaks ties based on the user data field (that stores the beamlet ID), and finally stores the returns of every beamlet ordered by return number. We also add spatial reference information in this step. The other lassort call rearranges the points into a spatially coherent layout. It uses a Z-order sort with the granularity of 50 meter by 50 meter buckets of points. Within each bucket the point order from the prior sort is kept.

lassort -i 2_fix_rescale_reoffset_remap\*.laz ^
        -epsg 25830 ^
        -gps_time ^
        -user_data ^
        -return_number ^
        -odir 2_maximum_compression -olaz ^
        -cores 3

lassort -i 2_maximum_compression\*.laz ^
        -bucket_size 50 ^
        -odir 2_spatial_coherence -olaz ^
        -cores 3

The resulting nine optimized tiles are around 200 MB each and can be downloaded as one file here or as individual tiles here.

Now we start the usual processing workflow by tiling the data with lastile into smaller 500 meter by 500 meter tiles with a 25 meter buffer. We also set the pre-existing point classification in the data to zero as we will compute our own later.

lastile -i 2_spatial_coherence\*.laz ^
        -set_classification 0 ^
        -tile_size 500 -buffer 25 -flag_as_withheld ^
        -odir 3_buffered -o yecora.laz

We notice that a large amount of the noise has intensity values below 1000. We are still a bit puzzled where those intensity values come from and what exactly they mean in a Single Photon LiDAR system. But it works. We run las2las with a “filtered transform” to set the classification of all points whose intensity value is below 1000 to classification code 7 (aka “noise”).

las2las -i 3_buffered\*.laz ^
        -keep_intensity_below 1000 ^
        -filtered_transform ^
        -set_classification 7 ^
        -odir 4_intensity_denoised -olaz ^
        -cores 3

We then ignore this “easy-to-identify” noise and go after the remaining noise with lasnoise by ignoring classification code 7 and setting the newly identified noise to classification code 9 – not because it is “water” (the usual meaning of class 9) but because these points are drawn with a distinct blue color when checking the result with lasview.

lasnoise -i 4_intensity_denoised\*.laz ^
         -ignore_class 7 ^
         -step_xy 1.0 -step_z 0.2 ^
         -isolated 5 ^
         -classify_as 9 ^
         -odir 4_isolation_denoised -olaz ^
         -cores 3
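
For readers curious about what such an isolation criterion does, below is a simplified Python sketch of a grid-based isolation filter (our own illustration with a hypothetical function name, not the actual lasnoise implementation): points are binned into coarse 3D cells of step_xy by step_xy by step_z and a point is flagged when its 3 by 3 by 3 cell neighborhood contains only a handful of points.

import numpy as np
from collections import Counter

def flag_isolated(xyz, step_xy=1.0, step_z=0.2, isolated=5):
    # Bin the points into a coarse 3D grid of step_xy by step_xy by step_z cells.
    cells = np.floor(xyz / np.array([step_xy, step_xy, step_z])).astype(np.int64)
    counts = Counter(map(tuple, cells))
    noise = np.zeros(len(xyz), dtype=bool)
    for i, c in enumerate(map(tuple, cells)):
        # Count all points in the 3 by 3 by 3 cell neighborhood (including this point).
        neighborhood = sum(counts.get((c[0] + dx, c[1] + dy, c[2] + dz), 0)
                           for dx in (-1, 0, 1)
                           for dy in (-1, 0, 1)
                           for dz in (-1, 0, 1))
        if neighborhood <= isolated:
            noise[i] = True
    return noise

# Example: three clustered points and one far-away outlier.
pts = np.array([[0.1, 0.1, 0.05], [0.2, 0.1, 0.06],
                [0.15, 0.2, 0.04], [50.0, 50.0, 30.0]])
print(flag_isolated(pts, isolated=2))  # only the last point is flagged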

From the surviving non-noise points we then use lasthin to reclassify the point closest to the 20th elevation percentile per 50 cm by 50 cm area with classification code 8 (for all areas that have more than 5 non-noise points per 50 cm by 50 cm area). We repeat the same for every 1 meter by 1 meter area.

lasthin -i 4_isolation_denoised\*.laz ^
        -ignore_class 7 9 ^
        -step 0.5 -percentile 20 5 ^
        -classify_as 8 ^
        -odir 5_thinned_p20_050cm -olaz ^
        -cores 3

lasthin -i 5_thinned_p20_050cm\*.laz ^
        -ignore_class 7 9 ^
        -step 1.0 -percentile 20 5 ^
        -classify_as 8 ^
        -odir 5_thinned_p20_100cm -olaz ^
        -cores 3

We then perform a second, more aggressive noise removal step with lasnoise using only those points with classification code 8, namely those non-noise points that were closest to the 20th elevation percentile in either a 50 cm by 50 cm cell or a 1 meter by 1 meter cell. This is done by ignoring classification codes 0, 7, and 9. We mark the newly found noise points as 6 so they appear orange in the point cloud with lasview.

lasnoise -i 5_thinned_p20_100cm\*.laz ^
         -ignore_class 0 7 9 ^
         -step_xy 2.0 -step_z 0.2 ^
         -isolated 1 ^
         -classify_as 6 ^
         -odir 5_thinned_p20_100cm_denoised -olaz ^
         -cores 3

The 20th elevation percentile points that survive the last noise removal are then classified into ground (2) and non-ground (1) points with lasground_new by ignoring all other points, namely those with classification codes 0, 6, 7, and 9.

lasground_new -i 5_thinned_p20_100cm_denoised\*.laz ^
              -ignore_class 0 6 7 9 ^
              -town ^
              -odir 5_tiles_ground_050cm -olaz ^
              -cores 3

The images below illustrate the steps we took. They also show that not all data was used and might give you ideas where to tweak our workflow for even better results.

Next we rasterize the ground points into 1 meter Digital Terrain Model (DTM) rasters with las2dem and store the results (without the buffers) in RasterLAZ format.

las2dem -i 5_tiles_ground_050cm\*.laz ^
        -keep_class 2 ^
        -step 1.0 ^
        -use_tile_bb ^
        -odir 6_tiles_dtm_100cm -olaz ^
        -cores 3

Finally we merge all RasterLAZ tiles into one and compute the final hillshaded DTM with blast2dem.

blast2dem -i 6_tiles_dtm_100cm\*.laz -merged ^
          -step 1.0 ^
          -hillshade ^
          -o yecora_dtm_100cm.png

The hillshaded DTM that is the result of the entire sequence of processing steps described above is shown below.

DTM from ground classification created with LAStools

For comparison we generate the same DTM using the originally provided classification. According to the README file the original ground points are classified with code 22 in areas of flight line overlap and with the usual code 2 elsewhere. Hence we must use both classification codes to construct the DTM. We do this analogously to the earlier processing steps with the three LAStools commands lastile, las2dem, and blast2dem below.

lastile -i 2_spatial_coherence\*.laz ^
        -tile_size 500 -buffer 25 -flag_as_withheld ^
        -odir 3_tiles_buffered_orig -o yecora.laz

las2dem -i 3_tiles_buffered_orig\*.laz ^
        -keep_class 2 22 ^
        -step 1.0 ^
        -use_tile_bb ^
        -odir 6_tiles_dtm_100cm_orig -olaz ^
        -cores 3

blast2dem -i 6_tiles_dtm_100cm_orig\*.laz -merged ^
          -step 1.0 ^
          -hillshade ^
          -o yecora_dtm_100cm_orig.png

Below is the hillshaded DTM generated from the ground classification that was provided with the LiDAR when it was originally released as open data.

DTM from ground classification of originally released data.

In the meantime Navarra’s SPL data has been updated with a newer version in the open data portal. The new version of the data contains a much better ground classification that might have been improved manually, as the new files now have the string ‘cam’ instead of ‘ca’ in the file name, which probably means ‘classified automatically and manually’ instead of the original ‘classified automatically’. We decided not to switch to the new data release as it seemed less “raw” than the original release. For example, there are suddenly points with GPS times, return counts, and return numbers of zero in the files that seem synthetic. But we also computed the hillshaded DTM for the new release, which is shown below.

DTM from ground classification of newly released data.

We thank the cartography section of Navarra’s Government for providing their LiDAR as open data. This not only allows re-purposing expensive data paid for by public taxes but also generates additional value, encourages citizen science, and provides educational opportunity and insights such as this blog article.

Another European Country Opens LiDAR: Welcome to the Party, Slovakia!

We got a little note from Vítězslav Moudrý from CULS pointing out that the Geodesy, Cartography and Cadastre Authority of the Slovak Republic has started releasing LiDAR as open data on their interactive Web portal. Congratulations, Slovakia!!! Welcome to the Open Data Party!!! We managed to download some data starting from this Web portal link and describe the process of obtaining LiDAR data from the Low Tatras mountain range in central Slovakia with pictures below.

(1) click the new “data export” link

(2) change the export selection to “Shape”

(3) change the file format to “LAZ”

(4) zoom to a colored area-of-interest

(5) zoom further and draw a nice polygon

(6) edit the polygon into a nice shape and realize the heart is red because the area is too big

(7) zoom further and draw a polygon smaller than 2 square kilometers

(8) when polygon turns green, accept license, enter email address and export

(9) short wait and you get download link to such an archive

(10) license conditions: PDF auto-translated from Slovak to English

(11) the LiDAR comes as spatially indexed flight lines clipped to the area-of-interest

(12) all return density: blue = 20 and red = 50 returns per square meter

lasgrid -i LowTatras\*.laz -merged ^
        -step 2 -point_density_16bit ^
        -false -set_min_max 20 50 ^
        -o LowTatras\density_all_returns_20_50.png

(13) last return density: blue = 4 and red = 40 last returns per square meter

lasgrid -i LowTatras\*.laz -merged ^
        -keep_last ^
        -step 2 -point_density_16bit ^
        -false -set_min_max 4 40 ^
        -o LowTatras\density_last_returns_4_40.png

(14) ground return density: blue = 4 and red = 40 ground returns per square meter

lasgrid -i LowTatras\*.laz -merged ^
        -keep_classification 2 ^
        -step 2 -point_density_16bit ^
        -false -set_min_max 4 40 ^
        -o LowTatras\density_ground_returns_4_40.png

(15) flight line difference image: white <= +/- 10 cm and red/blue >= +/- 20 cm

lasoverlap -i LowTatras\*.laz -faf ^
           -drop_classification 7 18 ^
           -min_diff 0.1 -max_diff 0.2 ^
           -o LowTatras\overlap_10cm_20cm.png

Finally we compute a DSM and a corresponding DTM from the already existing ground classification with BLAST, using the command sequence shown below.

lasthin -i LowTatras\*.laz -merged ^
        -drop_classification 7 18 ^
        -step 0.5 -highest ^
        -o LowTatras\highest_50cm.laz

blast2dem -i LowTatras\highest_50cm.laz ^
          -hillshade ^
          -o LowTatras\dsm_1m_hillshaded.png

blast2dem -i LowTatras\*.laz -merged ^
          -keep_classification 2 ^
          -thin_with_grid 0.5 ^
          -hillshade ^
          -o LowTatras\dtm_1m_hillshaded.png

We thank the Geodesy, Cartography and Cadastre Authority of the Slovak Republic for providing their LiDAR as open data for both commercial and non-commercial purposes and name the source of the data used above (as the license requires) as the Office of Geodesy, Cartography and Cadastre of the Slovak Republic (GCCA SR) or – in Slovak – the Úrad geodézie, kartografie a katastra Slovenskej republiky (ÚGKK SR).

Which European country goes next? Czech Republic? Poland? Hungary? Switzerland?

Completeness and Correctness of Discrete LiDAR Returns per Laser Pulse fired

Again and again we have preached about the importance of quality checking when you first get your expensive LiDAR data from the vendor or your free LiDAR data from an open data portal. The minimal quality check we usually advocate consists of lasinfo, lasvalidate, lasoverlap, and lasgrid. The information computed by these LAStools can reassure you that the data contains the right information, conforms to the specification, has properly aligned flight lines, and has the density distribution you expect. For deliveries or downloads in LAZ format we additionally recommend running laszip with the option ‘-check’ to find the rare file that might have gotten bit-corrupted or truncated during the transfer or the download. Today we learn about a more advanced quality check that can be done by running lassort followed by lasreturn.

For every laser shot fired there are usually between one and five discrete LiDAR returns, and some full-waveform systems may even deliver up to fifteen returns. Each of these one to fifteen returns is then given the exact same GPS time stamp that corresponds to the moment in time the laser pulse was fired. By having these unique GPS time stamps we can always recover the set of returns that come from the same laser shot. This makes it possible to check completeness (are all the returns in the file?) and correctness (is the return numbering correct?) for the discrete returns of each laser pulse.
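
The idea behind such a check can be sketched in a few lines of Python (a simplified illustration with a hypothetical function name, not the actual lasreturn implementation): group the returns by GPS time stamp and verify that each group agrees on the number of returns and that the return numbers form the complete sequence 1, 2, …, n without duplicates.

import numpy as np
from itertools import groupby

def check_pulses(gps_time, return_number, number_of_returns):
    # Sort return indices by GPS time stamp so that groupby sees each pulse as
    # one consecutive group of returns.
    order = sorted(range(len(gps_time)), key=lambda i: gps_time[i])
    missing = duplicate = mismatched = 0
    for _, grp in groupby(order, key=lambda i: gps_time[i]):
        idx = list(grp)
        r = sorted(return_number[i] for i in idx)
        n = {number_of_returns[i] for i in idx}
        if len(n) != 1:
            mismatched += 1   # returns disagree on 'number of returns of given pulse'
        elif len(set(r)) != len(r):
            duplicate += 1    # e.g. two first returns with the same time stamp
        elif r != list(range(1, n.pop() + 1)):
            missing += 1      # the return numbering has gaps
    return missing, duplicate, mismatched

# Example: the second pulse has two first returns sharing one time stamp.
t = np.array([10.0, 10.0, 10.5, 10.5])
r = np.array([1, 2, 1, 1])
n = np.array([2, 2, 1, 1])
print(check_pulses(t, r, n))  # (0, 1, 0)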

Showing all sets of returns in the file that do not have a unique GPS time stamp because the set has one or more duplicate returns (e.g. two first returns, two second returns, … ).

With LAStools we can do this by running lassort followed by lasreturn for any LiDAR that comes from a single-beam system. For LiDAR that comes from a multi-beam system, such as the Velodyne 16, 32, 64, or 128, the Optech Pegasus, the RIEGL LMS 1560 (aka “crossfire”), or the Leica ALS70 or ALS80, we first need to separate the files into one file per beam, which can be done with lassplit. In the following we investigate data coming from an Optech Galaxy single-beam system. First we sort the returns by GPS time stamp using lassort (this step can be omitted if the data is already sorted in acquisition order, i.e. by increasing GPS time stamps) and then we check the return numbering with lasreturn:

lassort -i L001-1-M01-S1-C1_r.laz -gps_time -odix _sorted -olaz

lasreturn -i L001-1-M01-S1-C1_r_sorted.laz -check_return_numbering
checked returns of 11809046 multi and 8585573 single return pulses. took 26.278 secs
missing: 0 duplicate: 560717 too large: 0 zero: 0
duplicate
========
200543 returns with n = 1 and r = 1 are duplicate
80548 returns with n = 2 and r = 1 are duplicate
80548 returns with n = 2 and r = 2 are duplicate
41962 returns with n = 3 and r = 1 are duplicate
41962 returns with n = 3 and r = 2 are duplicate
41962 returns with n = 3 and r = 3 are duplicate
13753 returns with n = 4 and r = 1 are duplicate
13753 returns with n = 4 and r = 2 are duplicate
13753 returns with n = 4 and r = 3 are duplicate
13753 returns with n = 4 and r = 4 are duplicate
3636 returns with n = 5 and r = 1 are duplicate
3636 returns with n = 5 and r = 2 are duplicate
3636 returns with n = 5 and r = 3 are duplicate
3636 returns with n = 5 and r = 4 are duplicate
3636 returns with n = 5 and r = 5 are duplicate
WARNING: there are 59462 GPS time stamps that have returns with different number of returns

The output we see above indicates a problem in the return numbering. Recently added options to lasreturn allow reclassifying those returns that are part of a problematic set of returns, namely a set that contains missing returns, duplicate returns, or returns with different values for the “number of returns of given pulse” attribute. This allows us to visualize the issue with lasview. All returns that are part of a problematic set are shown in the image above.

lasreturn -i L001-1-M01-S1-C1_r_sorted.laz ^
          -check_return_numbering ^
          -classify_as 8 ^
          -classify_duplicate_as 9 ^
          -classify_violation_as 7 ^
          -odix _marked -olaz

This command will mark all sets of returns (i.e. returns that have the exact same GPS time stamp) that have missing returns as 8, that have duplicate returns as 9, and that have returns with differing “number of returns per pulse” attributes as 7. The data we have here has no missing returns (no returns are classified as 8) but it does have duplicate (9) and violating (7) returns. We conclude by looking at them closely in single scan lines.

It immediately becomes obvious that the same GPS time stamp was assigned to the returns of pairs of subsequent shots. If the subsequent shots have the same number of returns per shot they are classified as duplicate (9 or blue). If the subsequent shots have different numbers of returns per shot they are marked as violating (7 or violet), but the reason for the issue is the same. We can look at a few of these return sets in ASCII. Here are two subsequent four-return shots that have the same GPS time stamp.

237881.011730 4 1 691602.736 5878246.425 141.992 6 79
237881.011730 4 2 691602.822 5878246.415 141.173 6 89
237881.011730 4 3 691603.051 5878246.389 138.993 6 44
237881.011730 4 4 691603.350 5878246.356 136.150 6 169
237881.011730 4 1 691602.793 5878246.439 142.037 6 114
237881.011730 4 2 691602.883 5878246.429 141.185 6 96
237881.011730 4 3 691603.109 5878246.404 139.033 6 50
237881.011730 4 4 691603.414 5878246.370 136.129 6 137

Here is a four-return shot followed by a three-return shot that have the same GPS time stamp.

237881.047753 4 1 691603.387 5878244.501 140.187 6 50
237881.047753 4 2 691603.602 5878244.476 138.141 6 114
237881.047753 4 3 691603.776 5878244.456 136.490 6 60
237881.047753 4 4 691603.957 5878244.436 134.767 6 116
237881.047753 3 1 691603.676 5878244.492 138.132 6 97
237881.047753 3 2 691603.845 5878244.473 136.534 6 90
237881.047753 3 3 691604.034 5878244.452 134.739 6 99

It appears the GPS time counter in the LMS export software did not store the GPS time with sufficient resolution to always distinguish subsequent shots. The issue was confirmed by Optech and was already fixed a few months ago.

We should point out that these double-used GPS time stamps have zero impact on the geometric quality of the point cloud or the distribution of returns. The drawback is that not all returns can easily be grouped into one unique set per laser shot and that the files are not entirely specification conform. Any software that relies on accurate and unique GPS time stamps (such as flight line alignment software) may potentially struggle as well. The bug of the twice-used GPS time stamps was a discovery that is probably of such low consequence that no user of Optech Galaxy data had noticed it in the 4 years that Galaxy had been sold … until we really really scrutinized some data from one of our clients. Optech reports that the issue has been fixed now. But there are other vendors out there with even more serious GPS time and return numbering issues … to be continued.

Another German State Goes Open LiDAR: Saxony

Finally some really good news out of Saxony. 😊 After North Rhine-Westphalia and Thuringia released the first significant amounts of open geospatial data in Germany in a one-two punch in January 2017, we now have a third German state opening their entire tax-payer-funded geospatial data holdings to the tax-paying public via a simple and very easy-to-use online download portal. Welcome to the open data party, Saxony!!!

Currently available via the online portal are the LiDAR-derived raster Digital Terrain Model (DTM) at 1 meter resolution (DGM 1m) for everything flown since 2015 and at 2 meter resolution (DGM 2m) or 20 meter resolution (DGM 20m) for the entire state. The horizontal coordinates use UTM zone 33 with ETRS89 (aka EPSG code 25833) and the vertical coordinate uses the “Deutsches Haupthöhennetz 2016” or “DHHN2016” (aka EPSG code 7837). Also available are orthophotos at 20 cm (!!!) resolution (DOP 20cm).

Overview of current LiDAR holdings. Areas flown 2015 or later have LAS files and 1 meter rasters. Others have LiDAR as ASCII files and lower resolution rasters.

Offline – by ordering through either this online form or that online form – you can also get the 5 meter DTM and the 10 meter DTM, the raw LiDAR point clouds, LiDAR intensity rasters, hill-shaded DTM rasters, as well as the 1 meter and the 2 meter Digital Surface Model (DSM) for a small administrative fee that ranges between 25 EUR and 500 EUR depending on the effort involved.

Our immediate thought is to get a copy of the entire raw LiDAR point clouds (available as LAS 1.2 files for all data acquired since 2015 and as ASCII text for earlier acquisitions) and find some portal willing to host this data online. We are already in contact with the land survey of Saxony to discuss this option and/or alternate plans.

Let’s have a look at the data. First we download four 2 km by 2 km tiles of the 1 meter DTM raster for an area surrounding the so-called “Greifensteine” using the interactive map of the download portal, which provides them as simple XYZ text. Here is a look at the contents of one of these tiles:

more Greifensteine\333525612_dgm1.xyz
352000 5613999 636.26
352001 5613999 636.27
352002 5613999 636.28
352003 5613999 636.27
352004 5613999 636.24
[...]

Note that the elevations are not sampled in the center of every 1 meter by 1 meter cell but exactly on the full meter coordinate pair, which seems especially common in German-speaking countries. Using txt2las we convert these XYZ rasters to LAZ format and add geo-referencing information for more efficient subsequent processing.

txt2las -i greifensteine\333*_dgm1.xyz ^
        -set_scale 1 1 0.01 ^
        -epsg 25833 ^
        -olaz

Below you see that going from XYZ to LAZ reduces the amount of data from 366 MB to 10.4 MB, meaning that the data on disk becomes over 35 times smaller. The ability of LASzip to compress elevation rasters was first noted during the search for the missing airliner MH370 and resulted in our new LAZ-based compressor for height grids called DEMzip. The resulting LAZ files now also include geo-referencing information.

96,000,000 333525610_dgm1.xyz
96,000,000 333525612_dgm1.xyz
96,000,000 333545610_dgm1.xyz
96,000,000 333545612_dgm1.xyz
384,000,000 bytes

2,684,820 333525610_dgm1.laz
2,590,516 333525612_dgm1.laz
2,853,851 333545610_dgm1.laz
2,795,430 333545612_dgm1.laz
10,924,617 bytes

Using blast2dem we then create a hill-shaded version of the 1 meter DTM in order to overlay a visual representation of the DTM onto Google Earth.

blast2dem -i greifensteine\333*_dgm1.laz ^
          -merged ^
          -step 1 ^
          -hillshade ^
          -o greifensteine.png

Below the result that nicely shows how the penetrating laser of the LiDAR allows us to strip away the forest to see interesting geological features in the bare-earth terrain.

In a second exercise we use the available RGB orthophoto images to color one of the DTM tiles and explore it using lasview. For this we download the image for the top left of the four tiles that covers the area containing the “Greifensteine” from the interactive download portal for orthophotos. As the resolution of the TIF image is 20 cm and that of the DTM is only 1 meter, we first down-sample the TIF using gdalwarp of GDAL.

gdalwarp -tr 1 1 ^
         -r cubic ^
         greifensteine\dop20c_33352_5612.tif ^
         greifensteine\dop1m_33352_5612.tif

If you are not yet using GDAL today is a good day to start. It nicely complements the point cloud processing functionality of LAStools for raster inputs. Next we use lascolor to give each elevation pixel of the DTM stored in LAZ format its corresponding color from the orthophoto.

lascolor -i greifensteine\333525612_dgm1.laz ^
         -image greifensteine\dop1m_33352_5612.tif ^
         -odix _rgb -olaz

Now we can view the colored DTM in LAZ format interactively with lasview or any other LiDAR viewing software and turn on the RGB colors from the orthophoto as needed to understand the scene.

lasview -i greifensteine\333525612_dgm1_rgb.laz

We thank the “Staatsbetrieb Geobasisinformation und Vermessung Sachsen (GeoSN)” for giving us easy access to the 1 meter DTM and the 20 cm orthophoto that we have used in this article through their new open geodata portal as open data under the user-friendly license “Datenlizenz Deutschland – Namensnennung – Version 2.0”.

National Open LiDAR Strategy of Latvia humiliates Germany, Austria, and other European “Closed Data” States

Latvia, officially the Republic of Latvia, is a country in the Baltic region of Northern Europe with around 2 million inhabitants, a territory of 65 thousand square kilometers and – since recently – also a fabulous open LiDAR policy. Here is a list of 65939 tiles in LAS format available for free download that cover the entire country with airborne LiDAR at a density of 4 to 6 pulses per square meter. The data is classified into ground, building, vegetation, water, low noise, and a few other classifications. It is licensed Creative Commons CC0 1.0 – meaning that you can copy, modify, and distribute the data, even for commercial purposes, all without asking permission. And there is a simple and functional interactive download portal where you can easily download individual tiles.

Interactive open LiDAR download portal of Latvia.

We downloaded the 5 by 5 block of square kilometer tiles matching “4311-32-XX.las” for checking the quality and creating a 1m DTM and a 1m DSM raster. You can follow along after downloading the latest version of LAStools.

Quality Checking

We first run lasvalidate and lasinfo on the downloaded LAS files and then immediately compress them with laszip because multi-core processing of uncompressed LAS files will quickly overwhelm our file system, make processing I/O bound, and result in overall longer processing times with CPUs waiting idly for data to be loaded from the drives.

lasinfo -i 00_tiles_raw\*.las ^
        -compute_density ^
        -histo z 5 ^
        -histo intensity 256 ^
        -histo user_data 1 ^
        -histo scan_angle 1 ^
        -histo point_source 1 ^
        -histo gps_time 10 ^
        -odir 01_quality -odix _info -otxt ^
        -cores 3
lasvalidate -i 00_tiles_raw\*.las ^
            -no_CRS_fail ^
            -o 01_quality\report.xml

Despite already excluding a missing Coordinate Reference System (CRS) from being a reason to fail (the lasinfo reports show that the downloaded LAS files do not have any geo-referencing information) lasvalidate still reports a few failing files, but scrutinizing the resulting XML file ‘report.xml’ shows only minor issues.

Usually during laszip compression we do not alter the contents of a file, but here we also add the EPSG code 3059 for CRS “LKS92 / Latvia TM” as we turn bulky LAS files into slim LAZ files so we don’t have to specify it in all future processing steps.

laszip -i 00_tiles_raw\*.las ^
       -epsg 3059 ^
       -cores 2

Compression reduces the total size of the 25 tiles from over 4.1 GB to below 0.6 GB.

Next we use lasgrid to visualize the last return density, which corresponds to the pulse density of the LiDAR survey. We map each 2 by 2 meter pixel where the last return density is 2 or less to blue and each 2 by 2 meter pixel where it is 8 or more to red.

lasgrid -i 00_tiles_raw\*.laz ^
        -keep_last ^
        -step 2 ^
        -density_16bit ^
        -false -set_min_max 2 8 ^
        -odir 01_quality -odix _d_2_8 -opng ^
        -cores 3

This we follow by the mandatory lasoverlap check for flight line overlap and alignment where we map the number of overlapping swaths as well as the worst vertical difference between overlapping swaths to a color that allows for quick visual quality checking.

lasoverlap -i 00_tiles_raw\*.laz ^
           -step 2 ^
           -min_diff 0.1 -max_diff 0.2 ^
           -odir 01_quality -opng ^
           -cores 3

The results of the quality checks with lasgrid and lasoverlap are shown below.

Raster Derivative Generation

Now we first use las2dem to create a Digital Terrain Model (DTM) and a Digital Surface Model (DSM) in RasterLAZ format and then use blast2dem to create merged and hill-shaded versions of both. Because we will use on-the-fly buffering to avoid edge effects along tile boundaries, we first spatially index the data using lasindex for more efficient access to the points from neighboring tiles.

lasindex -i 00_tiles_raw\*.laz ^
         -cores 3

las2dem -i 00_tiles_raw\*.laz ^
        -keep_class 2 9 ^
        -buffered 25 ^
        -step 1 ^
        -use_orig_bb ^
        -odir 02_dtm_1m -olaz ^
        -cores 3

blast2dem -i 02_dtm_1m\*.laz ^
          -merged ^
          -hillshade ^
          -step 1 ^
          -o dtm_1m.png

las2dem -i 00_tiles_raw\*.laz ^
        -drop_class 1 7 ^
        -buffered 10 ^
        -spike_free 1.5 ^
        -step 1 ^
        -use_orig_bb ^
        -odir 03_dsm_1m -olaz ^
        -cores 3

blast2dem -i 03_dsm_1m\*.laz ^
          -merged ^
          -hillshade ^
          -step 1 ^
          -o dsm_1m.png

Because the overlaid imagery does not look as nice in our new Google Earth installation, below are the DTM and DSM down-sampled to 25% of their original size.

Many thanks to SunGIS from Latvia who tweeted us about the Open LiDAR after we chatted about it during the Foss4G 2019 gala dinner. Kudos to the Latvian Geospatial Information Agency (LGIA) for implementing a modern national geospatial policy that created opportunity for maximal return of investment by opening the expensive tax-payer funded LiDAR data for re-purposing and innovation without barriers. Kudos!

Removing Low Noise in LiDAR Points with Median Ground Surface

Recently a user of LAStools asked a question in our user forum about how to classify LiDAR data that contains lots of low noise. A sample screen shot of the user’s failed attempt to correctly classify the noise using lasnoise and the ground with lasground is shown below: red points are noise, brown points are ground, and grey points are unclassified. In this article we show how to remove this low noise using a temporary ground surface that we construct from a subset of points at a certain elevation percentile. You can follow along by downloading the data and the sequence of command lines used.

example of mis-classified low noise points: ground points (brown) below ground

Download the LiDAR data set that was apparently flown with a RIEGL “crossfire” Q1560. You can also download the command line sequence here. We first run lasinfo with option ‘-compute_density’ (or ‘-cd’ for short) to get a rough idea about the last return density, which is quite high with an average of over 31 last returns per square meter. We then use lasthin to classify one last return per 1 meter by 1 meter grid cell with the temporary classification code 8, namely the one whose elevation is closest to the 20th percentile within that cell. We then repeat this command line for the 30th, 40th, and 50th percentiles, modifying the command line accordingly. You must use this version of lasthin that will be part of a future LAStools release, as the options ‘-ignore_first_of_many’ and ‘-ignore_intermediate’ were just added this weekend.

lasthin -i crossfire.laz ^
        -ignore_first_of_many -ignore_intermediate ^
        -step 1 ^
        -percentile 20 15 ^
        -classify_as 8 ^
        -odix _p20 -olaz
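
The per-cell percentile selection performed by lasthin here can be sketched conceptually as follows (a simplified Python illustration with a hypothetical function name, not the actual lasthin implementation): collect the points of each grid cell, compute the requested elevation percentile, and pick the point whose elevation is closest to it, skipping cells with too few points.

import numpy as np
from collections import defaultdict

def closest_to_percentile(xyz, step=1.0, percentile=20, minimum=15):
    # Collect the point indices of every step by step grid cell.
    cells = defaultdict(list)
    for i, (x, y, z) in enumerate(xyz):
        cells[(int(x // step), int(y // step))].append(i)
    picked = []
    for idx in cells.values():
        if len(idx) < minimum:
            continue  # skip cells with too few points
        z = np.array([xyz[i][2] for i in idx])
        target = np.percentile(z, percentile)
        # Pick the actual point whose elevation is closest to the percentile.
        picked.append(idx[int(np.argmin(np.abs(z - target)))])
    return picked

# Example: pick the point nearest the 20th elevation percentile per 1 m cell.
pts = np.random.rand(1000, 3) * [5.0, 5.0, 2.0]
print(len(closest_to_percentile(pts, step=1.0, percentile=20, minimum=15)))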

Below you see the resulting subset of points marked with the temporary classification code 8 for the four different percentiles 20th, 30th, 40th, and 50th triangulated into a surface and hill-shaded.

Next we reclassify only those points marked with the temporary classification code 8 into ground (2) and unclassified (1) points using lasground by ignoring all points that still have the original classification code 0.

lasground -i crossfire_p20.laz ^
          -ignore_class 0 ^
          -wilderness ^
          -odix g -olaz

Below you see the resulting ground points computed from the subsets of points at four different percentiles 20th, 30th, 40th, and 50th triangulated into a surface and hill-shaded.

Both the ground classification from the 40th percentile and the one from the 50th percentile look reasonable. Only a few down spikes remain in the 40th percentile surface and a few additional bumps appear in the 50th percentile surface. Next we use lasheight with those two reasonable-looking ground surfaces to classify all points that are more than 20 centimeters below the triangulated ground surface into the noise classification code 7.

lasheight -i crossfire_p40g.laz ^
          -classify_below -0.2 7 ^
          -do_not_store_in_user_data ^
          -odix h -olaz

Now that the low noise points were removed (or rather classified as noise) we start the actual ground classification process. In this example we want to create a 50 cm DTM, hence it is more than sufficient to find one ground point per 25 cm by 25 cm cell. Therefore we first move the lowest non-noise last return of every 25 cm by 25 cm cell to the temporary classification code 8.

Side note: One might also consider modifying the following workflow to run the ground classification on more than just the last returns by omitting ‘-ignore_first_of_many’ and ‘-ignore_intermediate’ from the lasthin call and by adding ‘-all_returns’ to the lasground call. Why? Because for all laser shots that resulted in a low noise point, this noise point will usually be the last return, so that the true ground hit could be the second-to-last return.

lasthin -i crossfire_p40gh.laz ^
        -ignore_first_of_many -ignore_intermediate ^
        -ignore_class 7 ^
        -step 0.25 ^
        -lowest ^
        -classify_as 8 ^
        -odix _low25 -olaz

The final ground classification is obtained by running lasground only on the points with temporary classification code 8 by ignoring all others, namely the noise points (7) and the unclassified points (0 and 1).

lasground -i crossfire_p40gh_low25.laz ^
          -ignore_class 0 1 7 ^
          -wilderness ^
          -odix g -olaz

We then use las2dem to create the 50 cm DTM from the points classified as ground. We store this DTM raster in the LAZ format, which has proven to be the most efficient format for storing elevation or height rasters. We have started calling this format RasterLAZ. It is supported by all LAStools and the new DEMzip tool. One advantage is that we can feed RasterLAZ directly back into LAStools, for example as done below, for a second call to las2dem that computes a hill-shaded DTM.

las2dem -i crossfire_p40gh_low25g.laz ^
        -keep_class 2 ^
        -step 0.5 ^
        -ocut 9 -odix _dtm50 -olaz

las2dem -i crossfire_p40_dtm50.laz ^
        -step 0.5 ^
        -hillshade ^
        -odix _hill -opng

Below are the resulting hill-shaded DTMs computed for the 40th and the 50th elevation percentile – as well as for the 45th elevation percentile that we have added for comparison.

Below we finally take a closer look at an example 1 meter profile line through the LiDAR classified by the 45th percentile workflow. There is a small stretch of ground points that was incorrectly classified as noise points (find the mouse cursor) so it might be worthwhile to change parameters slightly to make the noise classification less aggressive.

Side note follow-up: The return coloring shows that there are indeed some ‘intermediate’ as well as some ‘first of many’ returns just where we expect the bare terrain to be. However, there are not so many that the results can be expected to change drastically by including them in the ground finding process.