Pre-Processing Mobile Rail LiDAR with LAStools

The majority of LAStools users process airborne LiDAR. That should not surprise, as airborne is by far the most common form of LiDAR in terms of square kilometers covered. The availability of LiDAR as “open data” is also pretty much restricted to airborne surveys, which are often tax-payer funded and then distributed freely to achieve maximum return on investment.

But folks are increasingly using our software to do some of the “heavy lifting” for mobile LiDAR, either mounted on a truck for scanning cities or on a train for capturing railroad infrastructure. The LiDAR collected for the cities of Budapest and Singapore, for example, was pre-processed by multi-core scripted LAStools when the scanning trucks returned with their daily trajectories worth of point clouds captured by a RIEGL VMX-450 mobile mapping system.

One customer who was recently scanning railroad infrastructure wanted to do automatic ground classification as a first step prior to further segmentation of the data. We were asked for advice because, on such data, the standard settings of lasground left too many patches of ground unclassified. Also, the uniform tiling that lastile generates by default is not a good way to break such data into manageable pieces, given the drastically varying point densities of mobile scanning.

We obtained a 217 MB file in LAZ format with 40 million points corresponding to a 2.7 km stretch of railway track. We first run a quick lasindex (with the options suggested for mobile scans) on the file to create a spatial indexing LAX file with at most 10 meter resolution. This not only allows faster area-of-interest queries but also gives us, in the GUI of LAStools, a more detailed preview of where the LiDAR points actually are than just their bounding box.

Presence of LAX files results in the actual extent of the LiDAR being shown in the GUI.

lasindex -i segment.laz -tile_size 10 -maximum -100

We then run lastile four times to create an adaptive tiling in which no tile has more than 6 million points. The first call creates the initial 1000 by 1000 meter tiles. The following three calls refine all those tiles that still have more than 6 million points first into 500 by 500 meter, then 250 by 250 meter, and finally 125 by 125 meter tiles in parallel on 4 cores. Note the ‘-refine_tiling’ option is used in the first call to lastile and the ‘-refine_tiles’ option in all subsequent calls.

lastile -i segment.laz ^
        -tile_size 1000 ^
        -buffer 10 -flag_as_withheld ^
        -refine_tiling 6000000 ^
        -odir tiles_raw -o rail.laz
lastile -i tiles_raw\*_1000.laz ^
        -flag_as_withheld ^
        -refine_tiles 6000000 ^
        -olaz ^
        -cores 4
lastile -i tiles_raw\*_500.laz ^
        -flag_as_withheld ^
        -refine_tiles 6000000 ^
        -olaz ^
        -cores 4
lastile -i tiles_raw\*_250.laz ^
        -flag_as_withheld ^
        -refine_tiles 6000000 ^
        -olaz ^
        -cores 4

The resulting tiles all have fewer than 6 million points but still have the initial 10 meter buffer that was specified by the first call to lastile. Two tiles were sufficiently small after the 1st call, three tiles after the 2nd call, eleven tiles after the 3rd call, and three tiles after the 4th.
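
A quick way to double-check the point counts per tile is to generate a small text report for each tile with lasinfo. This is just an optional sanity check and merely one possible invocation:

lasinfo -i tiles_raw\*.laz ^
        -odir tiles_raw -otxt ^
        -cores 4

The “number of point records” entry in each report should confirm that no tile exceeds 6 million points.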

Points of one adaptive tile (highlighted in blue in the tiling below), colored by intensity.

Adaptive tiling created with four calls to lastile. The scale factor of 0.00025 (see mouse cursor) implies that point coordinates are stored with quarter millimeter resolution. Coarsening the scale factors to 0.001 (millimeter resolution) would result in better compression and lower I/O.
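
If such a rescaling were desired, it could be done with the ‘-rescale’ option of las2las; a minimal sketch (the output folder name is just a placeholder, and in this particular workflow we left the resolution untouched):

las2las -i tiles_raw\*.laz ^
        -rescale 0.001 0.001 0.001 ^
        -odir tiles_rescaled -olaz ^
        -cores 4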

Noise in the data – especially low noise – can lead lasground into choosing the wrong points during ground classification by latching on to those low noise points. We first classify the noise points into a different class (7) using lasnoise so we can later ignore them. These particular settings were found by experimenting on a few tiles with different values (see the README file) until visual inspection showed that most low points had been classified as noise.

lasnoise -i tiles_raw\*.laz ^
         -step_xy 0.5 -step_z 0.1 ^
         -odir tiles_denoised -olaz ^
         -cores 4

Noise points shown in violet.

The points classified as noise will not be considered as ground points during the next step. For this it matters little that lamp posts, wires, or vegetation are wrongly marked as noise for now. We can always undo their noise classification once the ground points have been classified. What is important is that the points indicated by the mouse cursor, which are below the desired ground, are excluded from consideration during the ground classification step. Here those low points are not actually noise but returns generated wherever the laser was able to “peek” through an opening to a lower surface.

lasground -i tiles_denoised\*.laz ^
          -ignore_class 7 ^
          -step 1 -sub 3 -bulge 0.1 -spike 0.1 -offset 0.02 ^
          -odir tiles_ground -olaz ^
          -cores 4

For classification with lasground there are a number of options to play with (see the README file), but the most important is the correct step size. It is the terrain along the railway track bed that is supposed to be represented well. The usual steps of 5 to 40 meters for lasground aim at the removal of vegetation and man-made structures from airborne LiDAR; they are not the right choice here. A step of 1 and the parameters shown above give us the ground shown below.

Classification of terrain along railway track bed using lasground with ‘-step 1’
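
Once the ground points are classified, the temporary noise classification of lamp posts, wires, or vegetation could be undone again, as mentioned above. A minimal sketch, assuming the ‘-change_classification_from_to’ option of las2las (the output folder name is a placeholder):

las2las -i tiles_ground\*.laz ^
        -change_classification_from_to 7 1 ^
        -odir tiles_cleaned -olaz ^
        -cores 4

Note that this blanket reset also moves the genuinely low points out of class 7; whether that is desirable depends on the subsequent segmentation steps.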

The new ‘-flag_as_withheld’ option in lastile that flags each point in the buffer with the withheld flag is useful in case we want to remove all buffer points on-the-fly, for example, in order to create a DTM hillshade of 25 cm resolution for a visual quality check of the entire 2.7 km track using blast2dem from the BLAST extension of LAStools.

blast2dem -i tiles_ground\*.laz -merged ^
          -drop_withheld -keep_class 2 ^
          -hillshade -step 0.25 ^
          -o dtm_hillshaded.png

Small 600 x 600 pixel detail of hill-shaded 5663 x 9619 pixel DTM raster generated by blast2dem.

The dArc Force Awakens: ESRI escalates LiDAR format war

The empire has not changed their evil ways, despite an encouraging email from ESRI’s founder and president Jack Dangermond in response to the Open Letter by the OSGeo that was delivered to ESRI, OGC, and the ASPRS. After the incredible backlash from the LiDAR community over the release of their “LAZ clone”, there was a new hope that unnecessary format fragmentation could be avoided by working together within the Point Cloud Domain Working Group of the OGC. In fact, only one thing happened: ESRI went silent on the controversy. They temporarily stopped promoting their “LAZ clone” and focused on locking in more content.

The message of the rebellion has been consistent and clear, as in these two videos from the TC meeting of the OGC in Nottingham and the ASPRS side bar in Reno: a roadmap forward to avoid format fragmentation by exploiting the “natural break” in the format due to LAS 1.4. But there was zero technical contribution from ESRI during the past three PC-DWG meetings of the OGC. The slide sets that bored the audiences in Boulder and in Nottingham were not meant to contribute but merely stalled for time. Recently in Sydney ESRI was awfully quiet, knowing they were doing the exact opposite of what the OGC stands for. And now the empire strikes back.

There is a dArc force awakening that threatens the peace within the LiDAR community. ESRI has just released a new tool (see above) that enslaves point clouds by converting them from the open LAZ format to the near-identical but closed “LAZ clone” that they call “zLAS” or “Optimized LAS”. This comes just a few months after an entire nation‘s LiDAR was enslaved in this proprietary format. We have repeatedly warned about the ramifications of locking up Petabytes of LiDAR data in a closed format that is controlled by a single vendor.

ESRI is one of the largest GIS training organizations. By instructing LiDAR novices to “optimize” their LiDAR files and pushing LiDAR providers to switch from open LAS or open LAZ to closed zLAS, they effectively destroy the current success of our open formats. ESRI’s command of the GIS market can – little by little – turn their own proprietary format into the dominant way in which LiDAR point clouds are stored. Then we lose our open exchange formats. Hence, ESRI’s proprietary format threatens all that we have achieved with LAS (and LAZ) over the past years: compatible LiDAR data exchange and incredible LiDAR software interoperability.

ESRI is now escalating the LiDAR format wars. Join the rebellion, Jedis: download your lazer sabers and liberate some LiDAR.

This is not an anti-ESRI campaign. For the past three years we have been trying to resolve this situation. We have repeatedly reached out to ESRI to prevent format fragmentation. We have repeatedly offered to create a joint compressed format. We have pleaded, begged, and bargained for the sake of our LiDAR community and the sake of their ArcGIS user community not to promote a near-identical yet incompatible way for storing massive amounts of point cloud data.

RIEGL Becomes LASzip Sponsor for LAS 1.4 Extension

PRESS RELEASE (for immediate release)
August 31, 2015
rapidlasso GmbH, Gilching, Germany

We are happy to announce that RIEGL Laser Measurement Systems, Austria has become a sponsor of the award-winning LASzip compressor. Their contribution at the Silver level will kick-off the actual development phase of the “native LAS 1.4 extension” that had been discussed with the LiDAR community over the past two years. This “native extension” for LAS 1.4 complements the existing “compatibility mode” for LAS 1.4 that was supported by Gold sponsor NOAA and Bronze sponsors Quantum Spatial and Trimble Geospatial. The original sponsor who initiated and financed the open sourcing of the LASzip compressor was USACE – the US Army Corps of Engineers (see http://laszip.org).

The existing “LAS 1.4 compatibility mode” in LASzip was created to provide immediate support for compressing the new LAS 1.4 point types by rewriting them as old point types and storing their new information as “Extra Bytes”. As an added side-benefit this has allowed legacy software without LAS 1.4 support to readily read these newer LAS files as most of the important fields of the new point types 6 to 10 can be mapped to fields of the older point types 1, 3, or 5.
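
As an illustration, producing such a compatibility-mode file from a LAS 1.4 input might look like the sketch below; the name of the switch is our assumption, so please verify it against the laszip README:

rem the '-compatible' flag name is an assumption - check the laszip README
laszip -i lidar_14.las ^
       -compatible ^
       -o lidar_14_compatible.laz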

In contrast, the new “native LAS 1.4 extension” of LASzip that is now sponsored in part by RIEGL will utilize the “natural break” in the format due to the new point types of LAS 1.4 to introduce entirely new features such as “selective decompression”, “rewritable classifications and flags”, “integrated spatial indexing”, … and other functionality that has been brain-stormed with the community since rapidlasso GmbH had issued the open “call for input” on native LASzip compression for LAS 1.4 in January 2014. We invite you to follow the progress or contribute to the development via the discussions in the “LAS room“.

About rapidlasso GmbH:
Technology powerhouse rapidlasso GmbH specializes in efficient LiDAR processing tools that are widely known for their high productivity. They combine robust algorithms with efficient I/O and clever memory management to achieve high throughput for data sets containing billions of points. The company’s flagship product – the LAStools software suite – has deep market penetration and is heavily used in industry, government agencies, research labs, and educational institutions. Visit http://rapidlasso.com for more information.

About RIEGL:
Austria-based RIEGL Laser Measurement Systems is a performance leader in research, development and production of terrestrial, industrial, mobile, bathymetric, airborne and UAS-based laser scanning systems. RIEGL’s innovative hardware and software provides powerful solutions for nearly all imaginable fields of application. Worldwide sales, training, support and services are delivered from RIEGL‘s Austrian headquarters and its offices in Vienna, Salzburg, and Styria, main offices in the USA, Japan, and China, and by a worldwide network of representatives covering Europe, North and South America, Asia, Australia and Africa. Visit http://riegl.com for more information.

Use Buffers when Processing LiDAR in Tiles !!!

We often process LiDAR in tiles for two reasons: first, to keep the number of points per file low and use main memory efficiently, and second, to speed up the computation with parallel tile processing and keep all cores of a modern CPU busy. However, it is very (!!!) important to take the necessary precautions to avoid “edge artifacts” when processing LiDAR in tiles. We have to include points from neighboring tiles during certain LAStools processing steps to avoid edge artifacts. Why? Here is an illustration from our PHIL LiDAR tour earlier this year:

Buffers are important to avoid edge artifacts along tile boundaries during DTM creation.

What you see is the temporary TIN of ground points created (internally) by las2dem or blast2dem that is then rastered at the user-specified step size onto a grid. Without a buffer (right side) there will not always be a triangle to cover every pixel. Especially in the corners of the tile you will often find empty pixels. Furthermore the poorly shaped “sliver triangles” along the boundary of the TIN do not interpolate the ground elevations properly. In contrast, with buffer (left side) the TIN generously covers the entire area that is to be rastered with nicely shaped triangles.

Christmas cookie analogy: buffers are like generously rolling out the dough

Here is the Christmas cookie analogy: You need to roll out the dough larger than the cookies you will cut to make sure your cookies will have nice edges. Think of the TIN as the dough and the square tile as your cookie cutter. You need to use a sufficiently large buffer when you roll out your TIN to ensure an edge without crumbles when you cut out the tile … (-: … otherwise you are pretty much guaranteed to get results that – upon closer inspection – have these kinds of artifacts:

Without buffers, processing artifacts also happen when classifying points with lasground or lasclassify, when calculating height above ground or height-normalizing LiDAR tiles with lasheight, when removing noise with lasnoise, when creating contours with las2iso or blast2iso, or during any other operation where an incomplete neighborhood of points can affect the results. Hence, we need to surround each tile with a temporary buffer of points. Currently there are two ways of working with buffers in LAStools:

  1. creating buffered tiles during the initial tiling step with the ‘-buffer 25’ option of lastile, maintaining buffered tiles throughout processing and finally using the ‘-use_tile_bb’ option of lasgrid, las2dem, blast2dem, or lascanopy to raster the tiles without the temporary buffer.
  2. creating buffered tiles from non-overlapping (= unbuffered) tiles with “on-the-fly” buffering using the ‘-buffered 25’ option of most LAStools such as lasground, lasheight, or las2dem. For some workflows it is useful to also add ‘-remain_buffered’ if buffers are needed again in the next step. Finally, we use the ‘-use_orig_bb’ option of lasgrid, las2dem, blast2dem, or lascanopy to raster the tiles without the temporary buffer.

The following three (tiny) examples use the venerable ‘fusa.laz’ sample that is distributed with LAStools to illustrate the two types of buffering as well as to show what happens when no buffers are used. In each example we first cut the small ‘fusa.laz’ sample into nine smaller tiles and then process these separately on 4 cores in parallel.

1. Initial buffer creation with lastile

This is what most of my tutorials teach. It assumes you are the one creating the tiling in the first place. If you do it with lastile and add a buffer right from the start things are pretty easy.

lastile -i ..\data\fusa.laz ^
        -set_classification 0 -set_user_data 0 ^
        -tile_size 100 -buffer 20 ^
        -odir 1_raw -o futi.laz

We cut the input into 100 meter by 100 meter tiles but add a 20 meter buffer around each tile. That means that each tile on disk will contain the points for an area of up to 140 meter by 140 meter. The GUI for LAStools shows the overlap and, if you scrutinize the bounding box values that the cursor points to, you notice the extra 20 meters in each direction.

Now we can forget about the buffers and run the standard workflow consisting of lasground, lasheight, and lasclassify to distinguish ground, vegetation, and building points in the LiDAR tiles.

lasground -i 1_raw\futi*.laz ^
          -city ^
          -odir 1_ground -olaz ^
          -cores 4
lasheight -i 1_ground\futi*.laz ^
          -drop_above 50 ^
          -odir 1_height -olaz ^
          -cores 4
lasclassify -i 1_height\futi*.laz ^
            -odir 1_classify -olaz ^
            -cores 4

At the end – when we generate raster products – we have to remember that the tiles were buffered by lastile and cut off the buffers when we raster the TIN with option ‘-use_tile_bb’ of las2dem.

las2dem -i 1_classify\futi*.laz ^
        -keep_class 2 6 ^
        -step 0.25 -use_tile_bb ^
        -odir 1_dbm -obil ^
        -cores 4

We created a digital terrain model with buildings (DBM) by keeping the points with classification 2 (ground) and 6 (building). After loading the resulting 9 tiles into QGIS and generating a virtual raster we see a nice seamless DBM without any edge artifacts.

The DBM of the 9 tiles computed with buffers created by lastile has no edge artifacts across tile boundaries.
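
For reference, the virtual raster itself can also be built on the command line with GDAL’s gdalbuildvrt before loading it into QGIS; a sketch, assuming the tiles ended up as futi*.bil in the 1_dbm folder (if your shell does not expand the wildcard, list the BIL files explicitly or use the ‘-input_file_list’ option):

gdalbuildvrt dbm.vrt 1_dbm\futi*.bil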

If you need to deliver the LiDAR files you should remove the buffers with lastile and option ‘-remove_buffer’.

lastile -i 1_classify\futi*.laz ^
        -remove_buffer ^
        -odir 1_final -olaz ^
        -cores 4

2. On-the-fly buffering

Now assume you are given LiDAR tiles without buffers. We generate them here with lastile.

lastile -i ..\data\fusa.laz ^
        -set_classification 0 -set_user_data 0 ^
        -tile_size 100 ^
        -odir 2_raw -o futi.laz

The only difference is that we do not request the 20 meter buffer, and the result is a typical tiling as you may receive it from a vendor or download it from a LiDAR portal. The GUI for LAStools shows that there is no overlap and, if you scrutinize the bounding box values that the cursor points to, you see that each tile is exactly 100 meters by 100 meters.

Now we have to think about buffers a lot. When using on-the-fly buffering we should first spatially index the tiles with lasindex for faster access to the points from neighbouring tiles.

lasindex -i 2_raw\futi*.laz -cores 4

Below are the modifications for on-the-fly buffering to the standard workflow of lasground, lasheight, and lasclassify. The first lasground run uses ‘-buffered 20’ to add buffers to each tile and ‘-remain_buffered’ to write those buffers to disk. This way they do not have to be created again by lasheight and lasclassify.

lasground -i 2_raw\futi*.laz ^
          -buffered 20 -remain_buffered ^
          -city ^
          -odir 2_ground -olaz ^
          -cores 4
lasheight -i 2_ground\futi*.laz ^
          -remain_buffered ^
          -drop_above 50 ^
          -odir 2_height -olaz ^
          -cores 4
lasclassify -i 2_height\futi*.laz ^
            -remain_buffered ^
            -odir 2_classify -olaz ^
            -cores 4

At the end we have to remember that the tiles still have on-the-fly buffers and cut them off with the option ‘-use_orig_bb’ of las2dem.

las2dem -i 2_classify\futi*.laz ^
        -keep_class 2 6 ^
        -step 0.25 -use_orig_bb ^
        -odir 2_dbm -obil ^
        -cores 4

Again, we created a digital terrain model with buildings (DBM) by keeping the points with classification 2 (ground) and 6 (building). The resulting hillshade computed from a virtual raster that combines the 9 BIL rasters into one looks perfectly smooth in QGIS.

The hillshaded DBM of 9 tiles computed with on-the-fly buffering has no edge artifacts across tile boundaries.

If you need to deliver the LiDAR files you should probably remove the buffers first … but that is not yet implemented. (-:

lastile -i 2_classify\futi*.laz ^
        -remove_buffer ^
        -odir 2_final -olaz ^
        -cores 4

3. Bad: No buffering

Here is what you are *not* supposed to do. Assume you get unbuffered tiles.

lastile -i ..\data\fusa.laz ^
        -set_classification 0 -set_user_data 0 ^
        -tile_size 100 ^
        -odir 3_raw -o futi.laz

Bad. You do not take care of buffering when processing the tiles.

lasground -i 3_raw\futi*.laz ^
          -city ^
          -odir 3_ground -olaz ^
          -cores 4
lasheight -i 3_ground\futi*.laz ^
          -drop_above 50 ^
          -odir 3_height -olaz ^
          -cores 4
lasclassify -i 3_height\futi*.laz ^
            -odir 3_classify -olaz ^
            -cores 4

Bad. You do not take care of buffering when generating the DBM.

las2dem -i 3_classify\futi*.laz ^
        -keep_class 2 6 ^
        -step 0.25 ^
        -odir 3_dbm -obil ^
        -cores 4

Bad. You get crappy results with edge artifacts clearly visible in the hillshade.

The hillshaded DBM of 9 tiles computed WITHOUT using buffers has severe edge artifacts across tile boundaries.

Bad. If you zoom in on a corner where 4 tiles meet you find missing pixels and incorrect elevation values. Bad. Bad. Bad. So please folks. Try this on your own data. Notice the horrible edge artifacts. Then always use buffers … (-:

PS: Usually no buffers are needed for running lasgrid, lasoverlap, or lascanopy as they perform simple binning operations that do not make use of neighbour information.
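
For example, a simple point count per cell can be rastered directly from the raw, unbuffered tiles; a sketch assuming the ‘-counter’ option of lasgrid, which bins the number of points falling into each cell (the output folder name is a placeholder):

lasgrid -i 3_raw\futi*.laz ^
        -step 1 -counter ^
        -odir 3_density -obil ^
        -cores 4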

Five Myths about LAS, LAZ, and “Optimized LAS”

The Open Letter by OSGeo was delivered to ESRI, OGC, and the ASPRS last week and the initial responses – including an email from ESRI’s founder and president Jack Dangermond – are very encouraging. Attendees of last week’s ASPRS conference were discussing how to respond to ESRI’s proprietary “Optimized LAS” that threatens the achievements of the open LiDAR formats LAS and LAZ that the community has been using for many years now. Below are five clarifications of five wrong statements overheard at these meetings:

1) Martin’s “LAZ” format is also proprietary.

Wrong. LAZ – just like LAS – is an open format. LAZ is defined by a well-commented open reference implementation in C/C++ and described in a PE&RS paper published in February 2013. LAS is defined via a specification document but has no reference implementation. Both can be freely used by anyone and (re-)implemented on any operating system and in any programming language. For example, there is now a JavaScript version of LAZ that someone else created.

2) We have no argument because ESRI provides a free API for “Optimized LAS”.

Wrong. “Optimized LAS” can only be used via the mechanism, the programming language, and the operating system of ESRI’s choosing. This is the very definition of “proprietary format”. Here is what Wikipedia says:

A proprietary format is a file format of a company, organization, or individual that contains data that is ordered and stored according to a particular encoding-scheme, designed by the company or organization to be secret, such that the decoding and interpretation of this stored data is only easily accomplished with particular software or hardware that the company itself has developed. The specification of the data encoding format is not released, or underlies non-disclosure agreements.

In contrast an open format is a file format that is published and free to be used by everybody.

3) Martin’s “LAZ” format is only used by LAStools.

Wrong. Large parts of the LiDAR industry embrace LAZ and have added read & write support for the LAZ format using the open source code or the DLL. Examples are QT Modeler, Globalmapper, FME, Fugroviewer, ERDAS IMAGINE, ENVI LiDAR, Bentley Pointools, TopoDOT, FUSION, CloudCompare, Gexel R3, Pointfuse, …and many more. Notable exceptions are ArcGIS and the product line offered by Lewis Graham’s GeoCue group. We maintain an (incomplete) list of software with native LAZ support here.

4) ESRI has engineered “Optimized LAS” for the cloud and “LAZ” cannot compete.

Wrong. The extra functionality in “Optimized LAS” is a simple mash-up of LAZ with spatial indexing LAX, an optional spatial sort, and a few extra statistics. This is why ESRI’s format is also known as the “LAZ clone”. We were able to feature-match these minor engineering changes in an afternoon which – a few days later – resulted in this April Fools’ Day prank. In fact, LAZ has been used “in the cloud” for well over 4 years on OpenTopography – the first and probably the premier Web accessible LiDAR cloud service of our industry. It is also used by many other LiDAR download servers. We maintain an (incomplete) list of portals offering compressed LAZ here.
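
To illustrate just how small those engineering changes are: a spatial sort plus spatial indexing can already be had with the existing open tools, for example (a sketch; check the lassort README for the exact sort order and options it applies):

lassort -i lidar.laz ^
        -o lidar_sorted.laz
lasindex -i lidar_sorted.laz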

5) ESRI’s “Optimized LAS” does not prevent people from using LAS.

ESRI is one of the largest GIS training organizations. If they teach hundreds of LiDAR novices to “optimize” their “unoptimized LAS” files while simultaneously lobbying large LiDAR providers into switching from LAS or LAZ to zLAS, they will effectively destroy the current success of our open formats. ESRI’s command of the GIS market can – little by little – turn their own proprietary format into the dominant way in which LiDAR point clouds are exchanged. Then we lose our open exchange formats. Hence, ESRI’s proprietary “Optimized LAS” format “threatens” what we have achieved with LAS (and LAZ): open LiDAR data exchange and incredible LiDAR software interoperability.

This is not an anti-ESRI campaign. We hope to work with ESRI to resolve this situation. Below an image and a quote from ESRI’s ArcNews Spring 2011 news letter about the importance of open formats, standards, and specifications …

“Esri continues to advocate the need for open access to geographic data and functionality through support for widely adopted and practical standards and specifications. Esri follows an open system strategy for accessing and using geographic data and functionality.” — ArcNews, Spring 2011

New LASliberator “frees” LiDAR from Closed Format

PRESS RELEASE (for immediate release)
April 20, 2015
rapidlasso GmbH, Gilching, Germany

The latest product by rapidlasso GmbH – creators of LAStools and LASzip – is an open source tool aiming to liberate LiDAR points locked up in proprietary “Optimized LAS” – a highly controversial, closed LiDAR format. The new LASliberator can be downloaded here. It comes both as a simple command line tool for scripting and with an easy-to-use graphical interface.

The GUI of the “LASliberator” has a simple, easy-to-use interface.

The LASliberator reads LiDAR points from closed “Optimized LAS” files that use the “.zlas” extension and converts them to open ASPRS LAS files that use the “.las” extension. Alternatively, the points can be stored to compressed LAZ files – using the open source LASzip compressor – that use the “.laz” extension. In addition, the tool creates tiny spatial indexing files that use the “.lax” extension. These can then be exploited for accelerated area-of-interest queries via open source LASindex when using LAStools or the latest version of the LASzip DLL.

Note that the LASliberator cannot be entirely open source as it depends on a particular proprietary library. The closed nature of the “Optimized LAS” format does not allow for a fully open source implementation. It is therefore not possible to port the LASliberator to other operating systems or into other programming languages.

The user can select a file to liberate by pressing “open” in the GUI.

The new LASliberator comes on the heels of an outcry in the community over the LiDAR format fragmentation “Optimized LAS” is creating. It provides an immediate solution to go from closed zLAS to open LAZ for people whose LiDAR got stuck in yet-another-proprietary-format.

About rapidlasso GmbH:
Technology powerhouse rapidlasso GmbH specializes in efficient LiDAR processing tools that are widely known for their high productivity. They combine robust algorithms with efficient I/O and clever memory management to achieve high throughput for data sets containing billions of points. The company’s flagship product – the LAStools software suite – has deep market penetration and is heavily used in industry, government agencies, research labs, and educational institutions. Visit http://rapidlasso.com for more information.

The LAS format, the ASPRS, and the “LAZ clone” by ESRI

We are concerned about ESRI’s next moves in forcing yet another proprietary format into wide-spread deployment. Forwarded emails, retold conversations, and personal experiences suggest that sneaky tactics are being used to disrupt the harmony in open LiDAR formats that we have enjoyed for many years.

Some time has passed since we broke the news about the proprietary “LAZ clone” by ESRI. We were expecting the ASPRS to eventually comment on the issue. ESRI is promoting their lock-in product by the name of the open LAS specification (for which the ASPRS holds the copyright), calling their closed format “Optimized LAS“. We have been asked (in various forums) about the position of the ASPRS on this issue. ESRI’s use of LAS (*) makes it seem as if their “LAZ clone” was somehow an ASPRS thing (as evidenced by Harold’s comment). Despite ESRI’s media-blah-blah about “open and interoperable” they are – once again – luring the geospatial community into falling for a new proprietary format. So far the ASPRS has not released a statement on ESRI’s closed version of the LAS format.

The LAS Working Group (LWG) is part of the Lidar Division of the ASPRS. It has been maintaining the evolving LAS format from its 1.0 version that was (apparently as early as 1998) created by the LiDAR industry’s pioneers and eventually donated to the ASPRS (more recent LAS history is linked here). The good and early decisions of the LWG have created an incredibly successful open data exchange standard for discrete LiDAR points that is nowadays supported by practically every software package. “Kudos” to the original members for this achievement.

We did not join the LWG until 2011 to help avoid broken compatibility in LAS 1.4. After weathering the following “laser storm of 2011” the working group has been rather quiet. Its most recent activity was in 2013 for tendering the development of an official ASPRS LAS Validation Suite (LVS) that eventually resulted in ‘lasvalidate‘  – an open source LAS validator.

So who is this LWG? And why are they not commenting on such an important controversy as this “LAZ clone” with the seductive name “Optimized LAS”? The latest document on the Web pages of the LAS Working Group (LWG) lists the following people as members:

List of LAS Working Group members (from a 2011 document).

This list from 2011 is hopelessly out of date, but it should give you an idea of the composition of the LWG. Most likely rapidlasso is still a member of the LWG, but it is hard to tell because there have not been any emails recently and because there are no regular meetings. In the past we had some really bad luck with bringing up issues directly with the LWG, so here we go:

Dear ASPRS and LWG,
we are the guardians of the open LiDAR data exchange
specification, the LAS format. What is our response
to the proprietary format called "Optimized LAS" that
is being aggressively promoted by ESRI?

Dear concerned ASPRS member,
how would you like your organization to respond now
that a large geospatial company uses its dominance
to push a closed format into the market, sabotaging
the accomplishments of an open data exchange standard
maintained by the ASPRS?

We are worried that ESRI – beyond lobbying agencies to convert their current holdings to the proprietary “LAZ clone” or to tender future deliveries in the closed zLAS format – may also be trying to form strategic alliances with vendors of popular LiDAR processing packages. Many of these vendors are also members of the LAS Working Group and would be in a conflict-of-interest if they were to “sell out” to ESRI’s lock-in ambitions.

You can imagine the red flag that went up a few days ago when we saw a technical comment on a LinkedIn post by Gene Roe that suggested intimate familiarity with the capabilities of the “LAZ clone” by Lewis Graham who has been leading the LAS effort since 1998 and who is the chair of the LAS Working Group. That Lewis’ comment has since been removed did little to calm our worries. As a side note: Gene’s posts being titled “LAS Data Format” further dilutes the difference between open LAS and closed zLAS.

Please inform us (or comment below) about any lobbying you hear about. Given the aggressive moves by ESRI – in the face of our repeated attempts to reach out – we do not think we can afford to err on the side of caution any longer … (-;

——————-

(*) It is fair to note that our products such as LAStools, LASlib, and LASzip also use the name “LAS”. This is for historic reasons. That is what we called the simple package for reading, writing, and processing LAS files that we created back in 2005 for our own research, before releasing them as open source in 2007. During our postdoc years at UC Berkeley we did not anticipate that these tools would become so popular or that we would start a company a few years later …