
feature request: demystify point coordinates #253

Open
metal4lyf opened this issue Oct 18, 2024 · 6 comments

@metal4lyf

I am working with geo LiDAR sets in LAS format, and my simple use case is that I want to easily georeference a selected point.

It took me forever to realize that Displaz is converting the source x,y coordinates to UTM meters.

This conversion makes sense for calculating vectors in meters. However, to figure out where the point is on the earth's surface, I have to know the UTM zone and then perform a conversion to a common coordinate system. It would be great if there were an option to show the point's lat/lon in a configurable notation (decimal degrees, UTM, etc.), where applicable.

@nigels-com
Collaborator

Geo-referencing is a fairly deep topic. By the sound of it your data is in a known UTM zone (or similar) and you'd like to click on points and log them as WGS84 lat/lon? Is the EPSG code or WKT available?

I was looking just now at an RTK geo-referenced point cloud in displaz. The z offset handling seems kinda wonky (IMHO), but that's easily fixed/changed. Potentially there is a problem if the offset is not chosen carefully by the LAZ writer, but I think an actual fix for that would be somewhat complicated.

Have you tried using QGIS for georeferenced point clouds? It's pretty nice with the recent addition of COPC indexing.

@metal4lyf
Author

Yes, in this case the source data is NAD83 / Conus Albers (EPSG:5070). This is parseable from the LAS header/metadata. My assumption was that Displaz was already reading the source CRS and transforming to UTM meters, since the point X,Y coordinates logged from this particular file (https://rockyweb.usgs.gov/vdelivery/Datasets/Staged/Elevation/LPC/Projects/CO_SanLuisJuanMiguel_2020_D20/CO_SanLuisJuanMiguel_4_2020/LAZ/USGS_LPC_CO_SanLuisJuanMiguel_2020_D20_13S_BC_5409.laz) are in UTM meters.

However, I've noticed the point X,Y coordinates logged from other files in the same LiDAR collection project (https://rockyweb.usgs.gov/vdelivery/Datasets/Staged/Elevation/LPC/Projects/CO_Southwest_NRCS_2018_D18/CO_Southwest_NRCS_B2_2018/LAZ/USGS_LPC_CO_Southwest_NRCS_2018_D18_w1014n1682.laz) are not in a CRS I recognize offhand.

In any case the source CRS should be found in the LAS header/metadata. I understand it may be defined in various parts of the LAS header/metadata, so implementation would need to consider at least a few scenarios.

QGIS seems to have comprehensive tools for this type of exploration, but its enormous installation, horrendous UI, and inability to render the least bit of 3D without consuming every resource on my system are what drove me to look for alternatives. Displaz is hands-down the better tool for my use case (determining the clean prominence of mountain peaks by identifying the summit/saddle points).

@nigels-com
Collaborator

As far as I know displaz completely ignores the CRS information, so what it's displaying should match the raw numbers in whatever projection the data is in. In the case of EPSG:5070 that's meters with origin at latitude 23, longitude -96?

The PROJ package includes a cs2cs CLI tool for transforming between coordinate systems. It just needs the two relevant EPSG codes.

Including PROJ, the EPSG database and the grid-shift files would be turning displaz into something more like QGIS. It's doable in principle, but it would not be cheap in terms of time or money.

@metal4lyf
Author

Yes, maybe there is a problem with the packaging of the first dataset then.

Understandable, and if the required tools are not small, then it wouldn't make sense to grow this application to the extent needed to support the feature.

What might be incredibly useful instead is a summary of the LAS header fields/metadata that were read (in the log) after a file is loaded. Just knowing the CRS without having to inspect the LAS with additional utilities would significantly speed up the process of converting the raw values.

@nigels-com
Collaborator

So for that first LAZ file I pick a point at 254829.138 4209919.072 4314.314

lasinfo has this to say about the projection:

extended variable length header record 1 of 2:
  reserved             0
  user ID              'LASF_Projection'
  record ID            2112
  length after header  1022
  description          'OGC COORDINATE SYSTEM WKT'
    OGC COORDINATE SYSTEM WKT:
    COMPD_CS["NAD83(2011) / UTM zone 13N + NAVD88 height - US Geoid Model of 2018",PROJCS["NAD83(2011) / UTM zone 13N",GEOGCS["NAD83(2011)",DATUM["NAD83 (National Spatial Reference System 2011)",SPHEROID["GRS 1980",6378137,298.257222101,AUTHORITY["EPSG","7019"]],AUTHORITY["EPSG","1116"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["Degree",0.0174532925199433,AUTHORITY["EPSG","9102"]],AXIS["Geodetic longitude",EAST],AXIS["Geodetic latitude",NORTH],AUTHORITY["EPSG","6318"]],PROJECTION["Transverse_Mercator"],PARAMETER["central_meridian",-105],PARAMETER["latitude_of_origin",0],PARAMETER["scale_factor",0.9996],PARAMETER["false_easting",500000],PARAMETER["false_northing",0],UNIT["Meter",1,AUTHORITY["EPSG","9001"]],AXIS["Easting",EAST],AXIS["Northing",NORTH],AUTHORITY["EPSG","6342"]],VERT_CS["NAVD88 height - US Geoid Model of 2018",VERT_DATUM["North American Vertical Datum 1988",2005,AUTHORITY["EPSG","5103"]],UNIT["Meter",1,AUTHORITY["EPSG","9001"]],AXIS["Gravity-related height",UP],AUTHORITY["EPSG","5703"]]]

It's EPSG:6342 rather than EPSG:5070, perhaps?
(I'm not so familiar with North American projections.)
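Taking that WKT at face value (NAD83(2011) / UTM zone 13N, EPSG:6342), the picked point can even be converted to lat/lon without PROJ, using the standard inverse transverse Mercator series from Snyder's "Map Projections: A Working Manual". A self-contained sketch on the GRS80 ellipsoid, ignoring the centimeter-level NAD83(2011)/WGS84 datum difference:

```python
import math

def utm_to_latlon(easting, northing, zone=13, northern=True):
    """Inverse transverse Mercator (Snyder 1987) on the GRS80 ellipsoid."""
    a = 6378137.0                        # semi-major axis (m)
    f = 1 / 298.257222101                # flattening
    k0 = 0.9996                          # UTM scale factor
    e2 = 2 * f - f * f                   # first eccentricity squared
    ep2 = e2 / (1 - e2)                  # second eccentricity squared
    lon0 = math.radians(zone * 6 - 183)  # central meridian (zone 13 -> -105)

    x = easting - 500000.0               # remove false easting
    y = northing if northern else northing - 10000000.0

    # Footpoint latitude from the meridian arc length.
    M = y / k0
    mu = M / (a * (1 - e2 / 4 - 3 * e2**2 / 64 - 5 * e2**3 / 256))
    e1 = (1 - math.sqrt(1 - e2)) / (1 + math.sqrt(1 - e2))
    phi1 = (mu
            + (3 * e1 / 2 - 27 * e1**3 / 32) * math.sin(2 * mu)
            + (21 * e1**2 / 16 - 55 * e1**4 / 32) * math.sin(4 * mu)
            + (151 * e1**3 / 96) * math.sin(6 * mu))

    C1 = ep2 * math.cos(phi1) ** 2
    T1 = math.tan(phi1) ** 2
    N1 = a / math.sqrt(1 - e2 * math.sin(phi1) ** 2)
    R1 = a * (1 - e2) / (1 - e2 * math.sin(phi1) ** 2) ** 1.5
    D = x / (N1 * k0)

    phi = phi1 - (N1 * math.tan(phi1) / R1) * (
        D**2 / 2
        - (5 + 3 * T1 + 10 * C1 - 4 * C1**2 - 9 * ep2) * D**4 / 24
        + (61 + 90 * T1 + 298 * C1 + 45 * T1**2 - 252 * ep2
           - 3 * C1**2) * D**6 / 720)
    lam = lon0 + (D
                  - (1 + 2 * T1 + C1) * D**3 / 6
                  + (5 - 2 * C1 + 28 * T1 - 3 * C1**2 + 8 * ep2
                     + 24 * T1**2) * D**5 / 120) / math.cos(phi1)
    return math.degrees(phi), math.degrees(lam)

lat, lon = utm_to_latlon(254829.138, 4209919.072)
print(f"{lat:.5f}, {lon:.5f}")  # roughly 38.004, -107.792
```

That lands in the San Juan Mountains of southwest Colorado, which is consistent with the dataset name.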

@metal4lyf
Author

With laspy reading the metadata I was getting 5070. Likely I was mistakenly reading one of the other project datasets while viewing the one that was already in UTM 13N. That makes a lot of sense, as the second file is definitely not in UTM.
