workflow recommendations for simple spatial mean computations #687
Hi,

I have GeoTIFF data where each tile is 109407 x 69515 pixels. The data is simple: a single band. I want to compute means over polygons (statistical areas from another source). Some polygons overlap tile boundaries, but I'll deal with that later.

I was hoping I could use st_apply to load and process a single row or column of data at a time: counting the number of non-missing pixels (those inside each polygon) and their sum, so that the overall mean can be computed by combining the per-line results.

However, I don't seem to be able to do any polygon-related operations without triggering a load operation that exhausts memory. Subsampling the data will almost certainly be good enough for this project, but I'm obsessing about getting an efficient, exact solution.
The following appears to work in a reasonable time and memory footprint:
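Something along these lines, sketched with a placeholder file name tile.tif; single rows are read through read_stars's RasterIO window, so only one row of pixels is in memory at a time:

```r
library(stars)

f <- "tile.tif"                    # placeholder file name
r <- read_stars(f, proxy = TRUE)   # lazy handle: metadata only, no pixels read
nx <- dim(r)[["x"]]
ny <- dim(r)[["y"]]

n <- 0   # running count of non-missing pixels
s <- 0   # running sum of their values
for (row in seq_len(ny)) {
  # read exactly one row of pixels via a GDAL RasterIO window
  line <- read_stars(f, RasterIO = list(nXOff = 1, nYOff = row,
                                        nXSize = nx, nYSize = 1))[[1]]
  n <- n + sum(!is.na(line))
  s <- s + sum(line, na.rm = TRUE)
}
overall_mean <- s / n
```

Reading a block of rows per iteration instead of a single row would amortise the per-call GDAL overhead while keeping the memory footprint bounded.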
but as soon as I include a polygon-based indexing operation, RAM usage goes crazy:
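One pattern that reproduces this, again with placeholder names (stat_areas, areas.gpkg): indexing the proxy by a polygon is recorded lazily, but materialising the crop reads the polygon's bounding box at full resolution.

```r
library(stars)
library(sf)

r <- read_stars("tile.tif", proxy = TRUE)   # placeholder name; lazy proxy
stat_areas <- st_read("areas.gpkg")         # placeholder layer of statistical polygons

cropped <- r[stat_areas[1, ]]    # cropping the proxy to one polygon is still lazy
v <- st_as_stars(cropped)[[1]]   # materialising reads the polygon's bounding box
                                 # at full resolution, which is what exhausts RAM
mean(v, na.rm = TRUE)
```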
Any suggestions?
Comments

Have you tried aggregate?

Maybe it will be even better to use the […]. Edit: I checked the source of the […].

No luck with aggregate: similar behaviour to st_apply, slowly consuming RAM/virtual memory until crashing. I'll look into terra and exact_extract.

terra::extract had the same RAM problem as the stars approach. raster::extract didn't appear to have a RAM problem, but took a really long time; I gave up after over an hour.

If you are interested in the highest efficiency, I think the best option will be to use exactextractr::exact_extract.
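A sketch of that route, with placeholder file and layer names (assumes a reasonably recent exactextractr, which accepts a terra SpatRaster and reads only the raster windows each polygon touches):

```r
library(terra)
library(exactextractr)

r    <- rast("tile.tif")            # placeholder; lazy handle, no pixels read yet
poly <- sf::st_read("areas.gpkg")   # placeholder polygon layer

# mean per polygon, with boundary cells weighted by their exact coverage fraction
poly$mean_value <- exact_extract(r, poly, "mean")
```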