brainstorming querying alternatives #43
An interesting and useful problem to solve! The algorithm sounds like it should work fairly well.
Made a python program for preparing the data:

```python
import fiona
from shapely.geometry import shape
from collections import OrderedDict
import pandas as pd
import json

shapefile = "shapefiles/india_pc_2019.shp"
DEV = False
metaCollector = []
counter = 0

# iteration statement from https://gis.stackexchange.com/a/120574/44746
for pc in fiona.open(shapefile, 'r'):
    a = pc['properties'].copy()
    b = shape(pc['geometry']).bounds
    # from https://gis.stackexchange.com/a/90556/44746
    # Returns a tuple of 4 like: (91.14863890266327, 23.06298596293192, 91.73082235147041, 24.111512008872182)
    # importing the bounds into the metadata
    boundLabels = ['min_lon', 'min_lat', 'max_lon', 'max_lat']
    for x in range(4):
        a[boundLabels[x]] = b[x]
    metaCollector.append(a)
    # they're natively geojsons! Just dump'em!
    json.dump(pc, open("shapefiles/PCs/{}_{}.geojson".format(a['ST_CODE'], a['PC_CODE']), 'w'), indent=2)
    counter += 1
    print("{}: {}({}) / {}({})".format(counter, a['ST_NAME'], a['ST_CODE'], a['PC_NAME'], a['PC_CODE']))
    if DEV and (counter > 10):
        break

df = pd.DataFrame(metaCollector)
df.to_csv('pc_metadata.csv', index_label='sr')
```
There's an alternative approach: encoding the vectors as a bitmap and using that as a lookup. topojson/topojson#311 (comment)
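A rough sketch of that bitmap idea (not the linked topojson implementation): pre-render the polygons into a coarse grid of polygon ids, so a point query becomes a single array index. The polygons, grid resolution, and extent below are toy placeholders, not real constituency data:

```python
# Sketch: pre-render polygons into a grid ("bitmap") of polygon ids, so a
# point query is O(1). Polygon coordinates here are toy placeholders.

def point_in_ring(lon, lat, ring):
    """Ray-casting point-in-polygon test; ring is a list of (lon, lat)."""
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):
            # longitude where this edge crosses the query latitude
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# toy "constituencies": two adjacent rectangles
polygons = {
    1: [(0, 0), (5, 0), (5, 5), (0, 5)],
    2: [(5, 0), (10, 0), (10, 5), (5, 5)],
}

RES = 0.5  # cell size in degrees; coarser grid = smaller bitmap
MIN_LON, MIN_LAT, MAX_LON, MAX_LAT = 0, 0, 10, 5
cols = int((MAX_LON - MIN_LON) / RES)
rows = int((MAX_LAT - MIN_LAT) / RES)

# build the bitmap: each cell stores the id of the polygon at its center
grid = [[0] * cols for _ in range(rows)]
for r in range(rows):
    for c in range(cols):
        cx = MIN_LON + (c + 0.5) * RES
        cy = MIN_LAT + (r + 0.5) * RES
        for pid, ring in polygons.items():
            if point_in_ring(cx, cy, ring):
                grid[r][c] = pid
                break

def lookup(lon, lat):
    """Constant-time point query against the pre-rendered bitmap."""
    c = int((lon - MIN_LON) / RES)
    r = int((lat - MIN_LAT) / RES)
    return grid[r][c]

print(lookup(2.3, 1.7))  # 1
print(lookup(7.0, 4.0))  # 2
```

The trade-off is precision at polygon boundaries: points in a cell whose center lies in a neighbouring polygon get the wrong id, so real use would need a fine enough grid (or a boundary-cell fallback to exact point-in-polygon).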
I'm confident that the present mapbox querying solution will still work out fine; but just for some nerd fun and potential future use, I want to work out decentralized alternatives for the core problem statement of this project: given any lat-long, how do you figure out which polygon it falls in, out of a vast number of polygons with complex and heavy data, with a limit on the number of bytes that can be loaded for the job?
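One cheap way to stay under the byte budget is a bounding-box prefilter: ship only a small metadata table up-front, and fetch the heavy polygon file for the few candidates whose box contains the point. A minimal sketch, using made-up rows in place of a real metadata table (the `min_lon`/`min_lat`/`max_lon`/`max_lat` field names and values here are assumptions):

```python
# Sketch: filter candidate constituencies by bounding box, so only the few
# matching geojson files need fetching for an exact point-in-polygon test.
# These rows and numbers are placeholders, not real constituency bounds.
metadata = [
    {'ST_CODE': 'S01', 'PC_CODE': '1',
     'min_lon': 91.14, 'min_lat': 23.06, 'max_lon': 91.73, 'max_lat': 24.11},
    {'ST_CODE': 'S02', 'PC_CODE': '7',
     'min_lon': 75.00, 'min_lat': 12.00, 'max_lon': 76.20, 'max_lat': 13.10},
]

def candidate_pcs(lon, lat, rows):
    """Cheap prefilter: keep rows whose bounding box contains the point."""
    return [
        row for row in rows
        if row['min_lon'] <= lon <= row['max_lon']
        and row['min_lat'] <= lat <= row['max_lat']
    ]

hits = candidate_pcs(91.5, 23.5, metadata)
print([(h['ST_CODE'], h['PC_CODE']) for h in hits])  # [('S01', '1')]
```

Since bounding boxes of neighbouring constituencies overlap, the prefilter can return several candidates; each one's geometry is then loaded on demand for the exact containment check.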
Here's one strategy:
To prep the data for this kind of thing, here's what would be involved:
Depending on how we do this, the splitting could happen first or last.
Benefits of this strategy: