
Releases: heremaps/here-cli

1.9.0

06 Dec 11:46

v1.9.0 of the CLI provides support for HERE platform interactive map layers.

To access interactive map layer functions, download your HERE platform credentials (via the Administrative tab under the Access Manager) and authenticate the CLI using

here configure workspace {filepath to credentials}
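For example, assuming the credentials file you downloaded is credentials.properties in your Downloads folder (the path on your machine may differ):

here configure workspace ~/Downloads/credentials.properties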

Once authenticated, you will see a new here interactivemap | iml command with the following subcommands:

  • create [options] [catalogHrn] create a new interactive map layer
  • config [options] [catalogHrn] [layerId] configure/update an interactive map layer in a catalog
  • upload [options] [catalogHrn] [layerId] upload one or more GeoJSON, CSV, GPX, XLS, or Shapefile files to the given layerId
  • list|ls [options] information about available interactive map layers
  • show [options] <catalogHrn> <layerId> shows the content of the given id
  • delete [options] <catalogHrn> <layerId> delete the interactive map layer with the given id
  • clear [options] <catalogHrn> <layerId> clear data from interactive map layer
  • token get workspace token
  • help [command] display help for command
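Once configured, a minimal session might look like the following (the catalog HRN and layer ID are placeholders; substitute your own):

here iml list
here iml show hrn:here:data::myorg:mycatalog my-layer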

1.8.1

09 Jun 19:14

1.8.1 is a patch release covering a few bug fixes and optimizations:

  • We fixed an issue where the file name tag was added to the URL multiple times in the upload command, causing large file uploads to fail with 413 errors.
  • We improved the error-handling messages in upload.

1.8.0

11 May 18:56

Data Hub CLI Version 1.8

⬡⬡⬡ h3-assisted spatial search ⬢⬢⬢

v1.8 of the Data Hub CLI provides a scalable and iterable option for spatial search. This helps complete spatial searches on very large and/or extremely dense data sets where the results within a requested polygon are above the 100,000 feature or 20MB API gateway limit. (Think pulling in POIs for Maricopa County or Brooklyn.)

This new option divides the specified --feature geometry (usually an admin boundary, like a city or a county) into h3 hexbins of the specified resolution, and processes each hexbin as a single spatial search.

Note that the appropriate resolution to choose largely depends on the density of the dataset. While an h3 resolution of 7 often yields a reasonable number of hexbins for a typical city or county, a particularly large and/or dense cluster of features may still max out the API gateway for some hexbins, returning only a partial set of results. We attempt to detect such incomplete hexbins and recommend that the user select a higher h3 resolution.

The default behavior of the h3 polyfill function is to return only hexbins whose centroids fall within the polygon -- a reasonable assumption when you are tiling multiple adjacent admin areas, but not when you want every feature within the admin polygon. We therefore include any h3 hexbins that intersect the admin polygon even if their centroids fall outside it, and then clip these border hexbins to the admin polygon so that the h3-assisted spatial search returns only features in the polygon of interest.


The output can be saved using --raw, or written to a new or existing space as each hexbin completes using --targetSpace. (If you don't specify a space ID, we'll make one for you.)

here xyz show <spaceID> --spatial --feature space,featureID --h3 7 --targetSpace [spaceID, or not]

You can also save the h3 hexbins used in the spatial searches with --saveHexbins.
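Putting it together, a full h3-assisted search against a county boundary stored as a feature in another space might look like this (all IDs below are placeholders):

here xyz show mySpaceId --spatial --feature boundariesSpaceId,countyFeatureId --h3 7 --targetSpace --saveHexbins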

🛠🔧🗜 Fixes 🛠🔧🗜

  • we remove UUIDs when exporting via --raw
  • we no longer enable UUID by default when creating a new space

1.7.1

18 Mar 11:06

1.7.1 is a patch release covering a few bug fixes and optimizations:

  • we made show -w work properly with targeted sharing
  • show -v now points to a new and exciting version of Space Invader that supports viz mode, bringing sampling, simplification, and clustering options to help you view your giant data sets more effectively
  • we fixed a bug in the transform command

1.7.0

18 Dec 15:47

It's the end of the year, but never fear, v1.7 of the CLI is here.

✨🌟💫 NEW 💫🌟✨

🎯🎯🎯 Targeted Sharing 🎯🎯🎯

You can now request access to another user's space -- they send you the space ID, you use the interactive here xyz sharing command to request access, and they use it to grant you access (or not). You can also use it to list the spaces you've requested access to and the spaces you're sharing with others, as well as to cancel a request if necessary.

This method is more precise and, well, targeted than the existing shared parameter, which, when set to true, allows anyone with a HERE account to see that space.

Targeted sharing can also be controlled in the Data Hub Console.

🏚🏘🏡 Delaunay neighbours 🏡🏘🏚

You can now generate a list of a point's neighbours in a space, in addition to generating the Delaunay polygons themselves.

here xyz gis spaceID --delaunay --neighbours

Note that this will add a new property, xyz_delaunay_neighbours, to the points in the source space containing an array of the neighbouring IDs.

⬣⬡⬢ Batch h3 hexbins ⬣⬡⬢

The client-side batch hexbin command can now create h3 hexbins and their centroids. (These are written to a second space, as opposed to being generated on-the-fly like Data Hub's server-side hexbins). This is useful for saving a high-resolution h3 hexbin grid of a space with a very large number of features to be viewed at low zoom levels (effectively simulating a heatmap).

here xyz hexbin spaceID --h3 --resolution | -z

You can specify one or more h3 resolution levels using either comma-separated values, or a range delineated by a dash.
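For example (the space ID is a placeholder):

here xyz hexbin mySpaceId --h3 --resolution 7,8
here xyz hexbin mySpaceId --h3 --resolution 6-9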

For your convenience, we also let you choose a slippy map zoom level for h3 hexbin creation -- this uses the same table as Data Hub's server-side hexbins, giving you reasonably-sized h3 hexbins for a given map zoom level if you're not used to thinking in terms of h3 resolutions levels yet.

We also added the ability to generate hexbins from spaces with lines as well as points, which helps aggregate and visualize datasets with lots of short, choppy road segments.

Note that h3 hexbin density increases by a factor of 7 with each resolution level, so you can quickly end up with… a lot of hexbins. For now you may need to give node more memory for dense, national datasets. For context, we have found that an h3 resolution of 8 or 9 works well for visualizing the density of the German road network.

👩‍💻 💻 👨‍💻 Point at self-hosted servers 🖥 💻 🖥

You can now use the CLI with a self-hosted version of Data Hub running locally or on a server. (You knew you could run Data Hub locally using Docker, right?) Just specify the URL, localhost or otherwise, where it's running:

here configure server [url]
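For example, assuming a local Docker deployment of Data Hub listening on its default port (adjust the URL to your setup):

here configure server http://localhost:8080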

🛠🔨🗜 Fixed ⚙️🔩⛏

Better token management for spatial search results

In v1.6, we tightened the scope of the tokens we generate. This made it challenging to view the results of spatial searches of one space that pulled a feature from a second space. We now generate a token that grants read access to both spaces (similar to how we do it for the upstream spaces used in Virtual Spaces).

Ubuntu node environment fixes

Some Ubuntu users had reported node environment errors following a CLI upgrade. This should be behaving better now.

1.6.1

09 Oct 09:31

This release fixes an issue with displaying help for the 'here xyz' command and its sub-commands.

1.6.0

09 Oct 06:52

Version 1.6 of the Data Hub CLI makes it even easier to upload your data, with support for zipped shapefiles and xls/xlsx files, as well as batch uploads! It also has enhancements to the groupby option, and here xyz join has a new property search feature that makes virtual spaces even easier to use. Also, tokens we generate for show are now temporary by default.

💾 💿 📼 Upload enhancements 🔢 🔤 🔠

More filetypes, batch uploads, and better error detection and correction!

  • you can now upload zipped shapefiles and Excel spreadsheets (.xls,.xlsx)
  • have 100 GeoJSON files sitting in a folder? batch uploads are now possible! upload -f /directory/path/ --batch filetype
  • you can also upload multiple files or URLs at once using comma separated paths + filenames -- here xyz upload -f file1,file2,/path1/file3,/path2/file4 or here xyz upload -f "url1","url2","url3"
  • if a streaming upload is failing because a chunk is too large for the API gateway, we automatically reduce the size of the chunk until it successfully uploads, or we determine that the feature on its own is too large, in which case we notify you and move along to the next chunk
  • we notify you if coordinates in a feature are > 180 or < -180, in which case the feature cannot be uploaded

changes in how we assign the Feature ID:

  • We've changed the default behavior of the CLI during upload to respect a GeoJSON feature ID if it is present. (Previously we made it a property hash by default, but we've seen an evolution in user behavior where more of you actually have unique and meaningful IDs.) As before, if there is no feature ID, the CLI will generate a hash of the feature properties and use that as the feature ID. This means that if there are duplicate features, only one will be uploaded.
  • -o now overrides an existing feature ID and generates a hash of the properties to use as the ID, or you can choose one or more existing properties as the feature ID with -i -- these options are useful when you are uploading multiple datasets with unique features but overlapping feature IDs, especially from public data portals (see the examples after this list).
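For example (space ID, file name, and property name are placeholders):

here xyz upload mySpaceId -f parcels.geojson -i parcel_number
here xyz upload mySpaceId -f parcels.geojson -o

The first command uses each feature's parcel_number property as its feature ID; the second ignores any existing feature IDs and hashes the properties instead.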

👨👧👶 --groupby enhancements 👫 👨‍👦‍👦 👨‍👩‍👦‍👦

  • --flatten: instead of a nested object, create a string delimited by : to reflect the logical hierarchy
  • --promote: hoists a list of properties that don't need to be repeated in each grouped feature (e.g. the name of a region in electoral riding results, grouped by party)
  • you can learn more about the power of groupby in the CLI documentation (see the example after this list)
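A hypothetical election upload combining these options (space ID, file name, and property names are placeholders; exact argument forms may vary, so check here xyz upload --help):

here xyz upload mySpaceId -f results.csv -i riding_id --groupby party --promote riding_name --flatten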

👫 👭 👬 join enhancements 👨‍👩‍👧‍👦 👨‍👩‍👧 👨‍👩‍👧‍👧

  • use Data Hub's property search to create a Virtual Space based on the feature ID in one space and a property in another space. (If a property is not found, we add a no_match tag)
  • use --filter to restrict the property search by yet another property (e.g. only look for counties in a particular state in a national dataset where county names are not unique)

📗 ✏️ 🚫 set a space to read-only 📒 ✏️ 🚫

  • here xyz config --readonly true
  • useful in preventing unintended modifications to an upstream virtual space

🚀 🛰 👽 space specific tokens 🚀 🛰 👽

  • show -w and show -v now generate a token just for that space. By default, a temporary token lasting 48 hours is generated. You can generate a permanent token for that space using --permanent or -x
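For example, to open the viewer with a permanent token instead of the default 48-hour one (the space ID is a placeholder):

here xyz show mySpaceId -w --permanent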

1.5.1

04 Aug 18:47

This minor version brings support for HERE Studio projects:

  • here studio list - lists your HERE Studio projects
  • here studio show <project-id> - launches a published project in the browser
  • here studio delete <project-id> - to delete a project

1.5.0

22 Jun 18:38

Just in time for summer, v1.5.0 of the Data Hub CLI is here with new features and bug fixes that will help your geospatial data kick back and relax in a comfortable space.

🔤🔤🔤 CSV Group by 🔢🔢🔢

--groupby columnName consolidates multiple rows of a CSV that share a unique ID (designated with -i, and usually representing an admin geography) into a single feature; the values in each row within the selected column are grouped as nested properties within an object named after that column in the consolidated feature's properties.

--groupby can be used with upload or the join command to extract the hierarchy from a CSV and upload it to a space without geometries.

  • with join, the data is uploaded and a virtual space combining it with the geometry space is created in one step.
  • upload --groupby is useful for updating the "data space" in a virtual space that has already been created. It can also be used to upload the grouped data before a virtual space has been created, pairing it with a space containing geometries with matching geoIDs using here xyz vs -a

This feature is best illustrated by election data, census data, and time series data. One example is COVID-19 data from the COVID Tracking Project API.

here xyz join xkRyxQl9 -f https://covidtracking.com/api/v1/states/daily.csv --noCoords -i state --groupby date

This merges daily state testing data (running from March 2020) into a virtual space with xkRyxQl9, a shared space containing US state geometries.

Date tags and properties

We've added --dateprops to the --date option, meaning we now let you save your time slices as new properties in the feature as well as tags -- these are prefixed by xyz_propertyname_.
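For example (space ID, file name, and property name are placeholders, and the exact form of --dateprops may vary; check here xyz upload --help):

here xyz upload mySpaceId -f daily.csv --date report_date --dateprops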

Also, while converting the date string, we no longer add a time zone offset to the ISO 8601 timestamp as this caused problems when the data wasn't collected in your timezone.

📜💾⭐️ CLI history ⭐️💾📜

You're working with that complex dataset, and you just created the perfect upload command. It is so good, the data in your space is excellent, and your map is happy. Then you go back to update that space six months later… and you have completely forgotten the options. Was it -i? What chunk size did I use? How did I make that tag?

Well, never fear, upload --history is here. It automatically records the last three upload commands in the space definition, and it even lets you save one as a favorite.

If you've done at least one upload to a space, you can recall it using an interactive menu:

here xyz upload spaceid --history

You can pick an upload command to save with --history save, and then re-run that particular command using --history fav. You can also --clear your history.

🚚🚚🚚 Batch upload 🚢🚢🚢

If you have a folder full of geospatial files, you can upload them all to a space in one command with --batch. Just specify the directory with -f and the filetype after --batch (geojson, geojsonl, csv, shp, json, or gpx) and watch your files fly into Data Hub.

When uploading shapefiles, --batch will inspect the subdirectories of the specified directory and look for .shp files along with all the other files you get when uncompressing a zipped shapefile. (This is handy when, say, you've downloaded 50 state shapefiles from the US Census website. We've all been there.)
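For example, to batch-upload a folder of unzipped state shapefiles (the space ID and path are placeholders):

here xyz upload mySpaceId -f /data/states/ --batch shp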

🔍🔍🔍 Data Hub Console 🔎🔎🔎

Both config and token have a new --console option that opens up the new Data Hub console in a web browser.

🐛🐞🦟 Bug fixes and other enhancements 🛠🔧🔩

  • inline help was improved, along with online documentation
  • here xyz list --filter does a better job of handling null title and description fields
  • GeoJSON features that are not in a FeatureCollection can once again be uploaded
  • config -r now outputs properly formatted json
  • confirmation prompt added to config --shared true
  • join -k was changed to join -i to be more consistent with upload -i
  • we fixed a bug in activity log creation -- we were sending state as a string, but the API expected a number
  • we fixed some issues while streaming voronoi and delaunay polygons
  • show -r no longer wraps geojsonl output in a FeatureCollection
  • we set skipCache=true for /statistics and /space GET calls so you get the latest and greatest metadata

1.4.0

30 Apr 13:42

1.4.0 is here, and so are you! In addition to various bug fixes and security updates, we also bring you normalized date strings, better list options, .json import, and easier export!

⏰⏲️📅 Date strings, time stamps, and date tags 🕰️⌛⏱️

If you have a field or property containing a date-like string or a timestamp, you can normalize it using --date propertyname. We will try to convert it to an ISO 8601 friendly value and save it in xyz_iso8601_propertyname, along with a unix seconds timestamp in xyz_timestamp_propertyname for your time-slicing needs. (UTC for now.)
If you add the --datetag option, we will calculate human-friendly tag buckets such as year@2020, month@april, weekday@thursday, week@21. You can also limit which tags you want using arguments like --datetag month. (Again, these are UTC-based for now; timezones are hard.)
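For example, to normalize a date property and generate only month tags (space ID, file name, and property name are placeholders):

here xyz upload mySpaceId -f events.csv --date observed --datetag month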

😎🕶️🎧 Find that space using filters 🎧🕶️😎

You have a lot of spaces. They are all important and you love them equally. But you need to find that special one. Why not use here xyz list --filter theoneyouarelookingfor to narrow down your list based on what you added to the title and description?

{{{JSON import}}}

Just like a square is a rectangle, but a rectangle is not a square, GeoJSON is JSON, but JSON is not GeoJSON. But if your JSON file contains (non-nested) lat/lon coordinates, you can now import those non-geojson json files just like you can with a CSV.
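For example, assuming records.json holds records with non-nested latitude/longitude fields (the space ID and file name are placeholders):

here xyz upload mySpaceId -f records.json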

🚀📡💫 Get more out of your space 💫📡🚀

  • show --raw used to be limited to 5000 features, but if you need more, try --all (this can get long, and you can redirect the output to a file using >)
  • you can also export to GeoJSONL via show --raw using, you guessed it, show --raw --geojsonl
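For example (the space ID and output file names are placeholders):

here xyz show mySpaceId --raw --all > myspace.geojson
here xyz show mySpaceId --raw --geojsonl > myspace.geojsonl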

Take care and we'd love to hear from you in issues (because we have issues).