Add subcommand for GCP #2
I did some brief exploration without arriving at the goal of extracting the credentials from a Jupyter server started in a 2i2c k8s-based deployment, to use them from my laptop. First, here is a basic Python snippet to trial access:

```python
from google.cloud import storage

# List the objects in a bucket and print the name of the first one,
# as a quick check that the credentials grant read access.
blobs = storage.Client().list_blobs("<some globally unique google storage bucket name>")
print(next(blobs).name)
```
Note to myself: if this feature is implemented, Arpita would be happy to help trial it out, and I can connect back to Arpita via https://2i2c.freshdesk.com/a/tickets/322.
Using an access token directly is possible in three ways, as outlined here and summarized by me below:

- declaring the `CLOUDSDK_AUTH_ACCESS_TOKEN` environment variable, see https://cloud.google.com/sdk/docs/authorizing
- passing the `--access-token-file` flag, see https://cloud.google.com/sdk/gcloud/reference#--access-token-file
- setting the `access_token_file` configuration property, see https://cloud.google.com/sdk/gcloud/reference/config/set and search for access_token_file

I think what we should do is to trial and document the procedure of using an access token directly via […]
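For reference, a minimal sketch of what those three approaches look like from a terminal; the token file name and bucket below are placeholders, not taken from this thread:

```bash
# 1. Environment variable holding the token itself, picked up by gcloud
export CLOUDSDK_AUTH_ACCESS_TOKEN="$(cat token.txt)"

# 2. Global flag pointing at a file that contains the token, per invocation
gcloud storage ls --access-token-file=token.txt gs://<bucket>

# 3. Persistent configuration property pointing at the same token file
gcloud config set auth/access_token_file token.txt
```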
I tried some things using Google Cloud SDK 413.0.0 and the Python client:

1. Options work with […]
Docs on how to generate access tokens: access tokens are valid for one hour by default, as described in the example of requesting an access token using the REST API directly, see https://cloud.google.com/iam/docs/create-short-lived-credentials-direct#rest_2. The lifetime can be specified only when using the REST API directly (not using […]). It seems that […]
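For orientation, a minimal sketch of generating such a token with the SDK, assuming you are already authenticated on the hub side; the output file name is just an illustration:

```bash
# Print a short-lived access token (valid for one hour by default)
# for the currently active credentials, and stash it in a file.
gcloud auth print-access-token > token.txt
```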
Thanks for digging and documenting @consideRatio! I can confirm it all seems to be working on the pangeo-cloud GCP Hub:

[…]
The last command is really handy (-r for recursive copy and -n for “no clobber” if the path already exists), and it was quite fast! cc @jbusecke
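For context, a command of the kind being described here (a recursive, no-clobber copy) might look like the sketch below; the bucket and paths are placeholders, not taken from the thread:

```bash
# -r: copy recursively
# -n: "no clobber", skip files that already exist at the destination
gsutil cp -n -r gs://<bucket>/<path> ./local-copy/
```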
Currently, things only work for AWS, as noted in the README.