The path gets handed over to a Riegeli FdReader (source) and the array_record maintainers have pointed me upstream. I can't find GCS support mentioned in the docs, but given TFDS provides a try_gcs argument, it seems like GCS buckets should be supported somewhere along the line.
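To illustrate why an fd-based reader and a `gs://` URI don't mix, here is a minimal stdlib-only sketch (the bucket path is hypothetical): a reader built on POSIX file descriptors, as Riegeli's FdReader is, treats the URI as a literal local path, so the open fails before any GCS client could get involved.

```python
import os

def open_local_fd(path):
    # An fd-based reader ultimately needs a POSIX file descriptor.
    # A "gs://bucket/obj" URI is not a local path, so os.open cannot
    # resolve it and raises FileNotFoundError.
    return os.open(path, os.O_RDONLY)

try:
    open_local_fd("gs://my-bucket/data.array_record")  # hypothetical path
except FileNotFoundError as err:
    print("fd-based open fails:", err)
```

Supporting GCS would therefore require either a reader backend that speaks the GCS API, or a fuse-style mount that exposes the bucket as a local filesystem.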
What's the reasoning for keeping it closed source? As a heavy user of Google Cloud Dataflow for processing large datasets, I'm surprised to be stuck on TensorFlow Datasets code that stopped working when writing TFRecord files to Google Cloud Storage (GCS), and I was hoping that migrating to array_record would alleviate those pain points. I'm surprised to read that it's unclear whether TFDS will get GCS support with array_record.
Following up on google/array_record/issues/120
The following code fails when it shouldn't:
Colab link