Replies: 1 comment
-
To write it more explicitly: I would vote not to do this. I.e. for compound types (e.g. a ragged tensor), we can use other ways. E.g. see #466 for ragged tensors / packed arrays.
-
TensorFlow has multiple tensor-like objects/types (which now derive from `tf.CompositeTensor`), namely (at least):

- `tf.RaggedTensor`
- `tf.SparseTensor`
- `tf.IndexedSlices`

We currently mostly don't use them at all. In almost all cases, you also don't need to, or you can just temporarily create such an object and then always work on bare tensors, because in all cases these composite tensors consist of bare tensors anyway.
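To illustrate that last point, here is a minimal plain-Python sketch of the idea (no TensorFlow needed; the `Ragged` class and its fields are hypothetical stand-ins, but they mirror the `values`/`row_splits` layout that `tf.RaggedTensor` uses internally):

```python
# Hypothetical minimal ragged container. Both fields are plain ("bare")
# flat arrays; the ragged structure is purely an encoding on top of them.
class Ragged:
    def __init__(self, values, row_splits):
        self.values = values          # flat list of all elements
        self.row_splits = row_splits  # row i spans values[row_splits[i]:row_splits[i+1]]

    def rows(self):
        """Reconstruct the nested-list view from the two bare arrays."""
        return [
            self.values[self.row_splits[i]:self.row_splits[i + 1]]
            for i in range(len(self.row_splits) - 1)
        ]

# The nested lists [[1, 2, 3], [4], [5, 6]] packed into two flat arrays:
r = Ragged(values=[1, 2, 3, 4, 5, 6], row_splits=[0, 3, 4, 6])
print(r.rows())  # → [[1, 2, 3], [4], [5, 6]]
```

So any code that wants to stay on bare tensors can always pull out the underlying components and work on those directly.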
`Data.placeholder` is currently supposed to be a `tf.Tensor`. Should we maybe relax this to also allow `tf.CompositeTensor` there? For many (basic) TF operations, this should just work fine.
I'm not really sure where it breaks. Definitely for our own native ops. Maybe also for things like `tf.matmul`. It might make some TF code simpler. (But which code exactly?)
But then `Data` maybe doesn't fit well either. E.g. the `shape` of such a tensor might not always be well defined. Some operations (e.g. `flatten_with_seq_len`) don't really make sense. So maybe it is a bad idea. But I just wanted to start some thoughts/discussion on this.
Related: #466 (packed arrays, ragged tensors)

(`tf.Variable` is also somewhat special, but really mostly behaves exactly like `tf.Tensor`.)

(Again, not sure if the discussion forum is the right place for this. See #465.)