This is just an abstract idea I had because I'm currently writing a serializer using this crate:
The pattern I hit right now is that, to be able to calculate the final encoded length for write_der_header, I have to wrap all tagged fields and resolve things like their lengths, recursively.
But I have to throw away that state directly afterwards, only to reconstruct it again in the directly following call to write_der_content.
It's not a big overhead by any means, but I was wondering if I could remove it regardless.
Is that something you'd be interested in as well, and/or would want to work on together?
An idea I had is to make ToDer have an associated type that stores what is in essence a weak normalized form. The basic idea is that the WNF can be encoded into bytes directly without conditionals and without having to allocate/wrap any fields.
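For concreteness, here is roughly what I have in mind, as a toy sketch (the trait and method names like `ToDerWnf` and `normalize` are made up for illustration, not taken from this crate):

```rust
// Hypothetical sketch only: the associated type holds a pre-resolved
// ("weakly normalized") form, so lengths are computed once and shared by
// both the length query and the actual encoding step.
trait ToDerWnf {
    /// Pre-resolved form: encodable without further conditionals.
    type Wnf;
    fn normalize(&self) -> Self::Wnf;
    fn der_len(wnf: &Self::Wnf) -> usize;
    fn write_der(wnf: &Self::Wnf, out: &mut Vec<u8>);
}

impl ToDerWnf for u32 {
    // For INTEGER, the WNF is just the minimal big-endian content bytes.
    type Wnf = Vec<u8>;

    fn normalize(&self) -> Vec<u8> {
        let bytes = self.to_be_bytes();
        let start = bytes.iter().position(|&b| b != 0).unwrap_or(3);
        let mut content = Vec::new();
        if bytes[start] & 0x80 != 0 {
            content.push(0); // sign pad: keep the value non-negative
        }
        content.extend_from_slice(&bytes[start..]);
        content
    }

    fn der_len(wnf: &Vec<u8>) -> usize {
        2 + wnf.len() // tag + short-form length + content
    }

    fn write_der(wnf: &Vec<u8>, out: &mut Vec<u8>) {
        out.push(0x02); // INTEGER tag
        out.push(wnf.len() as u8); // short form: content is < 128 bytes here
        out.extend_from_slice(wnf);
    }
}

fn main() {
    let wnf = 65535u32.normalize(); // resolved once...
    assert_eq!(u32::der_len(&wnf), 5); // ...reused for the length...
    let mut out = Vec::new();
    u32::write_der(&wnf, &mut out); // ...and for the actual encoding
    assert_eq!(out, [0x02, 0x03, 0x00, 0xFF, 0xFF]);
}
```

The point is that `normalize` resolves all lengths exactly once, and both the length query and the final write are straight-line code over the cached form.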
That solution would however make ToDer not object-safe until GATs for trait objects land. Not sure if that is a significant usability problem though.
(Fun fact: The last time I had this issue in asnom I actually inverted encoding, writing out the bodies of fields first and the headers afterwards, making heavy use of scatter/gather-IO. But that approach was absurdly complex, hard to use, and not all that efficient in most cases anyway.)
You are raising several good points here, on things that took me some time (and headaches) before getting to the current code. And I will humbly admit some better solution could exist :)
I'll reply here, but maybe it will be better to split the discussion on several issues.
About ToDer and length computation
I hesitated a long time between:
- a streaming implementation, which would not need to allocate, but would require computing the length recursively before writing the object (and thus walking the DER tree twice);
- an allocating version: the code is much easier, though in some cases you have to allocate twice: once to serialize the contents (only then can you compute the length [1]), and then again to write the header and the object.
Finally, I settled on using a Write trait, so there is only one allocation.
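For readers following along, the Write-based shape could be sketched roughly like this (the signatures are simplified stand-ins, not the crate's exact API): the content length is computed first, then header and content are streamed into a single writer, so at most one allocation happens, on the caller's side.

```rust
use std::io::{self, Write};

// Illustrative sketch of the Write-based approach, not the crate's API.
trait ToDerSketch {
    fn der_content_len(&self) -> usize;
    fn write_der_content(&self, w: &mut dyn Write) -> io::Result<usize>;

    fn write_der(&self, w: &mut dyn Write) -> io::Result<usize> {
        let len = self.der_content_len();
        // Short-form length only, for brevity (assumes content < 128 bytes).
        let header = [0x04, len as u8]; // 0x04 = OCTET STRING tag, for the demo
        w.write_all(&header)?;
        let written = self.write_der_content(w)?;
        Ok(header.len() + written)
    }
}

impl ToDerSketch for [u8] {
    fn der_content_len(&self) -> usize {
        self.len()
    }
    fn write_der_content(&self, w: &mut dyn Write) -> io::Result<usize> {
        w.write_all(self)?;
        Ok(self.len())
    }
}

fn main() -> io::Result<()> {
    let mut out = Vec::new(); // the single allocation, owned by the caller
    let n = [1u8, 2, 3].write_der(&mut out)?;
    assert_eq!(n, 5);
    assert_eq!(out, [0x04, 0x03, 0x01, 0x02, 0x03]);
    Ok(())
}
```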
[1] Note that the fact that the DER length is itself a variable-length field does not make it easier to compute, and prevents leaving placeholder bytes that could be updated after the computation.
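To make that footnote concrete, here is the definite-length encoding from X.690: short form below 128, long form (an initial `0x80 | n` byte followed by `n` length bytes) above. Since the output width depends on the value, you cannot reserve placeholder bytes before knowing the length.

```rust
// Definite-length octets per X.690; the encoded width varies with the value.
fn der_length(len: usize) -> Vec<u8> {
    if len < 128 {
        vec![len as u8] // short form: a single byte
    } else {
        let bytes = len.to_be_bytes();
        let first = bytes.iter().position(|&b| b != 0).unwrap();
        // long form: initial byte says how many length bytes follow
        let mut out = vec![0x80 | (bytes.len() - first) as u8];
        out.extend_from_slice(&bytes[first..]);
        out
    }
}

fn main() {
    assert_eq!(der_length(5), [0x05]); // 1 byte
    assert_eq!(der_length(200), [0x81, 0xC8]); // 2 bytes
    assert_eq!(der_length(500), [0x82, 0x01, 0xF4]); // 3 bytes
}
```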
I am not sure I get your idea of the weak normalized form yet, but I'm really interested, so if there is a way to write a minimal example or just some pseudo-code, I'd gladly have a look. We can also discuss it on IRC/discord if you prefer.
Associated types and GATs on traits
I find it funny you are raising that, because it is something I also struggled with for other parts of this crate. In particular, I tried changing the Error parameter of the FromBer trait into an associated type (which would be better imho, since usually there is only one error type per parse function), but quickly ran into very complex errors. Roughly, they are related to the following:
- I need to define bounds on the associated types.
- FromBer is automatically implemented for any type providing TryFrom<Any>, but also needs to be implemented for Any itself. If the error type becomes too generic, the compiler refuses, because there is a blanket implementation of TryFrom<T> for T.
I had other errors, and never ended up with something satisfying.
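To illustrate the overlap problem, here is a toy version of the blanket impl with a concrete error type (all types here are illustrative stand-ins, not the crate's). As far as I can tell, the conflict appears once you also want a direct `impl FromBer for Any`: the std blanket `impl<T, U: Into<T>> TryFrom<U> for T` already gives `Any: TryFrom<Any>` (with `Error = Infallible`), which coherence treats as potentially overlapping.

```rust
struct Any(u8);

#[derive(Debug, PartialEq)]
struct Error;

struct MyBool(bool);

impl TryFrom<Any> for MyBool {
    type Error = Error;
    fn try_from(any: Any) -> Result<Self, Error> {
        match any.0 {
            0x00 => Ok(MyBool(false)),
            0xFF => Ok(MyBool(true)),
            _ => Err(Error), // DER allows only 0x00 / 0xFF for BOOLEAN
        }
    }
}

trait FromBer: Sized {
    fn from_ber(input: Any) -> Result<Self, Error>;
}

// Blanket impl: every `TryFrom<Any>` type gets `FromBer` for free.
impl<T> FromBer for T
where
    T: TryFrom<Any, Error = Error>,
{
    fn from_ber(input: Any) -> Result<Self, Error> {
        T::try_from(input)
    }
}

// A direct `impl FromBer for Any { ... }` is where the trouble starts,
// especially once the error type is made generic: the std blanket
// `TryFrom` impl already covers `Any`, so the impls overlap (E0119).

fn main() {
    assert!(matches!(MyBool::from_ber(Any(0xFF)), Ok(MyBool(true))));
    assert_eq!(MyBool::from_ber(Any(0x42)).err(), Some(Error));
}
```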
On that issue, I'd gladly take some advice/hints!
Note: for the ToDer trait, maybe the situation is different, because the error is not generic. But it would be good to have a similar solution in the different traits used in this crate.