Hi @mnako,
Good work! I have a question about a big problem that affects a lot of mail-parser projects: have you tried parsing large attachments in a project that analyzes messages with concurrent jobs?
I see that you use ioutil.ReadAll in all of the decoder functions. That works fine for files of 10 MB to 30 MB, but with files of 50+ MB this program will eat all of the OS's resources.
I tried this approach in a past project. Go's garbage collector is good, but if you parse concurrently (10 to 100 messages per second), it will kill your OS. A minimal sketch of what I mean is below.
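To illustrate, here is a minimal sketch (the function names, the base64 part, and the output path are placeholders for this example, not the project's real decoder API). It contrasts buffering the whole decoded attachment in memory with streaming it to disk through a fixed-size buffer:

```go
package main

import (
	"encoding/base64"
	"fmt"
	"io"
	"os"
	"strings"
)

// decodeAllInMemory mirrors the ioutil.ReadAll approach: the entire decoded
// attachment is buffered on the heap, so a 50 MB part costs at least 50 MB,
// multiplied by the number of messages being parsed concurrently.
func decodeAllInMemory(encoded io.Reader) ([]byte, error) {
	dec := base64.NewDecoder(base64.StdEncoding, encoded)
	return io.ReadAll(dec) // whole attachment lands in memory at once
}

// decodeToFile streams the decoded bytes straight to disk instead.
// Memory use stays bounded by io.Copy's internal 32 KB buffer,
// regardless of the attachment size.
func decodeToFile(encoded io.Reader, path string) (int64, error) {
	dec := base64.NewDecoder(base64.StdEncoding, encoded)
	f, err := os.Create(path)
	if err != nil {
		return 0, err
	}
	defer f.Close()
	return io.Copy(f, dec)
}

func main() {
	// Tiny stand-in for a base64-encoded MIME part.
	encoded := base64.StdEncoding.EncodeToString([]byte("attachment bytes"))

	n, err := decodeToFile(strings.NewReader(encoded), "attachment.bin")
	if err != nil {
		panic(err)
	}
	fmt.Println("wrote", n, "bytes without buffering the whole part")
}
```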
That is a very good suggestion. I still need to fix some known bugs (like the parser crashing if there is no plaintext) but I will get back to that idea as soon as we hit a stable 1.0 release.
I would suggest implementing it a different way: https://kgrz.io/reading-files-in-go-an-overview.html
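The chunked-reading pattern that article describes looks roughly like this sketch (the file name and the 32 KB chunk size are arbitrary placeholders, not values from this project):

```go
package main

import (
	"fmt"
	"io"
	"os"
)

func main() {
	// Arbitrary example file; in a parser this would be the raw message
	// or an attachment body exposed as an io.Reader.
	f, err := os.Open("large-message.eml")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// One reused 32 KB buffer instead of one big allocation per attachment.
	buf := make([]byte, 32*1024)
	var total int64

	for {
		n, err := f.Read(buf)
		if n > 0 {
			// Process buf[:n] here (write it out, feed a decoder, hash it, ...).
			total += int64(n)
		}
		if err == io.EOF {
			break
		}
		if err != nil {
			panic(err)
		}
	}

	fmt.Println("processed", total, "bytes with constant memory")
}
```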