
Feature request : big attachment vs concurrent parsing #14

Open
mgarbin opened this issue Jul 15, 2022 · 1 comment

Comments

mgarbin commented Jul 15, 2022

Hi @mnako,
Good job! I have a question about a big problem that affects many mail parser projects: have you tried parsing big attachments in a project that analyzes messages with concurrent jobs?
I see that you use ioutil.ReadAll in all the decoder functions. That works well when parsing files of 10 MB to 30 MB, but with files of 50+ MB this program will eat all the OS's resources.
I tried this approach in a past project. The garbage collector offered by Go is good, but if you parse concurrently (10 to 100 messages per second), it will kill your OS.

I suggest implementing it a different way: https://kgrz.io/reading-files-in-go-an-overview.html

mnako (Owner) commented Aug 6, 2022

Hi @mgarbin , and thank you for the message.

That is a very good suggestion. I still need to fix some known bugs (like the parser crashing if there is no plaintext) but I will get back to that idea as soon as we hit a stable 1.0 release.
