memory leak #114
Seems it's in findnodes; just parsing the document doesn't make memory climb so fast. I'm trying to track it down with valgrind.
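For reference, one way to get a leak report from valgrind for a Raku script looks roughly like this (a sketch: the script name is a placeholder, and `MVM_SPESH_DISABLE` assumes the MoarVM backend, where disabling spesh/JIT tends to give cleaner stack traces):

```shell
# Run the leaking script under valgrind's leak checker.
# leak-test.raku is a placeholder for the reproduction script.
MVM_SPESH_DISABLE=1 valgrind --leak-check=full --show-leak-kinds=definite \
    raku leak-test.raku 2> valgrind.log
```

Leaks that valgrind attributes to libxml2 allocation sites (rather than MoarVM's own pools) would point at wrapper objects not releasing their native structures.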
After building/compiling with:
Output:
So libxml2 objects are not getting released, which is supposed to happen when the Raku objects are destroyed.
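For context, the expected release pattern is along these lines (a simplified sketch of the idea, not LibXML's actual source; `xmlFreeDoc` is the real libxml2 call, the wrapper class names here are illustrative):

```raku
use NativeCall;

# Opaque pointer to a native libxml2 document.
class xmlDoc is repr('CPointer') {
    sub xmlFreeDoc(xmlDoc) is native('xml2') {*}
    method free { xmlFreeDoc(self) }
}

# Illustrative Raku-side wrapper.
class Document {
    has xmlDoc $.native;
    submethod DESTROY {
        # When the GC collects the Raku wrapper, the native
        # libxml2 document should be freed with it.
        .free with $!native;
    }
}
```

If the wrappers are never collected, DESTROY never runs and the native documents accumulate, which matches the climbing memory usage.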
I added a count of calls to LibXML::Node's DESTROY, and added this to the script:

So the issue seems to be on the Raku side: DESTROY is being called, and presumably the Raku objects aren't being GC'd.
I think this is similar to #85. Both are parsing a document, then performing an XPath query.
I'm eyeballing 92e3792 again, which somehow triggered the earlier leak. Edit: one thing in particular to be careful of is accidental closures in native callbacks; these cause memory leaks.
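The accidental-closure pitfall can be illustrated with a hypothetical NativeCall callback (the library name and functions here are made up for illustration, not taken from the patch):

```raku
use NativeCall;

# Hypothetical native function that stores a callback.
sub set-error-handler(&cb (Str --> int32)) is native('demo') {*}

class Handler {
    method install {
        # PITFALL: this block closes over `self`, so as long as the
        # native side holds the callback, the GC can never collect
        # this Handler or anything it keeps alive.
        set-error-handler(-> Str $msg --> int32 { self.log($msg); 0 });
    }
    method log($m) { note $m }
}
```

A closure captured per parsed document would keep each document's wrapper graph alive, which would produce exactly this kind of steady growth.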
Even a simple script like this reproduces it:

```raku
use HTTP::UserAgent;
use LibXML::Document;

my HTTP::UserAgent $ua .= new;
my $html = $ua.get('https://nokogiri.org/tutorials/installing_nokogiri.html').content;

my $start-time = now;
for ^15000 {
    my LibXML::Document $doc .= parse: :html, :recover, :suppress-errors, :string($html);
    $doc.first('//article//h2');
}
$*ERR.say: "done in: {now - $start-time} seconds";
```
With the following short script, which parses an HTML page and runs an XPath search, memory usage climbs continually until the process is killed for running out of memory: https://gist.github.com/bo-tato/fe8194f53be5061a43af5264a8ec3f66