Hello again,

I think we've touched on this issue before, but I've run back into it and I can't solve it. I'm trying to annotate a graph with coordinates and I keep getting the following error:
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted (core dumped)
I've taken this to mean I've run out of memory, so I've addressed it in numerous ways. I've used a server with a huge amount of RAM (500 GB) and a huge amount of disk space (1 TB). I've also used "--disk-swap" and "--mem-cap-gb", to no avail. My data isn't that massive (around 2 GB), but I am annotating by sequence header (of which there are thousands) as well as with coordinates, so I'm wondering if this is why I'm running into issues. Do you think my setup needs to be larger to cope with my request, or is there something else you think might help?
If it helps, all of my metagraph installations that are throwing this error were installed from source without problems on an Ubuntu server.
Since you have many headers and columns, it's most likely that metagraph tries to allocate a buffer that is too large (1 GB by default) for each column, and hence eventually runs out of RAM. Try reducing the buffer size (pass --mem-cap-gb 0.001). Also, the data is quite small, so you can do everything without disk swap (drop the --disk-swap flag).
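For reference, a minimal sketch of what the adjusted invocation might look like. The file names are placeholders, and the header/coordinate flags are assumptions based on typical metagraph annotate usage; only --mem-cap-gb 0.001 (and dropping --disk-swap) comes from the suggestion above, so please verify the exact options against metagraph annotate --help on your build:

    # Placeholder paths; --anno-header and --coordinates are assumed flags,
    # check `metagraph annotate --help` for the exact spelling on your build.
    metagraph annotate \
        -i graph.dbg \
        --anno-header \
        --coordinates \
        --mem-cap-gb 0.001 \
        -o annotation \
        sequences.fasta.gz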