### `StacktraceMemoryBatchsizeCache`: Stacktrace & Available Memory (*the default*)
This memoizes the successful batchsizes for a given call trace and the available memory at that point.
For most machine learning code, this is sufficient to remember the right batchsize without having to look at the actual arguments or understand more of the semantics.
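
To make the idea concrete, here is a minimal conceptual sketch, not toma's actual implementation: the cache key combines the current call stack with the free GPU memory, rounded into coarse buckets so small fluctuations still hit the same entry. The helper names, the bucket size, and the use of `torch.cuda.mem_get_info` are illustrative assumptions.

```python
import traceback

import torch

# Maps (call stack, memory bucket) -> last successful batchsize.
_batchsize_cache = {}


def _cache_key(memory_bucket_mb: int = 256):
    # Identify the call site by its stack trace; drop the two helper frames
    # so the key points at the user's code.
    stack = tuple(
        (frame.filename, frame.lineno) for frame in traceback.extract_stack()[:-2]
    )
    free_bytes, _total_bytes = torch.cuda.mem_get_info()
    memory_bucket = free_bytes // (memory_bucket_mb * 1024 * 1024)
    return stack, memory_bucket


def remember_batchsize(batchsize: int) -> None:
    _batchsize_cache[_cache_key()] = batchsize


def suggested_batchsize(default: int) -> int:
    return _batchsize_cache.get(_cache_key(), default)
```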

The implicit assumption is that, after a few iterations, a stable state will be reached with regard to GPU and CPU memory usage.

To limit the CPU memory of the process, toma provides:

```python
import toma.cpu_memory

toma.cpu_memory.set_cpu_memory_limit(8)
```

This can also be useful to avoid accidental swap thrashing.
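
Under the hood, such a limit is typically enforced by the operating system. The following is a minimal sketch, assuming a Unix-like system and Python's standard `resource` module, of how an address-space cap can be set; the helper name and the gigabyte unit are assumptions for illustration, not toma's source:

```python
import resource


def set_cpu_memory_limit_gb(num_gigabytes: float) -> None:
    """Cap the process's address space (Unix-only) so that allocations
    beyond the limit fail with MemoryError instead of spilling into swap."""
    limit_bytes = int(num_gigabytes * 1024 ** 3)
    # Keep the existing hard limit; only lower the soft limit for this process.
    _soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))
```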