While doing this homework, I found that `make test_example` drained my disk space, and the test then panicked with `no space left on device`.
After reading the test code, I found that it never cleans up the data generated after each test case.
In my opinion, since the amount of data each test case generates is not negligible, the test should remove it after each case.
Something like this:
```go
// tidb/mapreduce/urltop10_test.go
func testURLTop(t *testing.T, rounds RoundsArgs) {
	// ......
	for k := range dataSize {
		for i, gen := range gens {
			// generate data
			prefix := dataPrefix(i, dataSize[k], nMapFiles[k])
			// ......
			// added code here: clean up, or clean up only when the case has passed
			os.RemoveAll(prefix)
		}
	}
}
```
Is there any good reason not to do this?
Or is the cleanup part of the homework?
While doing this homework, I ran `make test_example` and found that the generated `mr_homework` data took up a large amount of disk space, and the test panicked with `no space left on device`.
After reading the test code, I found that it does not clean up the generated files after each test case. (Perhaps the user is expected to run `make cleanup` manually afterwards? But that easily leads to the problem above.)
In my view, the files each test case generates are not small enough to ignore; could the test delete them right after each case finishes (or finishes and passes)? (Or is manual cleanup somewhere part of the homework?)
Like the snippet above.