extra images are sometimes not deleted after abort #94
OK, now using SIGABRT, but I'm finding that I need to background each image-download command, then grab its PID, then kill that PID if it turns out we're going to abort that download, then delete any partially downloaded image. This is becoming quite a "thing". Ugh!
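A minimal sketch of that background/PID/kill/cleanup pattern, assuming a simulated download in place of the real fetch command (the function and filename below are illustrative, not googliser's actual code):

```shell
#!/usr/bin/env bash
# 'slow_download' stands in for the real wget/curl call so this sketch
# runs without a network connection.

target='image_001.jpg'

slow_download() {                 # simulates a download that takes a while
    sleep 30
    printf 'image data' > "$target"
}

slow_download &                   # background the download command
download_pid=$!                   # grab its PID

# ... later, when we decide to abort this download ...
if kill -0 "$download_pid" 2>/dev/null; then   # is it still running?
    kill "$download_pid"                       # stop it
    wait "$download_pid" 2>/dev/null || true   # reap it; ignore the status
fi
rm -f "$target"                   # delete any partially downloaded image
```

`kill -0` sends no signal; it only tests whether the PID still exists, which avoids an error when the downloader has already finished.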
Also need to save the output from each backgrounded downloader, covering its exit status and messages.
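One way to do that is to redirect each background job's stdout and stderr to a per-job log file, then use `wait <pid>` to collect its exit status. A sketch, with a stand-in function in place of the real downloader (the `return 8` mirrors wget's exit code for a server error response):

```shell
#!/usr/bin/env bash
# 'fake_download' is a stand-in for the real download command.

fake_download() {
    echo 'downloading...'        # a normal progress message
    echo 'server error' >&2      # an error message
    return 8                     # wget uses 8 for server error responses
}

logfile='downloader.1.log'

fake_download > "$logfile" 2>&1 &   # capture stdout+stderr per job
pid=$!

status=0
wait "$pid" || status=$?            # wait returns the job's exit status

echo "exit status: $status"
cat "$logfile"
```

Because each job writes to its own file, the logs don't interleave, and the parent can report them after the fact.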
Ugh... I can't reliably catch and kill each background process and ensure it hasn't already downloaded an image but not yet marked it as complete. There's a narrow window of time where this can occur. I think I'll work on downloading images into temp files, then renaming the ones we want to keep; the temps can be deleted afterwards. The gallery builder should be coded to work only with non-temp image files. This should be a more consistent method. This is my first project dealing with asynchronous processes. It's been a learning curve. 😁
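The temp-then-rename idea can be sketched as below, assuming a stand-in fetch function and illustrative filenames. The key property is that `mv` within one filesystem is an atomic `rename()`, so a file either has its final name (complete) or a temp name (safe to delete):

```shell
#!/usr/bin/env bash
# 'download' is a stand-in for the real fetch command.

final='image_001.jpg'

download() { printf 'image bytes' > "$1"; }   # pretend the fetch succeeds

tmp=$(mktemp "${final}.XXXXXX")   # unique temp name alongside the target

if download "$tmp"; then
    mv -f "$tmp" "$final"         # atomic rename marks the image complete
else
    rm -f "$tmp"                  # failed or aborted: discard the partial file
fi

# After an abort, a sweep can safely remove any leftover temps; completed
# images (which lack the temp suffix) are untouched:
rm -f "${final}".*
```

The gallery builder then only matches final names, so a partially written temp can never slip into the gallery, which closes the race window described above.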
What's the scenario here? That the user aborts the run?
Yes, but aborting also happens during a regular run. That's the one I'm testing. When we have enough images, all the backgrounded downloaders are aborted, and we then need to clean up any partially (or sometimes fully) downloaded images they may have created.
Today's progress: getting there. 😄 I've written from scratch a new and quite simple version of the downloading functions used in googliser. For testing, it can run up to 500 instances of itself, each downloading the same remote file and saving it locally under a different name. In operation, it launches as many forks as it can in 10 seconds (which turns out to be all 500 of them), and each one starts a backgrounded instance of the downloader. It allows the downloaders to run uninterrupted for 10 seconds, then issues a SIGABRT to simulate the signal issued when googliser has received enough valid images. This instantly kills each copy of the downloader. So far, it's working flawlessly, but it's still being developed and is not yet ready to transfer into googliser. I'm hoping I'll get a chance to work on this over the Christmas holidays, along with fixing the brakes on the car, repairing the garage roof, etc...
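A scaled-down sketch of that kind of test harness: 5 forks instead of 500, a `sleep` standing in for the shared remote download, and illustrative filenames rather than googliser's own. Each fork traps ABRT so it can delete its own partial file; backgrounding the `sleep` and using `wait` makes the trap fire immediately instead of after the command finishes:

```shell
#!/usr/bin/env bash

downloader() {
    local n=$1
    trap 'rm -f "copy_$n.tmp"; exit 1' ABRT   # clean up partial file on abort
    sleep 30 &                                # stand-in for the slow download
    wait $!                                   # interruptible, so the trap fires at once
    mv "copy_$n.tmp" "copy_$n.jpg"            # only reached if never aborted
}

pids=()
for n in 1 2 3 4 5; do
    : > "copy_$n.tmp"                         # the partial download appears
    downloader "$n" &                         # launch a backgrounded instance
    pids+=("$!")
done

sleep 1                                       # let the forks run briefly
kill -ABRT "${pids[@]}" 2>/dev/null           # simulate 'enough images received'
wait                                          # collect all the aborted forks
ls copy_* 2>/dev/null || echo 'all partial downloads cleaned up'
```

Because each fork owns exactly one temp file and cleans it up in its own trap, the parent doesn't need to track which downloads were in flight when the abort was issued.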
Might be time to look at `trap`ping an abort signal? Maybe use SIGABRT (6)? Then modify the abort routine to send 6 instead of 9 to running jobs.