Cannot override log related spider settings #10
Any solution would be fine, but here are the changes for the above:
@andrewbaxter the idea behind using priority 'cmdline' here is that the default ScrapyRT settings should have the highest possible priority and override any project settings; ScrapyRT relies on that. In contrast, 'default' is the lowest possible priority and will be overridden by project settings. I don't think Scrapy's priority='default' applies here: overriding one of Scrapy's default settings can't cause harm, but here it can. I think it would be easier to allow overriding the default ScrapyRT spider settings from CrawlManager. That way you could remove or change any setting ScrapyRT is forcing.
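To make the priority argument above concrete, here is a minimal, self-contained sketch of Scrapy's settings-priority semantics (not the real `scrapy.settings.Settings` class, just a toy model): a value only replaces an existing one when it is set at an equal or higher priority. The priority numbers mirror Scrapy's documented `SETTINGS_PRIORITIES`.

```python
# Toy model of Scrapy's settings priorities. Numbers match Scrapy's
# documented SETTINGS_PRIORITIES mapping.
SETTINGS_PRIORITIES = {
    'default': 0,
    'command': 10,
    'project': 20,
    'spider': 30,
    'cmdline': 40,
}

class PrioritySettings:
    def __init__(self):
        self._store = {}  # name -> (value, numeric priority)

    def set(self, name, value, priority='project'):
        p = SETTINGS_PRIORITIES[priority]
        # A new value only wins at equal or higher priority.
        if name not in self._store or p >= self._store[name][1]:
            self._store[name] = (value, p)

    def get(self, name):
        return self._store[name][0] if name in self._store else None

# Because ScrapyRT applies its log settings at 'cmdline' (the highest
# priority), a later 'project'-level override is silently ignored:
s = PrioritySettings()
s.set('LOG_FILE', 'scrapyrt.log', priority='cmdline')
s.set('LOG_FILE', None, priority='project')
print(s.get('LOG_FILE'))  # still 'scrapyrt.log'
```

This is exactly why the issue title says the log settings "cannot be overridden": the user's project-level values sit below the 'cmdline' priority ScrapyRT uses.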
@andrewbaxter oh, I think I missed one more option that doesn't require any changes to ScrapyRT: just override CrawlManager and set any settings you want there. This method returns Scrapy Settings, which you can easily update.
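The workaround being suggested might look like the sketch below. The class name `CrawlManager` and the method name `get_project_settings` are assumptions standing in for whatever your installed ScrapyRT version actually exposes (check the source of your version before relying on them); a local stand-in base class is used here so the sketch is self-contained and runnable.

```python
class CrawlManager:
    """Stand-in for ScrapyRT's manager class (illustrative only)."""

    def get_project_settings(self):
        # In real ScrapyRT this is where its own log settings get
        # forced on top of the project settings.
        return {'LOG_FILE': 'scrapyrt.log', 'LOG_LEVEL': 'DEBUG'}


class QuietCrawlManager(CrawlManager):
    """Subclass that undoes the forced log settings."""

    def get_project_settings(self):
        settings = super().get_project_settings()
        # Rewrite whatever ScrapyRT forced before the crawl starts.
        settings['LOG_FILE'] = None
        settings['LOG_LEVEL'] = 'INFO'
        return settings


print(QuietCrawlManager().get_project_settings()['LOG_LEVEL'])  # INFO
```

As the next comment notes, this pattern depends on ScrapyRT implementation details, so it can silently break across versions.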
Also, CrawlManager overrides seem dependent on implementation details: if there's a chance the implementation could change and silently break our code (e.g. by renaming one of those methods), it would be more reliable to create a local fork. Anyway, we're getting by right now, but I would appreciate some sort of supported channel for making log settings changes.
@andrewbaxter I'm thinking about allowing Scrapy settings with a prefix in a config file, and passing that config file to the scrapyrt command. WDYT?
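Such a config file might look roughly like the fragment below. Both the `SCRAPY_` prefix and the section name are hypothetical placeholders; the thread never settled on actual names.

```ini
; Hypothetical config file for the proposal above.
; The prefix and section name are illustrative, not real ScrapyRT options.
[scrapyrt]
SCRAPY_LOG_LEVEL = INFO
SCRAPY_LOG_FILE =
```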
May be related to #62.
@pawelmhm, what do you think about @andrewbaxter's idea of changing the priority?
Any plans for this? By default I find many log files in the logs directory; it seems ScrapyRT creates one file per request, which is somewhat unexpected. What's the best practice for configuring the logs?
AFAICT it's not possible to override LOG_LEVEL, LOG_FILE, LOG_DIR, etc. for spiders, because the dict from get_scrapyrt_settings is applied with priority 'cmdline'.
I assume this is due to conflicting goals:
My take is
scrapy.cfg is typically small enough that requiring the user to either copy it or use a template from the documentation wouldn't be a significant burden.