I want to run Scrapy from a single script, loading all settings from settings.py, but I would like to be able to change some of them:
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

process = CrawlerProcess(get_project_settings())

# ### so what I'm missing here is being able to set or override one or two of the settings ###
# 'followall' is the name of one of the spiders of the project.
process.crawl('testspider', domain='scrapinghub.com')
process.start()  # the script will block here until the crawling is finished
I wasn’t able to get this to work. I tried the following:
settings = scrapy.settings.Settings()
settings.set('RETRY_TIMES', 10)
but it didn’t work.
Note: I’m using the latest version of scrapy.
Answer
In order to override some settings, one way is to set custom_settings, the spider's class attribute, from our script.
So I imported the spider class and then overrode its custom_settings:
from testspiders.spiders.followall import FollowAllSpider

FollowAllSpider.custom_settings = {'RETRY_TIMES': 10}
So this is the whole script:
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
from testspiders.spiders.followall import FollowAllSpider

# Override the setting before the crawler is created.
FollowAllSpider.custom_settings = {'RETRY_TIMES': 10}

process = CrawlerProcess(get_project_settings())
# 'followall' is the name of one of the spiders of the project.
process.crawl('testspider', domain='scrapinghub.com')
process.start()  # the script will block here until the crawling is finished
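As an aside, if you'd rather not touch the spider class at all: your original attempt failed not because Settings.set() is broken, but because the fresh Settings object was never passed to the crawler. get_project_settings() returns a Settings object, and you can override individual values on it before handing it to CrawlerProcess. A minimal sketch, assuming a standard Scrapy project layout (the 'cmdline' priority is used so the override outranks the 'project' priority that settings.py values carry):

    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.project import get_project_settings

    # Load settings.py from the project, then override one value.
    settings = get_project_settings()
    # 'cmdline' priority beats the 'project' priority of settings.py entries.
    settings.set('RETRY_TIMES', 10, priority='cmdline')

    process = CrawlerProcess(settings)
    process.crawl('testspider', domain='scrapinghub.com')
    process.start()

This keeps the override local to the script instead of mutating the spider class, which matters if the same spider class is used elsewhere with its default settings.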