python 3.x - Scrapy Logging Level Change
I'm trying to start a Scrapy spider from a script, as shown here:
    import logging
    from scrapy.crawler import CrawlerProcess
    from scrapy.utils.log import configure_logging
    from scrapy.utils.project import get_project_settings

    logging.basicConfig(
        filename='log.txt',
        format='%(levelname)s: %(message)s',
        level=logging.CRITICAL,
    )
    configure_logging(install_root_handler=False)
    process = CrawlerProcess(get_project_settings())
    process.crawl('1740')
    process.start()  # the script will block here until the crawling is finished
I want to configure the logging level of the spider. Even though I don't install the root logger handler and configure basic logging with the logging.basicConfig method, it does not obey the level I set:
    INFO: Enabled spider middlewares:
    ['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
     'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
     'scrapy.spidermiddlewares.referer.RefererMiddleware',
     'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
     'scrapy.spidermiddlewares.depth.DepthMiddleware']
    INFO: Enabled item pipelines:
    ['collector.pipelines.CollectorPipeline']
    INFO: Spider opened
    INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
It follows the format and file name set in basicConfig, but it does not use the logging level. I do not set the logging level anywhere else.
Note: there is no other place where I import logging or change the logging level.
For Scrapy you should define the logging settings in settings.py, as described in the docs. So in settings.py you can set:
    LOG_LEVEL = 'ERROR'  # only display errors
    LOG_FORMAT = '%(levelname)s: %(message)s'
    LOG_FILE = 'log.txt'
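If you need to keep launching the spider from a script, the same settings can also be overridden programmatically before creating the process. A sketch, assuming the project layout and the spider name '1740' from the question (it must run inside a Scrapy project, so it is not runnable standalone):

```python
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

settings = get_project_settings()
# Override the logging settings here instead of calling logging.basicConfig
settings.set('LOG_LEVEL', 'ERROR')
settings.set('LOG_FORMAT', '%(levelname)s: %(message)s')
settings.set('LOG_FILE', 'log.txt')

process = CrawlerProcess(settings)
process.crawl('1740')  # spider name taken from the question
process.start()
```

Either way, the point is to let Scrapy's own logging configuration apply the level, rather than fighting its root handler with basicConfig.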