2019-03-01 09:00:44 [twisted] CRITICAL: Unhandled error in Deferred:

This post gives a simple one-step fix: upgrade the Twisted library with pip3 on Windows so that your Python environment stays up to date.
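For context: Twisted emits "Unhandled error in Deferred:" whenever a Deferred fires with a failure and no errback ever consumes that failure. A minimal standalone sketch, independent of Scrapy, that reproduces and then silences the message:

from twisted.internet import defer

# A Deferred that has already fired with a failure. If it were
# garbage-collected with no errback attached, Twisted would log
# "Unhandled error in Deferred:" (the critical message seen above).
d = defer.Deferred()
d.errback(RuntimeError("boom"))

# Attaching an errback consumes the failure, so the message never appears.
d.addErrback(lambda f: print("handled:", f.getErrorMessage()))

In the Scrapy-on-Windows case, the failure comes from Scrapy's startup code, so the practical remedy is the package upgrade described below rather than an errback.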


Simply upgrade your Twisted installation (with the Windows extras) and the error goes away:

pip3 install --upgrade twisted[windows_platform]
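After the upgrade, a quick way to confirm which Twisted version the interpreter now sees (a sanity check added here, not part of the original one-liner):

# Print the installed Twisted version via the standard library,
# to confirm the pip3 upgrade took effect in this environment.
from importlib.metadata import version
print(version("Twisted"))

If the printed version matches what pip3 just installed, rerun the crawler; the critical "Unhandled error in Deferred" at startup should be gone.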