Why you can't read values from Redis through a Pipeline object: AttributeError: 'Pipeline' object has no attribute 'items'

The error reads: AttributeError: 'Pipeline' object has no attribute 'items'

A Redis Pipeline is a performance optimization: it lets you send many commands to Redis in a single network round trip, cutting network latency.
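The buffering behavior can be illustrated with a toy model (a hypothetical sketch, not redis-py's actual implementation): commands accumulate locally, and nothing reaches the "server" until execute() flushes them all at once.

```python
class ToyPipeline:
    """Toy model of a Redis pipeline: commands are buffered locally and
    only 'sent' when execute() flushes them in one round trip."""

    def __init__(self, server_store):
        self.store = server_store   # stands in for the server's keyspace
        self.buffer = []            # commands queued, not yet sent

    def set(self, key, value):
        self.buffer.append(("SET", key, value))
        return self                 # returns the pipeline, not a reply

    def get(self, key):
        self.buffer.append(("GET", key))
        return self                 # same: no value available yet

    def execute(self):
        """One 'network round trip': run every queued command and
        return their replies as a list, in queue order."""
        replies = []
        for cmd, *args in self.buffer:
            if cmd == "SET":
                self.store[args[0]] = args[1]
                replies.append(True)
            else:  # GET
                replies.append(self.store.get(args[0]))
        self.buffer.clear()
        return replies


pipe = ToyPipeline({})
pipe.set("lang", "python")
pipe.get("lang")
print(pipe.execute())  # → [True, 'python']
```

Note that both set() and get() return the pipeline itself, which is exactly why reading a value "at the command step" goes wrong, as described below.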

The error happens because, when working with Redis through a Pipeline, writes and reads behave slightly differently:

When writing data, you can simply queue the write on the pipeline object (e.g. pipe.set(...)) and then call pipe.execute() to send the batch.

When reading data, however, the values only become available from the return value of pipe.execute(). I had been trying to use the return value of the read call itself (e.g. pipe.hgetall(...)), but that returns the Pipeline object rather than the data, so nothing useful comes back.

So the correct approach is to queue the read on the pipeline, call execute(), and take the value out of the list it returns.
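Here is a minimal runnable sketch of the wrong and right patterns. The key and field names are made up for illustration, and a tiny in-memory stand-in plays the role of redis-py's Pipeline so the snippet runs without a Redis server; with a real `redis.Redis(...).pipeline()` the call sequence is the same.

```python
class FakePipeline:
    """In-memory stand-in for redis-py's Pipeline (hash commands only)."""

    def __init__(self):
        self.store = {}   # stands in for the server's keyspace
        self.buffer = []  # queued commands

    def hset(self, key, field, value):
        self.buffer.append(("HSET", key, field, value))
        return self

    def hgetall(self, key):
        self.buffer.append(("HGETALL", key))
        return self       # no data yet -- just the pipeline itself

    def execute(self):
        replies = []
        for cmd, *args in self.buffer:
            if cmd == "HSET":
                self.store.setdefault(args[0], {})[args[1]] = args[2]
                replies.append(1)
            else:  # HGETALL
                replies.append(dict(self.store.get(args[0], {})))
        self.buffer.clear()
        return replies


pipe = FakePipeline()
pipe.hset("user:1", "name", "alice")

# WRONG: hgetall() on a pipeline returns the pipeline itself, so calling
# .items() on it raises AttributeError -- the error from the title.
try:
    pipe.hgetall("user:1").items()
except AttributeError:
    print("no .items on the pipeline object")

# RIGHT: queue the read, then take the reply out of execute()'s list.
pipe.hgetall("user:1")
results = pipe.execute()   # one reply per queued command, in order
user = results[-1]         # reply of the last queued command
for field, value in user.items():
    print(field, value)    # → name alice
```

The key point: execute() returns one reply per queued command, in queue order, so you index into that list to get your data.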
