Fixing Scrapy's UnicodeDecodeError log output error on Windows 7

This post describes a log-encoding problem encountered when running the Scrapy crawler framework on Windows 7, and how it was resolved: after upgrading Python's logging package, the UnicodeDecodeError raised in the CMD window no longer appeared.

While debugging Scrapy code on Windows 7, an error occurred, but instead of being written to the log file it surfaced in CMD as:

Traceback (most recent call last):
  File "d:\python27\lib\logging\__init__.py", line 884, in emit
    stream.write(fs % msg.encode("UTF-8"))
UnicodeDecodeError: 'gbk' codec can't decode bytes in position 1274-1275: illegal multibyte sequence
Logged from file scraper.py, line 158
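The root cause is an encoding mismatch: the log record contains UTF-8 bytes, but the Windows CMD window uses the system codepage (GBK on a Chinese-locale Windows 7), which cannot decode every UTF-8 byte sequence. A minimal sketch of the mismatch (my own repro, not the post's scraper code):

```python
# -*- coding: utf-8 -*-
# A GBK console cannot decode arbitrary UTF-8 bytes; this is effectively
# what the CMD stream attempts when a log record holds UTF-8 text.
utf8_bytes = "日: item scraped".encode("utf-8")

try:
    utf8_bytes.decode("gbk")  # what a GBK-codepage console effectively does
except UnicodeDecodeError as err:
    print(err.reason)
```

Here the byte pair straddling the Chinese character and the ASCII colon is not a legal GBK sequence, which produces the same "illegal multibyte sequence" failure seen in the traceback.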


Every fix I tried failed. A forum thread then mentioned that this bug does not occur under Python 3, so I tried upgrading the logging package in the Python 2.7 environment:

pip install --upgrade logging


After upgrading logging, the error no longer appeared.
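A related workaround (my own suggestion, not from the original post) is to route log records to a file handler with an explicit UTF-8 encoding, so the console's GBK codec is never involved. A minimal sketch:

```python
import logging

# Hypothetical setup: write records to a UTF-8 log file instead of the
# GBK-codepage console, sidestepping the decode error entirely.
handler = logging.FileHandler("scraper_utf8.log", encoding="utf-8")
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))

logger = logging.getLogger("scraper")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("抓取成功: 日志已写入 UTF-8 文件")
```

Scrapy itself exposes the same idea through its LOG_FILE and LOG_ENCODING settings.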

