scrapyd-deploy

This post walks through a concrete error hit while deploying a Scrapy project with scrapyd-deploy. The exception surfaces inside Twisted's HTTP request handling, but following the stack trace down shows that the real failure is a RuntimeError raised while scrapyd loads the project's spider list, which makes the trace a useful window into how Scrapy and scrapyd fit together internally.

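It helps to know what scrapyd is doing when this error appears: scrapyd-deploy uploads the egg to the addversion.json webservice, whose render_POST calls get_spider_list(), which in turn runs "scrapy list" in a subprocess via scrapyd.runner against the freshly stored egg; any traceback from that subprocess is re-raised as the RuntimeError seen below. You can re-trigger the same spider-list check by hand. A minimal sketch, assuming scrapyd listens on its default port 6800 (an assumption) and using the project name scrapy_splash_test taken from the egg file name in the traceback:

    import json
    from urllib.request import urlopen

    # Ask scrapyd to enumerate the project's spiders; this exercises the
    # same get_spider_list() code path that addversion.json uses.
    url = "http://localhost:6800/listspiders.json?project=scrapy_splash_test"
    with urlopen(url) as resp:
        print(json.load(resp))  # on failure: {"status": "error", "message": ...}

In this case, the scrapyd log recorded the following traceback: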

Traceback (most recent call last):
  File "/usr/local/lib64/python3.6/site-packages/twisted/web/http.py", line 2190, in allContentReceived
    req.requestReceived(command, path, version)
  File "/usr/local/lib64/python3.6/site-packages/twisted/web/http.py", line 917, in requestReceived
    self.process()
  File "/usr/local/lib64/python3.6/site-packages/twisted/web/server.py", line 198, in process
    self.render(resrc)
  File "/usr/local/lib64/python3.6/site-packages/twisted/web/server.py", line 258, in render
    body = resrc.render(self)
--- <exception caught here> ---
  File "/usr/local/lib/python3.6/site-packages/scrapyd/webservice.py", line 21, in render
    return JsonResource.render(self, txrequest).encode('utf-8')
  File "/usr/local/lib/python3.6/site-packages/scrapyd/utils.py", line 20, in render
    r = resource.Resource.render(self, txrequest)
  File "/usr/local/lib64/python3.6/site-packages/twisted/web/resource.py", line 250, in render
    return m(request)
  File "/usr/local/lib/python3.6/site-packages/scrapyd/webservice.py", line 86, in render_POST
    spiders = get_spider_list(project, version=version)
  File "/usr/local/lib/python3.6/site-packages/scrapyd/utils.py", line 137, in get_spider_list
    raise RuntimeError(msg.encode('unicode_escape') if six.PY2 else msg)
builtins.RuntimeError: Traceback (most recent call last):
  File "/usr/lib64/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib64/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.6/site-packages/scrapyd/runner.py", line 40, in <module>
    main()
  File "/usr/local/lib/python3.6/site-packages/scrapyd/runner.py", line 37, in main
    execute()
  File "/usr/local/lib64/python3.6/site-packages/scrapy/cmdline.py", line 149, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib64/python3.6/site-packages/scrapy/crawler.py", line 249, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib64/python3.6/site-packages/scrapy/crawler.py", line 137, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib64/python3.6/site-packages/scrapy/crawler.py", line 336, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib64/python3.6/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib64/python3.6/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib64/python3.6/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib64/python3.6/site-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/lib64/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 656, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 626, in _load_backward_compatible
  File "/tmp/scrapy_splash_test-1543565032-kng4ox3j.egg/******", line 12, in <module>
  File "/tmp/scrapy_splash_test-1543565032-kng4ox3j.egg/******.py", line 21, in ******
  File "/usr/lib64/python3.6/pkgutil.py", line 634, in get_data
    return loader.get_data(resource_name)
OSError: [Errno 0] Error: '**********************'
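The last frames are the real story; everything above them is machinery. Twisted dispatches the POST to scrapyd's webservice, get_spider_list() runs "scrapy list" in a subprocess, and Scrapy's SpiderLoader imports every module in the project's spider packages. During one of those imports (the module and file names are redacted above), code inside the egg ends up in pkgutil.get_data(), and the egg's loader raises OSError: [Errno 0] because the requested resource cannot be read from the zipped egg. A typical cause is a data file that exists in the source tree but was never packaged: scrapyd-deploy builds the egg with setuptools, and anything not declared as package data is silently left out. A minimal setup.py sketch, assuming a hypothetical data/ directory of JSON files; the project name scrapy_splash_test comes from the egg file name, and the entry_points line matches the setup.py that scrapyd-deploy generates:

    from setuptools import setup, find_packages

    setup(
        name="scrapy_splash_test",
        version="1.0",
        packages=find_packages(),
        entry_points={"scrapy": ["settings = scrapy_splash_test.settings"]},
        # Without this, data files are dropped from the egg and
        # pkgutil.get_data() fails much as in the traceback above.
        package_data={"scrapy_splash_test": ["data/*.json"]},
    )

Once the file is actually inside the egg, reading it through pkgutil keeps working both from a source checkout and from the zipped egg, because pkgutil.get_data() asks the package's loader for the bytes instead of touching the filesystem directly. The module and file names here are hypothetical, since the real ones are redacted:

    import json
    import pkgutil

    # Loader-aware read: resolves the resource relative to the package,
    # whether the package lives on disk or inside a zipped egg.
    raw = pkgutil.get_data("scrapy_splash_test", "data/config.json")
    config = json.loads(raw.decode("utf-8"))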
