宫村优子 -- It's only the fairy tale

 
It's only the fairy tale

Who are those little girls in pain?
Just trapped in castle of dark side of moon
Twelve of them shining bright in vain
Like flowers that blossom just once in years

They're dancing in the shadow like whispers of love
Just dreaming of place where they're free as dove
They've never been allowed to love in this cursed cage
It's only the fairy tale they believe


PS C:\Users\童琪琪\Desktop\bishe.6\biyesheji.6\movie_analysis_project\movie_analysis_project> scrapy crawl maoyan -O output.json
2025-11-14 20:48:15 [scrapy.utils.log] INFO: Scrapy 2.11.0 started (bot: movie_analysis_project)
2025-11-14 20:48:15 [scrapy.utils.log] INFO: Versions: lxml 5.2.1.0, libxml2 2.13.1, cssselect 1.2.0, parsel 1.8.1, w3lib 2.1.2, Twisted 22.10.0, Python 3.12.7 | packaged by Anaconda, Inc. | (main, Oct 4 2024, 13:17:27) [MSC v.1929 64 bit (AMD64)], pyOpenSSL 24.2.1 (OpenSSL 3.0.15 3 Sep 2024), cryptography 43.0.0, Platform Windows-11-10.0.26100-SP0
2025-11-14 20:48:15 [scrapy.addons] INFO: Enabled addons: []
2025-11-14 20:48:16 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.throttle.AutoThrottle']
2025-11-14 20:48:16 [scrapy.crawler] INFO: Overridden settings:
{'AUTOTHROTTLE_ENABLED': True,
 'AUTOTHROTTLE_MAX_DELAY': 10,
 'AUTOTHROTTLE_START_DELAY': 3,
 'BOT_NAME': 'movie_analysis_project',
 'CONCURRENT_REQUESTS': 1,
 'CONCURRENT_REQUESTS_PER_DOMAIN': 1,
 'DNSCACHE_SIZE': 1000,
 'DOWNLOAD_DELAY': 3,
 'DOWNLOAD_TIMEOUT': 10,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'movie_analysis_project.spiders',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'RETRY_HTTP_CODES': [500, 502, 503, 504, 408, 403, 400, 404],
 'RETRY_TIMES': 3,
 'SPIDER_MODULES': ['movie_analysis_project.spiders'],
 'TELNETCONSOLE_ENABLED': False,
 'USER_AGENT': 'Mozilla/5.0 (iPhone; CPU iPhone OS 17_5 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 MicroMessenger/8.0.40(0x18002831) NetType/WIFI Language/zh_CN'}
2025-11-14 20:48:17 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2025-11-14 20:48:17 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2025-11-14 20:48:17 [scrapy.middleware] INFO: Enabled item pipelines:
['movie_analysis_project.pipelines.MySQLPipeline']
2025-11-14 20:48:17 [scrapy.core.engine] INFO: Spider opened
2025-11-14 20:48:17 [maoyan] INFO: ✅ 成功连接到数据库 movie_analysis
2025-11-14 20:48:17 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2025-11-14 20:48:17 [maoyan] INFO: ✅ 成功获取响应: https://m.maoyan.com/ajax/movieOnInfoList, 状态码: 200
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '花江夏树,鬼头明里,下野纮', 'category': '未知', 'director': '', 'release_date': '2025-11-14', 'score': 9.7, 'source': 'maoyan', 'title': '鬼灭之刃:无限城篇 第一章 猗窝座再袭'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
Traceback (most recent call last):
  File "D:\Anaconda\Lib\site-packages\scrapy\utils\defer.py", line 348, in maybeDeferred_coro
    result = f(*args, **kw)
  File "D:\Anaconda\Lib\site-packages\pydispatch\robustapply.py", line 55, in robustApply
    return receiver(*arguments, **named)
  File "D:\Anaconda\Lib\site-packages\scrapy\extensions\feedexport.py", line 572, in item_scraped
    slot.exporter.export_item(item)
  File "D:\Anaconda\Lib\site-packages\scrapy\exporters.py", line 150, in export_item
    itemdict = dict(self._get_serialized_fields(item))
  File "D:\Anaconda\Lib\site-packages\scrapy\exporters.py", line 65, in _get_serialized_fields
    item = ItemAdapter(item)
  File "D:\Anaconda\Lib\site-packages\itemadapter\adapter.py", line 225, in __init__
    raise TypeError(f"No adapter found for objects of type: {type(item)} ({item})")
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '杰西·艾森伯格,伍迪·哈里森,戴夫·弗兰科', 'category': '未知', 'director': '', 'release_date': '2025-11-14', 'score': 0.0, 'source': 'maoyan', 'title': '惊天魔盗团3'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '陈平,路扬,董汶亮', 'category': '未知', 'director': '', 'release_date': '2025-08-02', 'score': 9.5, 'source': 'maoyan', 'title': '浪浪山小妖怪'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '李庚希,邓家佳,刘奕铁', 'category': '未知', 'director': '', 'release_date': '2025-10-31', 'score': 0.0, 'source': 'maoyan', 'title': '即兴谋杀'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '黄渤,范丞丞,殷桃', 'category': '未知', 'director': '', 'release_date': '2025-09-30', 'score': 9.6, 'source': 'maoyan', 'title': '浪浪人生'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '艾丽·范宁,迪米特里乌斯·舒斯特-科洛阿玛坦吉,Cameron Brown', 'category': '未知', 'director': '', 'release_date': '2025-11-07', 'score': 9.0, 'source': 'maoyan', 'title': '铁血战士:杀戮之地'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '易烊千玺,舒淇,赵又廷', 'category': '未知', 'director': '', 'release_date': '2025-11-22', 'score': 0.0, 'source': 'maoyan', 'title': '狂野时代'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '山姆·洛克威尔,马克·马龙,奥卡菲娜', 'category': '未知', 'director': '', 'release_date': '2025-08-16', 'score': 9.3, 'source': 'maoyan', 'title': '坏蛋联盟2'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '张枫,宋佳,朱亚文', 'category': '未知', 'director': '', 'release_date': '2025-09-30', 'score': 9.7, 'source': 'maoyan', 'title': '志愿军:浴血和平'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '绪方惠美,林原惠美,宫村', 'category': '未知', 'director': '', 'release_date': '2025-10-31', 'score': 9.4, 'source': 'maoyan', 'title': '天鹰战士:最后的冲击'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
2025-11-14 20:48:17 [scrapy.core.engine] INFO: Closing spider (finished)
2025-11-14 20:48:17 [maoyan] INFO: 📦 数据库连接已关闭
2025-11-14 20:48:17 [scrapy.extensions.feedexport] INFO: Stored csv feed (0 items) in: output.csv
2025-11-14 20:48:17 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 462,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 3453,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'elapsed_time_seconds': 0.813219,
 'feedexport/success_count/FileFeedStorage': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2025, 11, 14, 12, 48, 17, 967632, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 9172,
 'httpcompression/response_count': 1,
 'item_scraped_count': 10,
 'log_count/ERROR': 10,
 'log_count/INFO': 12,
 'log_count/WARNING': 10,
 'response_received_count': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'start_time': datetime.datetime(2025, 11, 14, 12, 48, 17, 154413, tzinfo=datetime.timezone.utc)}
2025-11-14 20:48:17 [scrapy.core.engine] INFO: Spider closed (finished)
PS C:\Users\童琪琪\Desktop\bishe.6\biyesheji.6\movie_analysis_project\movie_analysis_project>