Three Things in Life That Cannot Wait


 

In a person's life, there are three things that cannot wait.

       First: "Poverty"

Poverty cannot wait, because once enough time has passed you grow accustomed to it; by then you will not only fail to break through, you may even smother your own dreams and drift through life in mediocrity.

       Second: "Dreams"

Dreams cannot wait, because each stage of life brings different experiences and ideas. Consider this question: if the dream you had at twenty is only realized at sixty, what would that look like?

Say your dream at twenty was to buy a Ferrari and race it down Germany's derestricted autobahn. You work hard all your life, and by sixty you can finally afford the car; but when it comes to living out that youthful dream, the spirit may be willing while the flesh is weak.

       Third: "Family"

Family cannot wait. We may still be young, with plenty of time ahead to explore and to strive, but our family? Do they still have the time to wait for us to succeed? To wait for us to earn money, give them a good life, and make them proud of us?

 

The tree longs for stillness, but the wind does not cease; the child wishes to care for the parents, but they do not wait. This is the pain of many people, and the lifelong regret of many.

In the first half of life: do not hesitate;

in the second half of life: do not regret.

Live in the present and seize every opportunity, for opportunities vanish in an instant; find a way forward for your own life!
 
Urgent matters: say them slowly;

important matters: say them clearly;

small matters: say them with humor;

matters you are unsure of: say them with caution;

things that have not happened: do not speak of them recklessly;

things you cannot do: do not promise them lightly;

things that would hurt others: do not say them at all;

unpleasant matters: address the issue, not the person;

happy matters: mind the occasion;

sad matters: do not tell them to everyone you meet;

other people's matters: speak of them carefully;

your own matters: listen to what your heart says;

present matters: do them first, then talk;

future matters: leave them for the future.

To kindred spirits who read this: may we encourage one another!
 

PS C:\Users\童琪琪\Desktop\bishe.6\biyesheji.6\movie_analysis_project\movie_analysis_project> scrapy crawl maoyan -O output.json
2025-11-14 20:48:15 [scrapy.utils.log] INFO: Scrapy 2.11.0 started (bot: movie_analysis_project)
2025-11-14 20:48:15 [scrapy.utils.log] INFO: Versions: lxml 5.2.1.0, libxml2 2.13.1, cssselect 1.2.0, parsel 1.8.1, w3lib 2.1.2, Twisted 22.10.0, Python 3.12.7 | packaged by Anaconda, Inc. | (main, Oct 4 2024, 13:17:27) [MSC v.1929 64 bit (AMD64)], pyOpenSSL 24.2.1 (OpenSSL 3.0.15 3 Sep 2024), cryptography 43.0.0, Platform Windows-11-10.0.26100-SP0
2025-11-14 20:48:15 [scrapy.addons] INFO: Enabled addons: []
2025-11-14 20:48:16 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.feedexport.FeedExporter',
 'scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.throttle.AutoThrottle']
2025-11-14 20:48:16 [scrapy.crawler] INFO: Overridden settings:
{'AUTOTHROTTLE_ENABLED': True,
 'AUTOTHROTTLE_MAX_DELAY': 10,
 'AUTOTHROTTLE_START_DELAY': 3,
 'BOT_NAME': 'movie_analysis_project',
 'CONCURRENT_REQUESTS': 1,
 'CONCURRENT_REQUESTS_PER_DOMAIN': 1,
 'DNSCACHE_SIZE': 1000,
 'DOWNLOAD_DELAY': 3,
 'DOWNLOAD_TIMEOUT': 10,
 'FEED_EXPORT_ENCODING': 'utf-8',
 'LOG_LEVEL': 'INFO',
 'NEWSPIDER_MODULE': 'movie_analysis_project.spiders',
 'REQUEST_FINGERPRINTER_IMPLEMENTATION': '2.7',
 'RETRY_HTTP_CODES': [500, 502, 503, 504, 408, 403, 400, 404],
 'RETRY_TIMES': 3,
 'SPIDER_MODULES': ['movie_analysis_project.spiders'],
 'TELNETCONSOLE_ENABLED': False,
 'USER_AGENT': 'Mozilla/5.0 (iPhone; CPU iPhone OS 17_5 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148 MicroMessenger/8.0.40(0x18002831) NetType/WIFI Language/zh_CN'}
2025-11-14 20:48:17 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2025-11-14 20:48:17 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2025-11-14 20:48:17 [scrapy.middleware] INFO: Enabled item pipelines:
['movie_analysis_project.pipelines.MySQLPipeline']
2025-11-14 20:48:17 [scrapy.core.engine] INFO: Spider opened
2025-11-14 20:48:17 [maoyan] INFO: ✅ 成功连接到数据库 movie_analysis
2025-11-14 20:48:17 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2025-11-14 20:48:17 [maoyan] INFO: ✅ 成功获取响应: https://m.maoyan.com/ajax/movieOnInfoList, 状态码: 200
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '花江夏树,鬼头明里,下野纮', 'category': '未知', 'director': '', 'release_date': '2025-11-14', 'score': 9.7, 'source': 'maoyan', 'title': '鬼灭之刃:无限城篇 第一章 猗窝座再袭'}
2025-11-14 20:48:17 [scrapy.utils.signal] ERROR: Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x00000232B0F17A10>>
Traceback (most recent call last):
  File "D:\Anaconda\Lib\site-packages\scrapy\utils\defer.py", line 348, in maybeDeferred_coro
    result = f(*args, **kw)
  File "D:\Anaconda\Lib\site-packages\pydispatch\robustapply.py", line 55, in robustApply
    return receiver(*arguments, **named)
  File "D:\Anaconda\Lib\site-packages\scrapy\extensions\feedexport.py", line 572, in item_scraped
    slot.exporter.export_item(item)
  File "D:\Anaconda\Lib\site-packages\scrapy\exporters.py", line 150, in export_item
    itemdict = dict(self._get_serialized_fields(item))
  File "D:\Anaconda\Lib\site-packages\scrapy\exporters.py", line 65, in _get_serialized_fields
    item = ItemAdapter(item)
  File "D:\Anaconda\Lib\site-packages\itemadapter\adapter.py", line 225, in __init__
    raise TypeError(f"No adapter found for objects of type: {type(item)} ({item})")
TypeError: No adapter found for objects of type: <class 'NoneType'> (None)
[the identical ERROR and TypeError traceback repeat after each of the warnings below; 9 further occurrences omitted]
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '杰西·艾森伯格,伍迪·哈里森,戴夫·弗兰科', 'category': '未知', 'director': '', 'release_date': '2025-11-14', 'score': 0.0, 'source': 'maoyan', 'title': '惊天魔盗团3'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '陈子平,路扬,董汶亮', 'category': '未知', 'director': '', 'release_date': '2025-08-02', 'score': 9.5, 'source': 'maoyan', 'title': '浪浪山小妖怪'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '李庚希,邓家佳,刘奕铁', 'category': '未知', 'director': '', 'release_date': '2025-10-31', 'score': 0.0, 'source': 'maoyan', 'title': '即兴谋杀'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '黄渤,范丞丞,殷桃', 'category': '未知', 'director': '', 'release_date': '2025-09-30', 'score': 9.6, 'source': 'maoyan', 'title': '浪浪人生'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '艾丽·范宁,迪米特里乌斯·舒斯特-科洛阿玛坦吉,Cameron Brown', 'category': '未知', 'director': '', 'release_date': '2025-11-07', 'score': 9.0, 'source': 'maoyan', 'title': '铁血战士:杀戮之地'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '易烊千玺,舒淇,赵又廷', 'category': '未知', 'director': '', 'release_date': '2025-11-22', 'score': 0.0, 'source': 'maoyan', 'title': '狂野时代'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '山姆·洛克威尔,马克·马龙,奥卡菲娜', 'category': '未知', 'director': '', 'release_date': '2025-08-16', 'score': 9.3, 'source': 'maoyan', 'title': '坏蛋联盟2'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '张子枫,宋佳,朱亚文', 'category': '未知', 'director': '', 'release_date': '2025-09-30', 'score': 9.7, 'source': 'maoyan', 'title': '志愿军:浴血和平'}
2025-11-14 20:48:17 [maoyan] WARNING: ⚠️ 接收到空或非法 item: {'actors': '绪方惠美,林原惠美,宫村优子', 'category': '未知', 'director': '', 'release_date': '2025-10-31', 'score': 9.4, 'source': 'maoyan', 'title': '天鹰战士:最后的冲击'}
2025-11-14 20:48:17 [scrapy.core.engine] INFO: Closing spider (finished)
2025-11-14 20:48:17 [maoyan] INFO: 📦 数据库连接已关闭
2025-11-14 20:48:17 [scrapy.extensions.feedexport] INFO: Stored csv feed (0 items) in: output.csv
2025-11-14 20:48:17 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 462,
 'downloader/request_count': 1,
 'downloader/request_method_count/GET': 1,
 'downloader/response_bytes': 3453,
 'downloader/response_count': 1,
 'downloader/response_status_count/200': 1,
 'elapsed_time_seconds': 0.813219,
 'feedexport/success_count/FileFeedStorage': 1,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2025, 11, 14, 12, 48, 17, 967632, tzinfo=datetime.timezone.utc),
 'httpcompression/response_bytes': 9172,
 'httpcompression/response_count': 1,
 'item_scraped_count': 10,
 'log_count/ERROR': 10,
 'log_count/INFO': 12,
 'log_count/WARNING': 10,
 'response_received_count': 1,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'start_time': datetime.datetime(2025, 11, 14, 12, 48, 17, 154413, tzinfo=datetime.timezone.utc)}
2025-11-14 20:48:17 [scrapy.core.engine] INFO: Spider closed (finished)
PS C:\Users\童琪琪\Desktop\bishe.6\biyesheji.6\movie_analysis_project\movie_analysis_project>