hadoop commands: job-related

This post explains how to check for and stop a MapReduce job that keeps running in the background after the program has ended abnormally, using command-line tools. It collects several practical hadoop commands to help you inspect and operate on MapReduce jobs.


Problem: when a running MapReduce program is stopped with Ctrl+C (an abnormal exit), the command-line client terminates but the job keeps running in the background. You can check whether it is still running by opening http://master.hadoop:19888/jobhistory/app in a browser, where master.hadoop is your host alias (or use your server's IP). The web page, however, only lets you look, not act; to actually operate on your MapReduce job you need the command-line tools.
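Before reaching for the job subcommands, you can also confirm from the shell that something is still running. A minimal sketch, assuming the job was submitted through YARN and your client machine is configured against the cluster (the application ID below is a made-up example):

# List applications the ResourceManager knows about; a leftover job shows up as RUNNING
yarn application -list

# Optional: kill it at the YARN level (substitute the real ID reported by -list)
yarn application -kill application_1617171717171_0001

The rest of this post sticks to the hadoop job subcommands, which take the MapReduce job ID rather than the YARN application ID.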

The commands below are taken from: https://blog.youkuaiyun.com/lxpbs8851/article/details/12969105

Job-related hadoop command-line tools:

1. List job information:
hadoop job -list
2. Kill a job:
hadoop job -kill job_id
3. View the aggregated history logs under a given path:
hadoop job -history output-dir
4. Show more details about the job:
hadoop job -history all output-dir
5. Print the map and reduce completion percentages and all job counters:
hadoop job -status job_id
6. Kill a task. Killed tasks do not count against failed attempts:
hadoop job -kill-task <task-id>
7. Fail a task. Failed tasks do count against failed attempts:
hadoop job -fail-task <task-id>
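Putting the pieces together, a minimal end-to-end sketch for cleaning up a leftover job (the job ID and output path below are made-up examples; use the values reported by -list and your own job's output directory):

# Find the ID of the job that is still running
hadoop job -list

# Check its progress and counters, then kill it
hadoop job -status job_1617171717171_0001
hadoop job -kill job_1617171717171_0001

# Afterwards, review the history summary for the job's output directory
hadoop job -history /user/hadoop/wordcount/output

Note that on Hadoop 2.x and 3.x the hadoop job form still works but prints a deprecation warning; mapred job with the same subcommands is the current spelling.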
"C:\Program Files\Java\jdk-17\bin\java.exe" -Didea.launcher.port=54222 "-Didea.launcher.bin.path=D:\hadoop\IntelliJ IDEA Community Edition 2018.3.6\bin" -Dfile.encoding=UTF-8 -classpath "D:\hadoop\Hadoop\target\classes;D:\hadoop\hadoop-3.1.4\share\hadoop\client\hadoop-client-api-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\client\hadoop-client-runtime-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\client\hadoop-client-minicluster-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\common\hadoop-kms-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\common\hadoop-nfs-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\common\hadoop-common-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\common\hadoop-common-3.1.4-tests.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-nfs-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-rbf-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-3.1.4-tests.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-client-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-httpfs-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-rbf-3.1.4-tests.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-client-3.1.4-tests.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-native-client-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\hdfs\hadoop-hdfs-native-client-3.1.4-tests.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\mapreduce\hadoop-mapreduce-examples-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\mapreduce\hadoop-mapreduce-client-hs-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\mapreduce\hadoop-mapreduce-client-app-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\mapreduce\hadoop-mapreduce-client-core-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\mapreduce\hadoop-mapreduce-client-common-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\mapreduce\hadoop-mapreduce-client-shuffle-3.1.4.jar;D:\hadoop\hadoop-3.1.4\share\hadoop\mapreduce\hadoop-mapreduce-c