17/08/21 19:57:34 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
17/08/21 19:57:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
17/08/21 19:57:34 WARN mapreduce.JobSubmitter: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
17/08/21 19:57:34 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
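The two warnings above point at the driver code rather than at the failure itself: generic Hadoop options are only parsed when the driver implements Tool and is launched through ToolRunner, and the "No job jar file set" warning means setJarByClass()/setJar() was never called. Below is a minimal driver sketch that addresses both. The class name WordCountDriver is a placeholder, and the stock TokenCounterMapper/IntSumReducer classes that ship with Hadoop are used as stand-ins, since the original mapper and reducer are not shown in the post.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.map.TokenCounterMapper;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.reduce.IntSumReducer;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class WordCountDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(getConf(), "word count");
        job.setJarByClass(WordCountDriver.class);      // silences "No job jar file set"
        job.setMapperClass(TokenCounterMapper.class);  // built-in mapper: emits (word, 1) per token
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);      // built-in reducer: sums the counts
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner parses generic -D/-fs/-jt options before calling run(), which
        // removes the "Hadoop command-line option parsing not performed" warning.
        System.exit(ToolRunner.run(new Configuration(), new WordCountDriver(), args));
    }
}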
17/08/21 19:57:34 INFO input.FileInputFormat: Total input paths to process : 1
17/08/21 19:57:34 INFO mapreduce.JobSubmitter: number of splits:1
17/08/21 19:57:34 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local1813302185_0001
17/08/21 19:57:34 WARN conf.Configuration: file:/tmp/hadoop-qw song/mapred/staging/qw song1813302185/.staging/job_local1813302185_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
17/08/21 19:57:34 WARN conf.Configuration: file:/tmp/hadoop-qw song/mapred/staging/qw song1813302185/.staging/job_local1813302185_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
17/08/21 19:57:34 WARN conf.Configuration: file:/tmp/hadoop-qw song/mapred/local/localRunner/qw song/job_local1813302185_0001/job_local1813302185_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval; Ignoring.
17/08/21 19:57:34 WARN conf.Configuration: file:/tmp/hadoop-qw song/mapred/local/localRunner/qw song/job_local1813302185_0001/job_local1813302185_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts; Ignoring.
17/08/21 19:57:34 INFO mapreduce.Job: The url to track the job: http://localhost:8080/
17/08/21 19:57:34 INFO mapreduce.Job: Running job: job_local1813302185_0001
17/08/21 19:57:34 INFO mapred.LocalJobRunner: OutputCommitter set in config null
17/08/21 19:57:34 INFO mapred.LocalJobRunner: OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
17/08/21 19:57:34 INFO mapred.LocalJobRunner: Waiting for map tasks
17/08/21 19:57:34 INFO mapred.LocalJobRunner: Starting task: attempt_local1813302185_0001_m_000000_0
17/08/21 19:57:34 INFO util.ProcfsBasedProcessTree: ProcfsBasedProcessTree currently is supported only on Linux.
17/08/21 19:57:35 INFO mapred.Task: Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@78c45142
17/08/21 19:57:35 INFO mapred.MapTask: Processing split: hdfs://linux:8020/input/wordcount.txt:0+37
17/08/21 19:57:35 INFO mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
17/08/21 19:57:35 INFO mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
17/08/21 19:57:35 INFO mapred.MapTask: mapreduce.task.io.sort.mb: 100
17/08/21 19:57:35 INFO mapred.MapTask: soft limit at 83886080
17/08/21 19:57:35 INFO mapred.MapTask: bufstart = 0; bufvoid = 104857600
17/08/21 19:57:35 INFO mapred.MapTask: kvstart = 26214396; length = 6553600
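(Side note, not part of the original log analysis: the buffer figures in the lines above follow directly from the MapTask defaults, as sketched below.)

long bufvoid   = 100L * 1024 * 1024;   // mapreduce.task.io.sort.mb = 100  -> 104857600 ("bufvoid")
long softLimit = bufvoid * 80 / 100;   // mapreduce.map.sort.spill.percent = 0.80 -> 83886080 ("soft limit at")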
17/08/21 19:57:35 INFO mapred.LocalJobRunner:
17/08/21 19:57:35 INFO mapred.MapTask: Starting flush of map output
17/08/21 19:57:35 INFO mapred.MapTask: Spilling map output
17/08/21 19:57:35 INFO mapred.MapTask: bufstart = 0; bufend = 49; bufvoid = 104857600
17/08/21 19:57:35 INFO mapred.MapTask: kvstart = 26214396(104857584); kvend = 26214388(104857552); length = 9/6553600
17/08/21 19:57:35 INFO mapred.MapTask: Finished spill 0
17/08/21 19:57:35 INFO mapred.Task: Task:attempt_local1813302185_0001_m_000000_0 is done. And is in the process of committing
17/08/21 19:57:35 INFO mapred.LocalJobRunner: map
17/08/21 19:57:35 INFO mapred.Task: Task 'attempt_local1813302185_0001_m_000000_0' done.
17/08/21 19:57:35 INFO mapred.LocalJobRunner: Finishing task: attempt_local1813302185_0001_m_000000_0
17/08/21 19:57:35 INFO mapred.LocalJobRunner: map task executor complete.
17/08/21 19:57:35 INFO mapred.LocalJobRunner: Waiting for reduce tasks
17/08/21 19:57:35 INFO mapred.LocalJobRunner: Starting task: attempt_local1813302185_0001_r_000000_0
17/08/21 19:57:35 INFO util.ProcfsBasedProcessTree: ProcfsBasedProcessTree currently is supported only on Linux.
17/08/21 19:57:35 INFO mapred.Task: Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@4dbff46f
17/08/21 19:57:35 INFO mapred.ReduceTask: Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@4a052c9e
17/08/21 19:57:35 INFO reduce.MergeManagerImpl: MergerManager: memoryLimit=1303589632, maxSingleShuffleLimit=325897408, mergeThreshold=860369216, ioSortFactor=10, memToMemMergeOutputsThreshold=10
17/08/21 19:57:35 INFO reduce.EventFetcher: attempt_local1813302185_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
17/08/21 19:57:35 INFO mapred.LocalJobRunner: reduce task executor complete.
17/08/21 19:57:35 WARN mapred.LocalJobRunner: job_local1813302185_0001
java.lang.Exception: org.apache.hadoop.mapreduce.task.reduce.Shuffle$ShuffleError: error in shuffle in localfetcher#1
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: org.apache.hadoop.mapreduce.task.reduce.Shuffle$ShuffleError: error in shuffle in localfetcher#1
at org.apache.hadoop.mapreduce.task.reduce.Shuffle.run(Shuffle.java:134)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:376)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: D:/tmp/hadoop-qw%20song/mapred/local/localRunner/qw%20song/jobcache/job_local1813302185_0001/attempt_local1813302185_0001_m_000000_0/output/file.out.index
at org.apache.hadoop.fs.RawLocalFileSystem.open(RawLocalFileSystem.java:198)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:764)
at org.apache.hadoop.io.SecureIOUtils.openFSDataInputStream(SecureIOUtils.java:156)
at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:70)
at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:62)
at org.apache.hadoop.mapred.SpillRecord.<init>(SpillRecord.java:57)
at org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.copyMapOutput(LocalFetcher.java:123)
at org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.doCopy(LocalFetcher.java:101)
at org.apache.hadoop.mapreduce.task.reduce.LocalFetcher.run(LocalFetcher.java:84)
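What actually kills the job is this FileNotFoundException: the map output was written under the local directory for user "qw song" (with a literal space, as in the job.xml warnings earlier in the log), but the local shuffle fetcher looks for file.out.index under the URL-encoded path hadoop-qw%20song, which does not exist. This kind of mismatch is typical when the Windows user name contains a space. A commonly suggested workaround, sketched below with example paths and not verified against this setup, is to point Hadoop's local scratch directories at a space-free location (or run Hadoop under a user name without spaces). These lines would go in the driver before the Job is created; "D:/hadoop-tmp" is an arbitrary example directory, not taken from the log.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

Configuration conf = new Configuration();
conf.set("hadoop.tmp.dir", "D:/hadoop-tmp");                           // default is /tmp/hadoop-${user.name}
conf.set("mapreduce.cluster.local.dir", "D:/hadoop-tmp/mapred/local"); // local dir used by LocalJobRunner for map output
Job job = Job.getInstance(conf, "word count");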
17/08/21 19:57:35 INFO mapreduce.Job: Job job_local1813302185_0001 running in uber mode : false
17/08/21 19:57:35 INFO mapreduce.Job: map 100% reduce 0%
17/08/21 19:57:35 INFO mapreduce.Job: Job job_local1813302185_0001 failed with state FAILED due to: NA
17/08/21 19:57:35 INFO mapreduce.Job: Counters: 25
File System Counters
    FILE: Number of bytes read=152
    FILE: Number of bytes written=234363
    FILE: Number of read operations=0
    FILE: Number of large read operations=0
    FILE: Number of write operations=0
    HDFS: Number of bytes read=37
    HDFS: Number of bytes written=0
    HDFS: Number of read operations=6
    HDFS: Number of large read operations=0
    HDFS: Number of write operations=2
Map-Reduce Framework
    Map input records=3
    Map output records=3
    Map output bytes=49
    Map output materialized bytes=61
    Input split bytes=102
    Combine input records=0
    Spilled Records=3
    Failed Shuffles=0
    Merged Map outputs=0
    GC time elapsed (ms)=0
    CPU time spent (ms)=0
    Physical memory (bytes) snapshot=0
    Virtual memory (bytes) snapshot=0
    Total committed heap usage (bytes)=231211008
File Input Format Counters
    Bytes Read=37
The Hadoop word count (word frequency) job fails with the error above; I have not been able to solve it.
