de 1
hello 3
hua 1
kit 3
liu 1
liudehua 1
min 2
shu 2
tom 3
wang 2
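The counts above are the contents of the job's output file. For context, below is a minimal sketch of a WordCount mapper that could have produced them, together with the `key:<offset>value:<line>` debug lines that appear in the run log that follows. Only the format of the debug println is taken from the log; the class name and the tokenizing logic are assumptions, and the sketch will not reproduce the exact counter values.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper reconstructed from the debug output in the log below;
// the real class name and splitting logic are not shown in the log.
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Produces lines such as "key:0value:hello hello hello tom tom":
        // the key is the byte offset of the line, the value is the line text.
        System.out.println("key:" + key + "value:" + value);

        // Splitting a blank line on " " yields a single empty token, which would
        // explain the empty-key group seen at the start of the reduce debug output.
        for (String token : value.toString().split(" ")) {
            word.set(token);
            context.write(word, ONE);
        }
    }
}
```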
WARN - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
INFO - session.id is deprecated. Instead, use dfs.metrics.session-id
INFO - Initializing JVM Metrics with processName=JobTracker, sessionId=
WARN - Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
WARN - No job jar file set. User classes may not be found. See Job or Job#setJar(String).
INFO - Total input paths to process : 1
INFO - number of splits:1
INFO - Submitting tokens for job: job_local1821357918_0001
INFO - The url to track the job: http://localhost:8080/
INFO - Running job: job_local1821357918_0001
INFO - OutputCommitter set in config null
INFO - OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
INFO - Waiting for map tasks
INFO - Starting task: attempt_local1821357918_0001_m_000000_0
INFO - ProcfsBasedProcessTree currently is supported only on Linux.
INFO - Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@fdd3e
INFO - Processing split: file:/D:/hadoop/input/文件1.txt:0+122
INFO - (EQUATOR) 0 kvi 26214396(104857584)
INFO - mapreduce.task.io.sort.mb: 100
INFO - soft limit at 83886080
INFO - bufstart = 0; bufvoid = 104857600
INFO - kvstart = 26214396; length = 6553600
INFO - Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
key:0value:hello hello hello tom tom
INFO - Job job_local1821357918_0001 running in uber mode : false
INFO - map 0% reduce 0%
INFO - map > map
INFO - map 18% reduce 0%
INFO - map > map
INFO - map > map
key:33value:kit kit kit wang shu min
key:66value:wang shu min tom
key:89value:
key:91value:liu de hua
INFO - map > map
INFO - map 58% reduce 0%
key:107value:liudehua
key:118value:
INFO - map > map
key:120value:
INFO - map 66% reduce 0%
INFO - map > map
INFO - Starting flush of map output
INFO - Spilling map output
INFO - bufstart = 0; bufend = 264; bufvoid = 104857600
INFO - kvstart = 26214396(104857584); kvend = 26214244(104856976); length = 153/6553600
INFO - Finished spill 0
INFO - Task:attempt_local1821357918_0001_m_000000_0 is done. And is in the process of committing
INFO - map
INFO - Task 'attempt_local1821357918_0001_m_000000_0' done.
INFO - Finishing task: attempt_local1821357918_0001_m_000000_0
INFO - map task executor complete.
INFO - Waiting for reduce tasks
INFO - Starting task: attempt_local1821357918_0001_r_000000_0
INFO - ProcfsBasedProcessTree currently is supported only on Linux.
INFO - Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@11c95c0
INFO - Using ShuffleConsumerPlugin: org.apache.hadoop.mapreduce.task.reduce.Shuffle@c821ea
INFO - MergerManager: memoryLimit=181665792, maxSingleShuffleLimit=45416448, mergeThreshold=119899424, ioSortFactor=10, memToMemMergeOutputsThreshold=10
INFO - attempt_local1821357918_0001_r_000000_0 Thread started: EventFetcher for fetching Map Completion Events
INFO - localfetcher#1 about to shuffle output of map attempt_local1821357918_0001_m_000000_0 decomp: 344 len: 348 to MEMORY
INFO - Read 344 bytes from map-output for attempt_local1821357918_0001_m_000000_0
INFO - closeInMemoryFile -> map-output of size: 344, inMemoryMapOutputs.size() -> 1, commitMemory -> 0, usedMemory ->344
INFO - EventFetcher is interrupted.. Returning
INFO - 1 / 1 copied.
INFO - finalMerge called with 1 in-memory map-outputs and 0 on-disk map-outputs
INFO - Merging 1 sorted segments
INFO - Down to the last merge-pass, with 1 segments left of total size: 341 bytes
INFO - Merged 1 segments, 344 bytes to disk to satisfy reduce memory limit
INFO - Merging 1 files, 348 bytes from disk
INFO - Merging 0 segments, 0 bytes from memory into reduce
INFO - Merging 1 sorted segments
INFO - Down to the last merge-pass, with 1 segments left of total size: 341 bytes
INFO - 1 / 1 copied.
INFO - mapred.skip.on is deprecated. Instead, use mapreduce.job.skiprecords
INFO - map 100% reduce 0%
INFO - reduce > reduce
INFO - map 100% reduce 68%
key:-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
INFO - reduce > reduce
INFO - map 100% reduce 74%
INFO - reduce > reduce
INFO - reduce > reduce
INFO - map 100% reduce 81%
INFO - reduce > reduce
INFO - map 100% reduce 82%
INFO - reduce > reduce
key:de-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
key:hello-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
key:hua-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
key:kit-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
key:liu-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
INFO - reduce > reduce
key:liudehua-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
INFO - map 100% reduce 92%
INFO - reduce > reduce
INFO - map 100% reduce 93%
INFO - reduce > reduce
key:min-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
key:shu-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
key:tom-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
key:wang-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3
INFO - Task:attempt_local1821357918_0001_r_000000_0 is done. And is in the process of committing
INFO - reduce > reduce
INFO - Task attempt_local1821357918_0001_r_000000_0 is allowed to commit now
INFO - Saved output of task 'attempt_local1821357918_0001_r_000000_0' to file:/D:/hadoop/output3/_temporary/0/task_local1821357918_0001_r_000000
INFO - reduce > reduce
INFO - Task 'attempt_local1821357918_0001_r_000000_0' done.
INFO - Finishing task: attempt_local1821357918_0001_r_000000_0
INFO - reduce task executor complete.
INFO - map 100% reduce 100%
INFO - Job job_local1821357918_0001 completed successfully
INFO - Counters: 33
  File System Counters
    FILE: Number of bytes read=1276
    FILE: Number of bytes written=514743
    FILE: Number of read operations=0
    FILE: Number of large read operations=0
    FILE: Number of write operations=0
  Map-Reduce Framework
    Map input records=8
    Map output records=39
    Map output bytes=264
    Map output materialized bytes=348
    Input split bytes=98
    Combine input records=0
    Combine output records=0
    Reduce input groups=11
    Reduce shuffle bytes=348
    Reduce input records=39
    Reduce output records=11
    Spilled Records=78
    Shuffled Maps =1
    Failed Shuffles=0
    Merged Map outputs=1
    GC time elapsed (ms)=23
    CPU time spent (ms)=0
    Physical memory (bytes) snapshot=0
    Virtual memory (bytes) snapshot=0
    Total committed heap usage (bytes)=242360320
  Shuffle Errors
    BAD_ID=0
    CONNECTION=0
    IO_ERROR=0
    WRONG_LENGTH=0
    WRONG_MAP=0
    WRONG_REDUCE=0
  File Input Format Counters
    Bytes Read=122
  File Output Format Counters
    Bytes Written=83
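The `key:xxx-----------Iterable<IntWritable>:...` lines in the reduce phase come from a debug println inside the reducer; printing the Iterable shows only its class name and identity hash because ValueIterable has no custom toString(). A minimal sketch of a reducer matching that output and the counts at the top of this section is shown below; again, only the println format is taken from the log, and the class name is hypothetical.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Hypothetical reducer matching the debug lines above; apart from the println,
// this is a conventional WordCount reducer that sums the 1s for each word.
public class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Produces lines such as
        // "key:de-----------Iterable<IntWritable>:org.apache.hadoop.mapreduce.task.ReduceContextImpl$ValueIterable@8a1fc3"
        System.out.println("key:" + key + "-----------Iterable<IntWritable>:" + values);

        int sum = 0;
        for (IntWritable v : values) {
            sum += v.get();
        }
        result.set(sum);
        context.write(key, result);   // e.g. "de 1", "hello 3", ...
    }
}
```

Note that the same `@8a1fc3` identity hash appears for every key: the framework reuses a single ValueIterable (and the underlying Writable objects) across groups, which is why the values must be consumed inside reduce() rather than stored by reference.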