org.apache.hadoop.conf-Configured

A closer look at the Apache Hadoop configuration classes Configured and Configuration: how configuration is set and retrieved. Intended for developers working in big-data processing.



This is the last class in org.apache.hadoop.conf, and also the one from this package that gets used most often later on. Configurable is the body, you might say, and Configuration the soul.


package org.apache.hadoop.conf;

/** Base class for things that may be configured with a {@link Configuration}. */
public class Configured implements Configurable {

  private Configuration conf;

  /** Construct a Configured. */
  public Configured() {
    this(null);
  }

  /** Construct a Configured. */
  public Configured(Configuration conf) {
    setConf(conf);
  }

  // inherit javadoc
  public void setConf(Configuration conf) {
    this.conf = conf;
  }

  // inherit javadoc
  public Configuration getConf() {
    return conf;
  }

}

The code itself needs no explanation.

Only the `// inherit javadoc` comment gave me pause; I later learned it means the method inherits its Javadoc from the interface it implements:

/** Set the configuration to be used by this object. */
void setConf(Configuration conf);

/** Return the configuration used by this object. */
Configuration getConf();

 

Reposted from: https://www.cnblogs.com/admln/p/Configured.html

package com.dajiangtai.hadoop.tv;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

import java.io.IOException;

public class ParseAndFilterLog extends Configured implements Tool {

    public static class ExtractTVMsgLogMapper extends Mapper<LongWritable, Text, Text, Text> {
        private final Text outputKey = new Text();
        private final Text outputValue = new Text();

        @Override
        public void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] parsedData = parseAndFilterData(value.toString());
            if (parsedData != null && parsedData.length == 7) {
                String keyPart = parsedData[0] + "@" + parsedData[1];
                String valuePart = String.join("@", parsedData[2], parsedData[3],
                        parsedData[4], parsedData[5], parsedData[6]);
                outputKey.set(keyPart);
                outputValue.set(valuePart);
                context.write(outputKey, outputValue);
            }
        }

        private String[] parseAndFilterData(String data) {
            // The actual parsing logic should be implemented here
            return data.split(","); // example: split on commas
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf();
        String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
        if (otherArgs.length < 2) {
            System.err.println("Usage: ParseAndFilterLog <input> <output>");
            return 1;
        }
        Job job = Job.getInstance(conf, "TV Log Parser");
        job.setJarByClass(ParseAndFilterLog.class);
        job.setMapperClass(ExtractTVMsgLogMapper.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setNumReduceTasks(0); // map-only job, no reducer
        FileInputFormat.addInputPath(job, new Path(otherArgs[0]));
        FileOutputFormat.setOutputPath(job, new Path(otherArgs[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        int exitCode = ToolRunner.run(new ParseAndFilterLog(), args);
        System.exit(exitCode);
    }
}

Can these references be removed: import org.apache.hadoop.fs.Path; import org.apache.hadoop.util.GenericOptionsParser; import org.apache.hadoop.util.Tool; import org.apache.hadoop.util.ToolRunner;? They always cause errors. Please fix this for me.