android jobb command, java - JOBB DirectoryFullException: de.waldheinz.fs.fat....

While packaging 192 jpg files with the JOBB tool, the user hit a DirectoryFullException indicating the directory is full, with the exception thrown while adding a particular file.

I am trying to use the JOBB tool on a directory containing 192 jpg files (about 70 MB in total). When I run the command `jobb -d C:/sdk/tools/dir/data -k 123456 -o com.nick.app.obb -pn com.nick.app -pv 1`, it produces the following log:

Slop: 0 Directory Overhead: 0

Slop: 189853 Directory Overhead: 24704

Partial Sector [32] writing to sector: 277

Partial Sector [32] writing to sector: 277

Partial Sector [32] writing to sector: 277

Partial Sector [299] writing to sector: 897

Partial Sector [416] writing to sector: 1733

Partial Sector [148] writing to sector: 2385

Partial Sector [95] writing to sector: 3013

Partial Sector [498] writing to sector: 3573

Partial Sector [146] writing to sector: 4061

Partial Sector [427] writing to sector: 4581

Partial Sector [204] writing to sector: 5213

Partial Sector [115] writing to sector: 5769

Partial Sector [69] writing to sector: 6481

Partial Sector [79] writing to sector: 7077

Partial Sector [346] writing to sector: 7661

Partial Sector [93] writing to sector: 8213

Partial Sector [120] writing to sector: 8857

Partial Sector [423] writing to sector: 9461

Partial Sector [4] writing to sector: 10149

Partial Sector [184] writing to sector: 11065

Partial Sector [479] writing to sector: 11921

Partial Sector [83] writing to sector: 12569

Partial Sector [358] writing to sector: 13241

Partial Sector [378] writing to sector: 14009

Partial Sector [366] writing to sector: 14669

Partial Sector [393] writing to sector: 15677

Partial Sector [323] writing to sector: 16385

Partial Sector [236] writing to sector: 16989

Partial Sector [233] writing to sector: 17645

Partial Sector [503] writing to sector: 18345

Partial Sector [348] writing to sector: 19017

Partial Sector [473] writing to sector: 19721

Partial Sector [192] writing to sector: 20345

Partial Sector [398] writing to sector: 20805

Partial Sector [67] writing to sector: 21617

Partial Sector [3] writing to sector: 22437

Partial Sector [315] writing to sector: 23489

Partial Sector [161] writing to sector: 24045

Partial Sector [421] writing to sector: 24569

Partial Sector [465] writing to sector: 25557

Partial Sector [164] writing to sector: 26485

Partial Sector [458] writing to sector: 27177

Partial Sector [412] writing to sector: 28153

Partial Sector [1] writing to sector: 28633

Partial Sector [119] writing to sector: 29441

Partial Sector [367] writing to sector: 30413

Partial Sector [274] writing to sector: 31397

Partial Sector [325] writing to sector: 32369

Partial Sector [355] writing to sector: 33361

Partial Sector [187] writing to sector: 34025

Partial Sector [136] writing to sector: 34829

Partial Sector [157] writing to sector: 35873

Partial Sector [175] writing to sector: 36733

Partial Sector [106] writing to sector: 37673

Partial Sector [79] writing to sector: 38593

Partial Sector [379] writing to sector: 39545

Partial Sector [296] writing to sector: 40517

Partial Sector [440] writing to sector: 41205

Partial Sector [277] writing to sector: 41985

Partial Sector [153] writing to sector: 42609

Partial Sector [484] writing to sector: 43385

Partial Sector [363] writing to sector: 44329

Partial Sector [510] writing to sector: 45097

Partial Sector [296] writing to sector: 46101

Partial Sector [314] writing to sector: 47081

Partial Sector [244] writing to sector: 48073

Partial Sector [187] writing to sector: 48825

Partial Sector [253] writing to sector: 49825

Partial Sector [374] writing to sector: 50833

Partial Sector [508] writing to sector: 51777

Partial Sector [26] writing to sector: 52517

Partial Sector [192] writing to sector: 53385

Partial Sector [137] writing to sector: 54209

Partial Sector [312] writing to sector: 55029

Partial Sector [145] writing to sector: 55829

Partial Sector [394] writing to sector: 56517

Partial Sector [150] writing to sector: 57317

Partial Sector [81] writing to sector: 58197

Partial Sector [198] writing to sector: 59101

Partial Sector [358] writing to sector: 59929

Partial Sector [397] writing to sector: 60729

Partial Sector [142] writing to sector: 61209

Partial Sector [148] writing to sector: 62193

Partial Sector [365] writing to sector: 62833

Partial Sector [93] writing to sector: 63293

Partial Sector [450] writing to sector: 63701

Partial Sector [10] writing to sector: 64485

Partial Sector [354] writing to sector: 64969

Partial Sector [174] writing to sector: 65441

Partial Sector [435] writing to sector: 65961

Partial Sector [64] writing to sector: 66553

Partial Sector [41] writing to sector: 67053

Partial Sector [463] writing to sector: 67489

Partial Sector [177] writing to sector: 68041

Partial Sector [403] writing to sector: 68509

Partial Sector [479] writing to sector: 69305

Partial Sector [248] writing to sector: 69761

Partial Sector [331] writing to sector: 70245

Partial Sector [361] writing to sector: 70745

Partial Sector [56] writing to sector: 71281

Partial Sector [301] writing to sector: 71805

Partial Sector [253] writing to sector: 72629

Partial Sector [461] writing to sector: 73041

Partial Sector [304] writing to sector: 73561

Partial Sector [218] writing to sector: 74105

Partial Sector [147] writing to sector: 74765

Partial Sector [303] writing to sector: 75365

Partial Sector [410] writing to sector: 76321

Partial Sector [280] writing to sector: 77137

Partial Sector [484] writing to sector: 77697

Partial Sector [344] writing to sector: 78465

Partial Sector [189] writing to sector: 79001

Partial Sector [126] writing to sector: 79413

Partial Sector [262] writing to sector: 79929

Partial Sector [187] writing to sector: 80317

Partial Sector [465] writing to sector: 80869

Partial Sector [350] writing to sector: 81313

Partial Sector [236] writing to sector: 81793

Partial Sector [98] writing to sector: 82333

Partial Sector [223] writing to sector: 83209

Partial Sector [1] writing to sector: 83593

Partial Sector [51] writing to sector: 84577

Partial Sector [340] writing to sector: 84993

Partial Sector [377] writing to sector: 85961

Partial Sector [232] writing to sector: 86829

Partial Sector [229] writing to sector: 87253

Partial Sector [337] writing to sector: 88225

Partial Sector [205] writing to sector: 89285

Partial Sector [353] writing to sector: 90089

Partial Sector [289] writing to sector: 90921

Partial Sector [369] writing to sector: 91969

Partial Sector [283] writing to sector: 92741

Partial Sector [267] writing to sector: 93553

Partial Sector [313] writing to sector: 94049

Partial Sector [363] writing to sector: 94521

Partial Sector [415] writing to sector: 95245

Partial Sector [181] writing to sector: 96269

Partial Sector [420] writing to sector: 96733

Partial Sector [251] writing to sector: 97733

Partial Sector [244] writing to sector: 98221

Partial Sector [406] writing to sector: 98669

Partial Sector [226] writing to sector: 99069

Partial Sector [363] writing to sector: 100045

Partial Sector [133] writing to sector: 100769

Partial Sector [461] writing to sector: 101209

Partial Sector [329] writing to sector: 102337

Partial Sector [495] writing to sector: 103201

Partial Sector [452] writing to sector: 104045

Partial Sector [97] writing to sector: 105153

Partial Sector [236] writing to sector: 106177

Partial Sector [10] writing to sector: 106645

Partial Sector [292] writing to sector: 107725

Partial Sector [36] writing to sector: 108725

Partial Sector [500] writing to sector: 109793

Partial Sector [441] writing to sector: 110681

Partial Sector [128] writing to sector: 111329

Partial Sector [293] writing to sector: 112277

Partial Sector [382] writing to sector: 112737

Partial Sector [493] writing to sector: 113229

Partial Sector [256] writing to sector: 113653

Partial Sector [20] writing to sector: 114893

Partial Sector [351] writing to sector: 115905

Partial Sector [110] writing to sector: 116913

Partial Sector [322] writing to sector: 118041

Partial Sector [156] writing to sector: 118589

Partial Sector [232] writing to sector: 119013

Partial Sector [331] writing to sector: 119533

Partial Sector [297] writing to sector: 120509

Partial Sector [310] writing to sector: 121549

Partial Sector [392] writing to sector: 122545

Partial Sector [285] writing to sector: 123485

Partial Sector [108] writing to sector: 124065

de.waldheinz.fs.fat.DirectoryFullException: directory is full

at de.waldheinz.fs.fat.Fat16RootDirectory.changeSize(Fat16RootDirectory.java:109)

at de.waldheinz.fs.fat.AbstractDirectory.addEntries(AbstractDirectory.java:282)

at de.waldheinz.fs.fat.FatLfnDirectory.addFile(FatLfnDirectory.java:139)

at com.android.jobb.Main$1.processFile(Main.java:475)

at com.android.jobb.Main.processAllFiles(Main.java:604)

at com.android.jobb.Main.processAllFiles(Main.java:600)

at com.android.jobb.Main.main(Main.java:417)

Exception in thread "main" java.lang.RuntimeException: Error adding file with name: img178.jpg

at com.android.jobb.Main$1.processFile(Main.java:478)

at com.android.jobb.Main.processAllFiles(Main.java:604)

at com.android.jobb.Main.processAllFiles(Main.java:600)

at com.android.jobb.Main.main(Main.java:417)

Any suggestions?
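For context, the stack trace above ends in de.waldheinz.fs.fat.Fat16RootDirectory, so what fills up is the root directory of the FAT16 image that jobb builds: a FAT16 root directory holds only a fixed number of entries, and long file names consume several entries per file. A rough sketch of one possible workaround, assuming that limit is indeed the trigger, is to nest the images in a subdirectory under the folder passed to `-d` so they are not all created in the image root (the app would then read them from that subfolder). The snippet below mirrors the paths from the command above and assumes a Unix-style shell such as Git Bash; the `images` folder name is just an example.

```bash
# Sketch only: keep the 192 jpgs out of the fixed-size FAT16 root directory
# by moving them one level down before repackaging.
mkdir -p C:/sdk/tools/dir/data/images
mv C:/sdk/tools/dir/data/*.jpg C:/sdk/tools/dir/data/images/
jobb -d C:/sdk/tools/dir/data -k 123456 -o com.nick.app.obb -pn com.nick.app -pv 1
```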

### How to write a word-count program with MapReduce

The following explains how to write, compile, and package MapReduce code for word counting, covering how to override the `map` and `reduce` methods and how to run the job on Ubuntu or against HDFS.

#### 1. MapReduce program design

The word-count task needs two main classes: a Mapper class and a Reducer class, which handle the map phase and the reduce phase of the input data respectively.

- **Mapper class**: splits the input text into words and emits a key-value pair `(word, 1)` for each word as intermediate output[^1].

```java
package com.example;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

import java.io.IOException;
import java.util.StringTokenizer;

public class WordCountMapper extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    @Override
    protected void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
        // Split the line into lowercase tokens and emit (word, 1) for each one.
        StringTokenizer tokenizer = new StringTokenizer(value.toString().toLowerCase());
        while (tokenizer.hasMoreTokens()) {
            word.set(tokenizer.nextToken());
            context.write(word, one);
        }
    }
}
```

- **Reducer class**: receives the intermediate results from the Mapper, groups them by word, sums the counts, and outputs each word with its frequency[^1].

```java
package com.example;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

import java.io.IOException;

public class WordCountReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    @Override
    protected void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
        // Sum all counts emitted for this word and write the total.
        int sum = 0;
        for (IntWritable val : values) {
            sum += val.get();
        }
        context.write(key, new IntWritable(sum));
    }
}
```

#### 2. Maven configuration and dependency management

To build an executable JAR, the project's POM file must pull in the required Apache Hadoop libraries. A basic Maven `pom.xml` example:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>WordCountMR</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
        <!-- Apache Hadoop Core -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>3.3.1</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.4</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
```

#### 3. Driver program

Create a driver program that specifies the Mapper and Reducer classes along with other parameters such as the input and output paths. This is also where the Job object is initialized and submitted.

```java
package com.example;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "Word Count");
        job.setJarByClass(WordCountDriver.class);
        job.setMapperClass(WordCountMapper.class);
        job.setCombinerClass(WordCountReducer.class); // Optional combiner to optimize performance.
        job.setReducerClass(WordCountReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

#### 4. Compiling and packaging

From the project root directory, run the following command to compile the sources and build the package:

```bash
mvn clean package
```

This produces a fat JAR with all required dependencies in the `target` folder.

#### 5. Running the job

Assuming a Hadoop cluster or a local pseudo-distributed setup is available, follow the steps below in a Linux terminal to upload the test document to HDFS and then run the MapReduce job.

##### a. Prepare the input file

Place the plain-text file to be analyzed somewhere on local disk, e.g. `/home/user/input.txt`.

##### b. Copy it to HDFS

After connecting to the server over SSH, copy the prepared file into the distributed file system on the remote node.

```bash
hdfs dfs -mkdir /input
hdfs dfs -put /home/user/input.txt /input/
```

##### c. Submit the job

The last step is to launch the actual computation:

```bash
hadoop jar target/WordCountMR-1.0-SNAPSHOT.jar com.example.WordCountDriver /input /output
```

Wait for the job to finish; if everything went well, the aggregated word counts will be found under the specified output path.
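Once the job completes, the results can be read back directly from HDFS. A minimal check, assuming the `/output` path used above (the reducer output is conventionally written to files named `part-r-*`):

```bash
# List the job output and print the first few aggregated word counts.
hdfs dfs -ls /output
hdfs dfs -cat /output/part-r-00000 | head -n 20
```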