I have a 2 GB file I want to read in Java (actually four 2 GB files), and there's a new feature in Java 7 that lets me read all the bytes at once.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class reconstructor {
    public static void main(String[] args) throws IOException {
        Path p = Paths.get("test.txt");
        for (int i = 0; i < 1; i++) {
            byte[] b = Files.readAllBytes(p); // loads the entire file into memory
            Files.write(p, b, StandardOpenOption.APPEND); // appends it back onto the same file
        }
    }
}
This is a dumb program that will read a file with a single byte pre-entered in it, then repeatedly read that file and append what it read back onto the same file. Now obviously the RAM is not big enough to read a 2 GB file all at once, let alone four of them, so I was wondering whether there is any quick way, without using external libraries (unless that is the only way), to read four files byte by byte so that the RAM does not get overloaded (otherwise I end up with a Java heap error).
Solution
Reading byte by byte is the opposite extreme and will be very inefficient. You should simply use a BufferedInputStream and read the bytes chunk by chunk.
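A minimal sketch of that chunked approach, under a few assumptions: the file names, the 8 KB buffer size, and copying to a separate output file are illustrative (appending back onto the very file being read would keep growing the input and never terminate). Only one buffer's worth of data is ever held in memory, so even a 2 GB file won't overload the heap.

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class ChunkedCopy {
    public static void main(String[] args) throws IOException {
        byte[] buffer = new byte[8192]; // chunk size; tune as needed
        try (InputStream in = new BufferedInputStream(
                     Files.newInputStream(Paths.get("test.txt")));
             OutputStream out = new BufferedOutputStream(
                     Files.newOutputStream(Paths.get("out.txt"),
                             StandardOpenOption.CREATE,
                             StandardOpenOption.APPEND))) {
            int n;
            // read() returns the number of bytes placed in the buffer,
            // or -1 at end of file
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n); // write only the bytes actually read
            }
        }
    }
}

The same loop can be run once per file for all four 2 GB files; memory use stays bounded by the buffer size regardless of file size.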