Getting HDFS File Information with the Java API
1). Getting file attributes
Environment: Windows Java API
Functions: mkdirs, FileStatus[], listStatus, isDirectory
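All snippets in this post are written as JUnit 4 test methods in a single class; a minimal sketch of the imports they are assumed to share (the Hadoop client and JUnit 4 must be on the classpath):
// Imports assumed by the test methods below (Hadoop client + JUnit 4)
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.junit.Test;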
@Test
public void test1() throws Exception {
    // Configure the HDFS master node: NameNode
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.142.111:9000");
    // Get an HDFS client
    FileSystem fs = FileSystem.get(conf);
    // Create the directory /tools/folder
    fs.mkdirs(new Path("/tools/folder"));
    // Get the file status of everything under /tools
    FileStatus[] list = fs.listStatus(new Path("/tools"));
    for (FileStatus status : list) {
        // Print whether each entry is a directory or a file
        System.out.println(status.isDirectory() ? "directory" : "file");
    }
}
Execution result:
directory
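Besides isDirectory(), FileStatus exposes other attributes such as the path, length, replication factor and block size; a small sketch extending the loop above (method names are from the standard FileStatus API):
for (FileStatus status : list) {
    // Print path, size in bytes, replication factor and block size for each entry
    System.out.println(status.getPath() + " | " + status.getLen() + " bytes"
            + " | replication=" + status.getReplication()
            + " | blockSize=" + status.getBlockSize());
}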
2). Getting data block information
Environment: Windows Java API
Functions: getFileStatus, getFileBlockLocations, getHosts, getNames
@Test
public void test2() throws Exception {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.142.111:9000");
    // Get the data block information for a specific file
    FileSystem fs = FileSystem.get(conf);
    FileStatus fileStatus = fs.getFileStatus(new Path("/tools/up1.mp4"));
    BlockLocation[] list = fs.getFileBlockLocations(fileStatus, 0, fileStatus.getLen());
    for (BlockLocation bl : list) {
        System.out.println(Arrays.toString(bl.getHosts()));
        System.out.println(Arrays.toString(bl.getNames()));
    }
}
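BlockLocation also reports where each block starts within the file and how long it is; a sketch of the same loop that additionally prints the offset and length:
for (BlockLocation bl : list) {
    // Offset of this block inside the file and its length in bytes
    System.out.println("offset=" + bl.getOffset() + ", length=" + bl.getLength());
    System.out.println(Arrays.toString(bl.getHosts()));   // DataNode hostnames
    System.out.println(Arrays.toString(bl.getNames()));   // host:port pairs
}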
Uploading and Downloading HDFS Files with the Java API
1.0 File upload principle
1.1). Uploading a file to HDFS (Java IO)
Environment: Windows Java API
Functions: InputStream, FileInputStream, OutputStream, create
@Test
public void upload1() throws Exception {
    // 1. Configure HDFS
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.142.111:9000");
    // 2. Get the client
    FileSystem fs = FileSystem.get(conf);
    // 3. Open the input stream (local file)
    InputStream in = new FileInputStream("E:\\java Hadoop\\upfile.mp4");
    // 4. Create the output stream (HDFS file)
    OutputStream out = fs.create(new Path("/tools/up1.mp4"));
    // 5. Create a buffer
    byte[] buffer = new byte[1024];
    int len = 0;
    // 6. Upload the file
    while ((len = in.read(buffer)) > 0) {
        out.write(buffer, 0, len);
    }
    out.flush();
    // 7. Close the streams
    in.close();
    out.close();
}
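The same copy can be wrapped in try-with-resources so that both streams are closed even if an exception is thrown mid-copy; a sketch, assuming fs has been obtained as above:
try (InputStream in = new FileInputStream("E:\\java Hadoop\\upfile.mp4");
     OutputStream out = fs.create(new Path("/tools/up1.mp4"))) {
    byte[] buffer = new byte[1024];
    int len;
    // Copy until the local file is exhausted; the try block closes both streams
    while ((len = in.read(buffer)) > 0) {
        out.write(buffer, 0, len);
    }
    out.flush();
}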
1.2). Uploading a file to HDFS (built-in utility)
Environment: Windows Java API
Functions: InputStream, FileInputStream, OutputStream, create, IOUtils, copyBytes
@Test
public void upload2() throws Exception {
    // 1. Configure the HDFS master node: NameNode
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.142.111:9000");
    // 2. Get the client
    FileSystem fs = FileSystem.get(conf);
    // 3. Open the input stream (local file)
    InputStream input = new FileInputStream("E:\\java Hadoop\\upload1.mp4");
    // 4. Create the output stream (HDFS file)
    OutputStream output = fs.create(new Path("/tools/up2.mp4"));
    // 5. Copy bytes from the local stream to HDFS
    IOUtils.copyBytes(input, output, 1024);
}
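Note that upload2() never closes its streams. IOUtils.copyBytes also has an overload with a close flag that closes both streams once the copy finishes, which can be used instead:
// Copy with a 1024-byte buffer and close both streams when the copy finishes
IOUtils.copyBytes(input, output, 1024, true);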
2.0 HDFS file download principle
2.1). Downloading a file to Windows (Java IO)
Environment: Windows Java API
Functions: InputStream, open, OutputStream, FileOutputStream
@Test
public void download1() throws Exception {
    // 1. Configure the HDFS master node: NameNode
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.142.111:9000");
    // 2. Get the client
    FileSystem fs = FileSystem.get(conf);
    // 3. Open the input stream (HDFS file)
    InputStream in = fs.open(new Path("/tools/up1.mp4"));
    // 4. Create the output stream (local file)
    OutputStream out = new FileOutputStream("E:\\java Hadoop\\down1.mp4");
    // 5. Create a buffer
    byte[] buffer = new byte[1024];
    int len = 0;
    // 6. Write to the output stream
    while ((len = in.read(buffer)) > 0) {
        out.write(buffer, 0, len);
    }
    // 7. Close the streams
    in.close();
    out.close();
}
2.2). Downloading a file to Windows (built-in utility)
Environment: Windows Java API
Functions: InputStream, open, OutputStream, FileOutputStream, IOUtils, copyBytes
@Test
public void download2() throws Exception {
    // 1. Configure the HDFS NameNode
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.142.111:9000");
    // 2. Get the client
    FileSystem fs = FileSystem.get(conf);
    // 3. Open the input stream (HDFS file)
    InputStream input = fs.open(new Path("/tools/up1.mp4"));
    // 4. Create the output stream (local file)
    OutputStream output = new FileOutputStream("E:\\java Hadoop\\down2.mp4");
    // 5. Download the file
    IOUtils.copyBytes(input, output, 1024);
}
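For simple cases, FileSystem also provides higher-level helpers that handle upload and download without opening streams manually; a sketch reusing the configuration above (the up3.mp4 and down3.mp4 names are only placeholders, and downloading to a local Windows path may additionally require the Hadoop native winutils binaries):
// Upload: copy a local file into HDFS
fs.copyFromLocalFile(new Path("E:\\java Hadoop\\upfile.mp4"), new Path("/tools/up3.mp4"));
// Download: copy an HDFS file back to the local file system
fs.copyToLocalFile(new Path("/tools/up1.mp4"), new Path("E:\\java Hadoop\\down3.mp4"));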