1. Display the datasets, attributes, and other information in an HDF file
h5disp('filename.hdf');
For example:
h5disp('dataset/sample_A_padded_20160501.hdf');

% Output:
HDF5 sample_A_padded_20160501.hdf
Group '/'
    Attributes:
        'file_format':  '0.2'
    Group '/annotations'
        Attributes:
            'offset':  1520.000000 3644.000000 3644.000000
        Dataset 'ids'
            Size:  432
            MaxSize:  432
            Datatype:   H5T_STD_U64LE (uint64)
            ChunkSize:  []
            Filters:  none
            FillValue:  0
        Dataset 'locations'
            Size:  3x432
            MaxSize:  3x432
            Datatype:   H5T_IEEE_F32LE (single)
            ChunkSize:  []
            Filters:  none
            FillValue:  0.000000
        Dataset 'types'
            Size:  432
            MaxSize:  432
            Datatype:   H5T_STRING
                String Length: variable
                Padding: H5T_STR_NULLTERM
                Character Set: H5T_CSET_UTF8
                Character Type: H5T_C_S1
            ChunkSize:  432
            Filters:  deflate(4)
        Group '/annotations/comments'
            Dataset 'comments'
                Size:  17
                MaxSize:  17
                Datatype:   H5T_STRING
                    String Length: variable
                    Padding: H5T_STR_NULLTERM
                    Character Set: H5T_CSET_UTF8
                    Character Type: H5T_C_S1
                ChunkSize:  []
                Filters:  none
            Dataset 'target_ids'
                Size:  17
                MaxSize:  17
                Datatype:   H5T_STD_U64LE (uint64)
                ChunkSize:  []
                Filters:  none
                FillValue:  0
        Group '/annotations/presynaptic_site'
            Dataset 'partners'
                Size:  2x216
                MaxSize:  2x216
                Datatype:   H5T_STD_U64LE (uint64)
                ChunkSize:  []
                Filters:  none
                FillValue:  0
    Group '/volumes'
        Dataset 'raw'
            Size:  3072x3072x200
            MaxSize:  3072x3072x200
            Datatype:   H5T_STD_U8LE (uint8)
            ChunkSize:  192x96x7
            Filters:  deflate(4)
            Attributes:
                'resolution':  40.000000 4.000000 4.000000
        Group '/volumes/labels'
            Dataset 'clefts'
                Size:  1250x1250x125
                MaxSize:  1250x1250x125
                Datatype:   H5T_STD_U64LE (uint64)
                ChunkSize:  79x79x4
                Filters:  deflate(4)
                Attributes:
                    'resolution':  40.000000 4.000000 4.000000
                    'offset':  1520.000000 3644.000000 3644.000000
            Dataset 'neuron_ids'
                Size:  1250x1250x125
                MaxSize:  1250x1250x125
                Datatype:   H5T_STD_U64LE (uint64)
                ChunkSize:  79x79x4
                Filters:  deflate(4)
                Attributes:
                    'resolution':  40.000000 4.000000 4.000000
                    'offset':  1520.000000 3644.000000 3644.000000
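If you want this structure as data rather than printed text, h5info returns the same information as a struct. A minimal sketch against the same file (the fields accessed below are standard h5info output fields):

% Query the file structure as a struct instead of printing it
info = h5info('dataset/sample_A_padded_20160501.hdf');
disp({info.Groups.Name});          % top-level groups, e.g. /annotations and /volumes

% Ask about a single dataset, e.g. /volumes/raw
rawInfo = h5info('dataset/sample_A_padded_20160501.hdf', '/volumes/raw');
disp(rawInfo.Dataspace.Size);      % 3072 3072 200
disp(rawInfo.Datatype.Class);      % H5T_INTEGER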
2. Read data from an HDF file
data = h5read('filename.hdf', '/path/to/dataset');
For example:
raw = h5read('dataset/sample_A_padded_20160501.hdf', '/volumes/raw');
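Because /volumes/raw is 3072x3072x200 of uint8 (close to 2 GB uncompressed), it is often better to read only part of it. h5read accepts optional start, count, and stride arguments; a sketch, with block sizes chosen purely for illustration:

% Read a 512x512x10 block starting at element (1,1,1) instead of the whole volume
block = h5read('dataset/sample_A_padded_20160501.hdf', '/volumes/raw', ...
               [1 1 1], [512 512 10]);

% Read every 4th voxel in the first two dimensions (stride), all 200 slices
coarse = h5read('dataset/sample_A_padded_20160501.hdf', '/volumes/raw', ...
                [1 1 1], [768 768 200], [4 4 1]);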
3. Read an attribute of a dataset
attribute = h5readatt('filename.hdf', '/path/to/dataset', 'attributename');
For example:
resolution = h5readatt('dataset/sample_A_padded_20160501.hdf', '/volumes/raw', 'resolution');
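The location argument does not have to be a dataset: h5readatt also reads attributes attached to groups, including the root group. From the listing above, 'file_format' lives on '/' and 'offset' on the /volumes/labels/clefts dataset:

% Attribute attached to the root group
fmt = h5readatt('dataset/sample_A_padded_20160501.hdf', '/', 'file_format');

% Attribute attached to a nested dataset
offset = h5readatt('dataset/sample_A_padded_20160501.hdf', ...
                   '/volumes/labels/clefts', 'offset');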
4. Create an HDF file and dataset
h5create('filename.hdf', '/path/to/dataset', dataSize, 'Datatype', 'datatype');
For example:
h5create('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', size_downs_raw, 'Datatype', 'uint8');
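h5create also takes optional name-value pairs such as 'ChunkSize' and 'Deflate'; without them the new dataset is contiguous and uncompressed, unlike the chunked deflate(4) volumes in the source file. A sketch, where the chunk size and the value of size_downs_raw are illustrative assumptions:

size_downs_raw = [1536 1536 200];    % hypothetical downsampled size
h5create('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', ...
         size_downs_raw, 'Datatype', 'uint8', ...
         'ChunkSize', [96 96 10], 'Deflate', 4);   % chunked + gzip level 4, like the original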
5. Write data into the newly created HDF file
h5write('filename.hdf', '/path/to/dataset', dataVariable);
For example:
h5write('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', downs_raw);
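h5write can also fill just part of an existing dataset by passing start and count, which helps when the data is produced in pieces. A sketch that writes only the first 100 slices (the variable names follow the example above; the slice count is arbitrary):

% Write the first 100 slices into the dataset created in step 4;
% the remaining slices would be written by a later call starting at [1 1 101]
h5write('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', ...
        downs_raw(:, :, 1:100), [1 1 1], ...
        [size_downs_raw(1) size_downs_raw(2) 100]);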
6. Write an attribute to a dataset
h5writeatt('filename.hdf', '/path/to/dataset', 'attributename', attributeValue);
For example:
h5writeatt('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', 'resolution', resolution);
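Putting steps 2 through 6 together, here is a minimal end-to-end sketch of the downsampling workflow that the example filenames suggest. The 2x factor, the keep-every-second-voxel method, and the assumption that entries 2 and 3 of 'resolution' belong to the downsampled axes are illustrative, not taken from the original:

% Read the source volume and its voxel resolution
raw        = h5read('dataset/sample_A_padded_20160501.hdf', '/volumes/raw');
resolution = h5readatt('dataset/sample_A_padded_20160501.hdf', '/volumes/raw', 'resolution');

% Downsample 2x in the first two dimensions by keeping every second voxel
downs_raw      = raw(1:2:end, 1:2:end, :);
size_downs_raw = size(downs_raw);
resolution(2:3) = resolution(2:3) * 2;   % voxel size doubles along the downsampled axes

% Write the result to a new file (the downs_dataset folder must already exist)
h5create('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', ...
         size_downs_raw, 'Datatype', 'uint8');
h5write('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', downs_raw);
h5writeatt('downs_dataset/downsample_A_padded_20160501.hdf', '/volumes/raw', ...
           'resolution', resolution);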
This post covers six key operations on HDF5 files in MATLAB: displaying file information, reading data, reading attributes, creating files, writing data, and writing attributes. It walks through the whole workflow from basic inspection to writing new data, and should be useful to beginners and experienced users alike.