TensorFlow functions: tf.reduce_prod, reduce_sum, reduce_mean, and other reductions explained

As of TF 1.7, this function seems to have disappeared from the API docs, yet it still works; Google's API documentation is a bit sloppy here. I dug through an older version of the docs and found its description:

The tf.reduce_prod function
reduce_prod(
    input_tensor,
    axis=None,
    keep_dims=False,
    name=None,
    reduction_indices=None
)

Defined in: tensorflow/python/ops/math_ops.py

See the guide: Math > Reduction

This function computes the product of elements across the given dimensions of a tensor.

input_tensor is reduced along the dimensions given in axis. Unless keep_dims is true, the rank of the tensor is reduced by 1 for each entry in axis; if keep_dims is true, the reduced dimensions are retained with length 1.

If axis has no entries, all dimensions are reduced, and a tensor with a single element is returned.
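To make the axis/keep_dims semantics concrete, here is a minimal sketch, assuming TensorFlow 1.x (where keep_dims is still the parameter name; later releases rename it to keepdims):

import tensorflow as tf

# A 2x3 tensor used to illustrate the reduction semantics above.
x = tf.constant([[1., 2., 3.],
                 [4., 5., 6.]])

prod_all  = tf.reduce_prod(x)                          # reduce all dims -> scalar 720.0
prod_ax0  = tf.reduce_prod(x, axis=0)                  # shape (3,): [4., 10., 18.]
prod_keep = tf.reduce_prod(x, axis=1, keep_dims=True)  # shape (2, 1): [[6.], [120.]]

with tf.Session() as sess:
    print(sess.run([prod_all, prod_ax0, prod_keep]))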

Args:

  • input_tensor: The tensor to reduce. Should have numeric type.
  • axis: The dimensions to reduce. If None (the default), all dimensions are reduced. Must be in the range [-rank(input_tensor), rank(input_tensor)).
  • keep_dims: If true, retains reduced dimensions with length 1.
  • name: A name for the operation (optional).
  • reduction_indices: The old (deprecated) name for axis.

Returns:

  • The reduced tensor.
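The other reductions mentioned in the title, such as tf.reduce_sum and tf.reduce_mean, take the same arguments and follow the same axis/keep_dims rules; a minimal sketch, again assuming TensorFlow 1.x:

import tensorflow as tf

x = tf.constant([[1., 2., 3.],
                 [4., 5., 6.]])

# Same axis/keep_dims semantics as tf.reduce_prod.
s = tf.reduce_sum(x, axis=1)     # shape (2,): [6., 15.]
m = tf.reduce_mean(x, axis=0)    # shape (3,): [2.5, 3.5, 4.5]

with tf.Session() as sess:
    print(sess.run([s, m]))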