tf 2.x keras: Losses, Optimizers, Metrics, and Learning Rate Scheduling

This article surveys the main components of the Keras module in TensorFlow 2.x, including loss functions, optimizers, and metrics, and compares them with the corresponding functionality in PyTorch. It focuses on loss functions such as CategoricalCrossentropy, optimizers such as Adam, and their use in deep learning.


https://www.tensorflow.org/api_docs/python/tf/keras/

https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/

https://keras.io/api/

 

1. Loss Functions (Losses)

https://www.tensorflow.org/api_docs/python/tf/keras/losses/

https://keras.io/api/losses/

CategoricalCrossentropy: Use this crossentropy loss function when there are two or more label classes. Labels are expected in a one-hot representation.

SparseCategoricalCrossentropy: Use this crossentropy loss function when there are two or more label classes. Labels are expected as integers.

BinaryCrossentropy: Use this crossentropy loss when there are only two label classes (assumed to be 0 and 1).

KLDivergence: Computes the Kullback-Leibler divergence loss between y_true and y_pred.

MeanAbsoluteError: Computes the mean of the absolute difference between labels and predictions.

MeanAbsolutePercentageError: Computes the mean absolute percentage error between y_true and y_pred.

MeanSquaredError: Computes the mean of squares of errors between labels and predictions.

MeanSquaredLogarithmicError: Computes the mean squared logarithmic error between y_true and y_pred.
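The practical difference between the first two losses is only the label format. A minimal NumPy sketch of the underlying formula (an illustration of the math, not the Keras implementation) shows that one-hot and integer labels yield the same crossentropy value:

```python
import numpy as np

def categorical_crossentropy(y_true_onehot, y_pred):
    # mean over samples of -sum(y_true * log(y_pred)); clip for stability
    eps = 1e-7
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return float(np.mean(-np.sum(y_true_onehot * np.log(y_pred), axis=-1)))

def sparse_categorical_crossentropy(y_true_int, y_pred):
    # same loss, with labels given as integer class indices instead of one-hot
    eps = 1e-7
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    picked = y_pred[np.arange(len(y_true_int)), y_true_int]
    return float(np.mean(-np.log(picked)))

probs = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1]])
onehot = np.array([[1, 0, 0],
                   [0, 1, 0]])
labels = np.array([0, 1])

# Both formulations give the same value on matching labels.
print(abs(categorical_crossentropy(onehot, probs)
          - sparse_categorical_crossentropy(labels, probs)) < 1e-12)  # True
```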

 

2. Optimizers

https://www.tensorflow.org/api_docs/python/tf/keras/optimizers

https://keras.io/api/optimizers/

class Adadelta: Optimizer that implements the Adadelta algorithm.
class Adagrad: Optimizer that implements the Adagrad algorithm.
class Adam: Optimizer that implements the Adam algorithm.
class Adamax: Optimizer that implements the Adamax algorithm.
class Ftrl: Optimizer that implements the FTRL algorithm.
class Nadam: Optimizer that implements the NAdam algorithm.
class RMSprop: Optimizer that implements the RMSprop algorithm.
class SGD: Gradient descent (with momentum) optimizer.
class Optimizer: Base class for Keras optimizers.
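As an illustration of what one of these optimizers does per step, here is a NumPy sketch of the Adam update rule (bias-corrected moment estimates; the defaults follow the common beta1=0.9, beta2=0.999 convention, not any particular library's internals):

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-7):
    """One Adam update (a sketch of the algorithm, not the Keras implementation)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)              # bias correction for zero init
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Minimize f(x) = x^2 (gradient 2x) for a few hundred steps.
x, m, v = np.array(3.0), 0.0, 0.0
for t in range(1, 501):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.05)
print(x)  # x has moved toward the minimum at 0
```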

3. Learning Rate Scheduling

Power scheduling
Exponential scheduling
Piecewise constant scheduling
Performance scheduling
1cycle scheduling
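The first three schedules are simple functions of the epoch number. A sketch under assumed hyperparameters (the names lr0, s, c and the boundary values below are illustrative choices, not library defaults):

```python
# Each schedule maps an epoch number to a learning rate, starting from lr0.
lr0 = 0.01

def power_scheduling(epoch, s=20, c=1):
    # lr decays as a power of the epoch: lr0 / (1 + epoch/s)^c
    return lr0 / (1 + epoch / s) ** c

def exponential_scheduling(epoch, s=20):
    # lr is multiplied by 0.1 every s epochs
    return lr0 * 0.1 ** (epoch / s)

def piecewise_constant_scheduling(epoch):
    # constant lr on intervals, dropped at fixed epoch boundaries
    if epoch < 5:
        return 0.01
    elif epoch < 15:
        return 0.005
    return 0.001

print(power_scheduling(0))          # 0.01 (starts at lr0)
print(exponential_scheduling(20))   # 0.001 (one full decade after s epochs)
```

Performance scheduling instead watches the validation error and drops the rate when progress stalls, and 1cycle ramps the rate up and back down over a single run; both depend on training state, so they are not pure functions of the epoch.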

 

4. Metrics

https://www.tensorflow.org/api_docs/python/tf/keras/metrics/

https://keras.io/api/metrics/

Accuracy metrics

Probabilistic metrics

Regression metrics

Classification metrics based on True/False positives & negatives
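The last family is built from the four confusion-matrix counts. A small pure-Python sketch (with hypothetical toy labels) of how precision, recall, and accuracy derive from true/false positives and negatives:

```python
def confusion_counts(y_true, y_pred):
    # binary labels: count true/false positives and negatives
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, fp, tn, fn

def precision(tp, fp):
    return tp / (tp + fp)       # of predicted positives, how many were right

def recall(tp, fn):
    return tp / (tp + fn)       # of actual positives, how many were found

def accuracy(tp, fp, tn, fn):
    return (tp + tn) / (tp + fp + tn + fn)

y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0]
tp, fp, tn, fn = confusion_counts(y_true, y_pred)
print(precision(tp, fp), recall(tp, fn), accuracy(tp, fp, tn, fn))
```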

==================================

PyTorch

1. Losses

 

https://pytorch.org/docs/stable/_modules/torch/nn/modules/loss.html

PyTorch Loss Functions: The Ultimate Guide

https://neptune.ai/blog/pytorch-loss-functions

Ultimate Guide To Loss functions In PyTorch With Python Implementation

https://analyticsindiamag.com/all-pytorch-loss-function/
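One practical difference worth noting when coming from Keras: torch.nn.CrossEntropyLoss takes raw (unnormalized) logits plus integer class labels, combining log-softmax and negative log-likelihood in one step, roughly SparseCategoricalCrossentropy with from_logits=True. A NumPy sketch of that computation (an illustration, not the PyTorch source):

```python
import numpy as np

def cross_entropy_from_logits(logits, targets):
    # numerically stable log-softmax, then negative log-likelihood
    # of the target classes, averaged over the batch (mean reduction)
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return float(-log_probs[np.arange(len(targets)), targets].mean())

logits = np.array([[2.0, 0.5, 0.1],
                   [0.2, 0.1, 3.0]])
targets = np.array([0, 2])
loss = cross_entropy_from_logits(logits, targets)
print(loss)  # small, since the target classes already have the largest logits
```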

 

2. Optimizers

 

torch.optim

https://pytorch.org/docs/stable/optim.html?highlight=optimizer#torch.optim.Optimizer

Ultimate guide to PyTorch Optimizers

https://analyticsindiamag.com/ultimate-guide-to-pytorch-optimizers/
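The usual torch.optim training pattern is optimizer.zero_grad(), loss.backward(), optimizer.step(); for plain SGD without momentum, the step itself amounts to p ← p − lr·grad. A dependency-free sketch of that loop (the quadratic objective is a made-up toy example):

```python
# What a plain-SGD optimizer.step() boils down to, per parameter: p - lr * grad
def sgd_step(params, grads, lr=0.1):
    return [p - lr * g for p, g in zip(params, grads)]

# Toy problem: minimize f(w) = (w - 4)^2, whose gradient is 2 * (w - 4).
w = [0.0]
for _ in range(100):
    grads = [2 * (w[0] - 4)]   # stand-in for loss.backward()
    w = sgd_step(w, grads, lr=0.1)

print(w[0])  # converges toward the minimum at 4
```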

 

3. Metrics

 

==================================

Introduction to Optimizers

https://blog.youkuaiyun.com/weixin_41417982/article/details/81561210

AdamW, LAMB: optimizers commonly used for large pre-trained models

https://blog.youkuaiyun.com/weixin_43269174/article/details/106255084
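AdamW's key change relative to Adam with L2 regularization is that the weight-decay term is applied directly to the parameter instead of being folded into the gradient, so it is not rescaled by the adaptive moment estimates. A NumPy sketch of that decoupled update (an illustration of the idea, not any library's implementation):

```python
import numpy as np

def adamw_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=0.01):
    # moment updates are identical to Adam
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # decoupled weight decay: subtracted from the parameter directly,
    # NOT added to grad (where Adam's adaptive scaling would distort it)
    param = param - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * param)
    return param, m, v

# With a zero gradient, one step just shrinks the weight by lr * weight_decay.
p, m, v = adamw_step(1.0, 0.0, 0.0, 0.0, t=1)
print(p)
```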
