Custom loss functions in Keras
Choosing the loss (objective) function in compile
According to Keras.io, there are two ways to specify the loss in model.compile:
# Option 1: pass the loss by its string name
model.compile(loss='mean_squared_error', optimizer='sgd')

# Option 2: pass the loss function object itself
from keras import losses
model.compile(loss=losses.mean_squared_error, optimizer='sgd')
The available losses are listed there; there are roughly a dozen or so built-in options.
Defining a custom loss
If the built-in losses are not enough, you have to design your own. A quick look at keras/losses.py shows that the pattern is very simple:
from keras import backend as K

def mean_squared_error(y_true, y_pred):
    return K.mean(K.square(y_pred - y_true), axis=-1)
In other words, we just need to define our own loss following the same pattern as this mean_squared_error function.
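For instance, a custom loss mixing an MSE and an MAE term might look like the sketch below. The function name and the 0.5 weight are arbitrary choices for illustration, not anything defined by Keras; the function object is then passed to compile exactly like a built-in loss.

from keras import backend as K

def mse_plus_mae(y_true, y_pred):
    # Same signature as the built-in losses: both arguments are tensors
    # of shape (batch_size, ...), and the result is a per-sample value.
    mse = K.mean(K.square(y_pred - y_true), axis=-1)
    mae = K.mean(K.abs(y_pred - y_true), axis=-1)
    # The 0.5 weight on the MAE term is an arbitrary choice for this sketch.
    return mse + 0.5 * mae

model.compile(loss=mse_plus_mae, optimizer='sgd')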
But when the model has multiple outputs and the loss needs to be assembled from several terms, Keras also provides a solution. A look at the compile method reveals some syntactic sugar:
def compile(self, optimizer,
            loss=None,
            metrics=None,
            loss_weights=None,
            sample_weight_mode=None,
            weighted_metrics=None,
            target_tensors=None,
            **kwargs):
    """Configures the model for training.

    # Arguments
        optimizer: String (name of optimizer) or optimizer instance.
            See [optimizers](/optimizers).
        loss: String (name of objective function) or objective function or
            `Loss` instance. See [losses](/losses).
            If the model has multiple outputs, you can use a different loss
            on each output by passing a dictionary or a list of losses.
            The loss value that will be minimized by the model
            will then be the sum of all individual losses.
        ......
    """
So a dictionary or a list can be used to choose a loss for each output separately, and the problem is solved!
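As a rough sketch of that, assume a small two-output model (all layer and output names below are made up for illustration). Losses are keyed by output name, and loss_weights sets the coefficients of the weighted sum that is actually minimized.

from keras.layers import Input, Dense
from keras.models import Model

inputs = Input(shape=(32,))
shared = Dense(64, activation='relu')(inputs)
out_a = Dense(1, name='out_a')(shared)
out_b = Dense(10, activation='softmax', name='out_b')(shared)
model = Model(inputs=inputs, outputs=[out_a, out_b])

# One loss per output, keyed by the output layer names; the total loss
# minimized during training is the weighted sum given by loss_weights.
model.compile(optimizer='sgd',
              loss={'out_a': 'mean_squared_error',
                    'out_b': 'categorical_crossentropy'},
              loss_weights={'out_a': 1.0, 'out_b': 0.5})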