keras Layer


Simple Introduction

Keras implements many layers, including the core layers, convolutional layers, recurrent (RNN) layers, and other commonly used network structures.

Core Layers

Source

class Layer(object):
    '''Abstract base layer class.


    All Keras layers accept certain keyword arguments:


        trainable: boolean. Set to "False" before model compilation
            to freeze layer weights (they won't be updated further
            during training).
        input_shape: a tuple of integers specifying the expected shape
            of the input samples. Does not include the batch size.
            (e.g. `(100,)` for 100-dimensional inputs).
        batch_input_shape: a tuple of integers specifying the expected
            shape of a batch of input samples. Includes the batch size
            (e.g. `(32, 100)` for a batch of 32 100-dimensional inputs).
    '''
    def __init__(self, **kwargs):
        allowed_kwargs = {'input_shape',
                          'trainable',
                          'batch_input_shape',
                          'cache_enabled'}
        for kwarg in kwargs:
            assert kwarg in allowed_kwargs, 'Keyword argument not understood: ' + kwarg
        if 'input_shape' in kwargs:
            self.set_input_shape((None,) + tuple(kwargs['input_shape']))
        if 'batch_input_shape' in kwargs:
            self.set_input_shape(tuple(kwargs['batch_input_shape']))
        if 'trainable' in kwargs:
            self._trainable = kwargs['trainable']
        if not hasattr(self, 'params'):
            self.params = []
        self._cache_enabled = True
        if 'cache_enabled' in kwargs:
            self._cache_enabled = kwargs['cache_enabled']


    @property
    def cache_enabled(self):
        return self._cache_enabled


    @cache_enabled.setter
    def cache_enabled(self, value):
        self._cache_enabled = value


    def __call__(self, X, mask=None, train=False):
        # set temporary input
        tmp_input = self.get_input
        tmp_mask = None
        if hasattr(self, 'get_input_mask'):
            tmp_mask = self.get_input_mask
            self.get_input_mask = lambda _: mask
        self.get_input = lambda _: X
        Y = self.get_output(train=train)
        # return input to what it was
        if hasattr(self, 'get_input_mask'):
            self.get_input_mask = tmp_mask
        self.get_input = tmp_input
        return Y


    def set_previous(self, layer, connection_map={}):
        '''Connect a layer to its parent in the computational graph.
        '''
        assert self.nb_input == layer.nb_output == 1, 'Cannot connect layers: input count and output count should be 1.'
        if hasattr(self, 'input_ndim'):
            assert self.input_ndim == len(layer.output_shape), (
                'Incompatible shapes: layer expected input with ndim=' +
                str(self.input_ndim) +
                ' but previous layer has output_shape ' +
                str(layer.output_shape))
        self.previous = layer
        self.build()
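
The two patterns worth noticing in the source above are the keyword-argument handling in `__init__` (where `input_shape` gets a `None` batch dimension prepended, while `batch_input_shape` is taken as-is) and the temporary swap of `get_input` in `__call__`. The following is a minimal sketch mimicking those patterns, not the real Keras code; `MiniLayer` and its identity `get_output` are illustrative stand-ins:

```python
class MiniLayer(object):
    """Abstract base layer mimicking the excerpt's constructor logic."""

    def __init__(self, **kwargs):
        # Reject any keyword argument the layer does not understand.
        allowed_kwargs = {'input_shape', 'trainable',
                          'batch_input_shape', 'cache_enabled'}
        for kwarg in kwargs:
            assert kwarg in allowed_kwargs, (
                'Keyword argument not understood: ' + kwarg)
        # `input_shape` omits the batch size, so None is prepended;
        # `batch_input_shape` already includes it.
        if 'input_shape' in kwargs:
            self.set_input_shape((None,) + tuple(kwargs['input_shape']))
        if 'batch_input_shape' in kwargs:
            self.set_input_shape(tuple(kwargs['batch_input_shape']))
        self._trainable = kwargs.get('trainable', True)
        self.params = getattr(self, 'params', [])
        self._cache_enabled = kwargs.get('cache_enabled', True)

    def set_input_shape(self, shape):
        self._input_shape = shape

    def get_input(self, train=False):
        # In a connected graph this would pull the previous layer's output.
        raise NotImplementedError

    def get_output(self, train=False):
        # Identity transform, just to make __call__ observable.
        return self.get_input(train)

    def __call__(self, X, mask=None, train=False):
        # Temporarily point get_input at X, compute, then restore --
        # the same swap the excerpt's __call__ performs.
        tmp_input = self.get_input
        self.get_input = lambda _=False: X
        Y = self.get_output(train=train)
        self.get_input = tmp_input
        return Y


layer = MiniLayer(input_shape=(100,))
print(layer._input_shape)   # (None, 100) -- batch dimension prepended
print(layer([1, 2, 3]))     # [1, 2, 3] -- identity get_output
```

The swap in `__call__` is what lets a layer be applied functionally to an arbitrary tensor without permanently rewiring it into a graph: `get_input` is restored as soon as the output is computed.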