Combining ResNet-50 and SENet in Keras

This post walks through fusing SENet with ResNet at the architecture level: how global average pooling, fully connected (or 1x1 convolution) layers, and activation functions implement the squeeze and excitation of channel features, and how this mechanism is integrated into ResNet blocks to improve the model's representational power.


Reference Caffe implementation: https://github.com/hujie-frank/SENet
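Before reading the Keras code, it helps to see the squeeze-excite-scale arithmetic on its own. The following is a minimal NumPy sketch with random stand-in weights (the shapes and the 256→16→256 bottleneck mirror the code below; the weights themselves are illustrative, not trained):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feature map: batch of 2 sequences, 100 steps, 256 channels
x = rng.standard_normal((2, 100, 256))

# Squeeze: global average pooling over the time axis -> (batch, channels)
squeeze = x.mean(axis=1)

# Excitation: bottleneck 256 -> 16 -> 256 (random stand-in weights)
w1 = rng.standard_normal((256, 16)) * 0.1
w2 = rng.standard_normal((16, 256)) * 0.1
h = np.maximum(squeeze @ w1, 0.0)              # ReLU
excitation = 1.0 / (1.0 + np.exp(-(h @ w2)))   # sigmoid, per-channel weight in (0, 1)

# Scale: broadcast the per-channel weights over every time step
scaled = x * excitation[:, None, :]

print(scaled.shape)  # (2, 100, 256)
```

This is exactly what the `Lambda` helpers in the code implement: reshaping the pooled vector so it broadcasts against the `(batch, steps, channels)` feature map, then multiplying element-wise.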

import numpy as np
from keras import backend as K
from keras.layers import (ZeroPadding1D, Conv1D, BatchNormalization,
                          MaxPooling1D, GlobalAveragePooling1D,
                          GlobalMaxPooling1D, Lambda, Concatenate, Dense)

def expand_dim_backend(self, x):
    # Reshape the pooled (batch, channels) tensor to (batch, 1, 256)
    # so it can broadcast against the (batch, steps, 256) feature map.
    return K.reshape(x, (-1, 1, 256))

def multiply(self, a):
    # Element-wise scale: a[0] is the feature map, a[1] the SE weights.
    # Use tensor multiplication (not np.multiply) inside a Lambda layer.
    return a[0] * a[1]

def make_net_Res(self, encoding):
    # Stem: 7-wide conv with stride 2, then max pooling
    x = ZeroPadding1D(padding=3)(encoding)
    x = Conv1D(filters=64, kernel_size=7, strides=2, padding='valid', activation='relu')(x)
    x = BatchNormalization(axis=-1, scale=True)(x)  # channels-last: normalize the channel axis
    x_pool = MaxPooling1D(pool_size=3, strides=2, padding='same')(x)

    # ResNet block 1: 1-3-1 bottleneck
    x = Conv1D(filters=128, kernel_size=1, strides=1, padding='valid', activation='relu')(x_pool)
    x = BatchNormalization(axis=-1, scale=True)(x)

    x = Conv1D(filters=128, kernel_size=3, strides=1, padding='valid', activation='relu')(x)
    x = BatchNormalization(axis=-1, scale=True)(x)

    RES_1 = Conv1D(filters=256, kernel_size=1, strides=1, padding='valid', activation='relu')(x)
    x = BatchNormalization(axis=-1, scale=True)(RES_1)

    # SE block 1: squeeze (global average pooling), then excite
    # through a 256 -> 16 -> 256 bottleneck ending in a sigmoid
    squeeze = GlobalAveragePooling1D()(x)
    squeeze = Lambda(self.expand_dim_backend)(squeeze)

    excitation = Conv1D(filters=16, kernel_size=1, strides=1, padding='valid', activation='relu')(squeeze)
    excitation = Conv1D(filters=256, kernel_size=1, strides=1, padding='valid', activation='sigmoid')(excitation)

    # Project the shortcut branch to 256 channels to match the residual
    x_pool_1 = Conv1D(filters=256, kernel_size=1, strides=1, padding='valid', activation='relu')(x_pool)
    x_pool_1 = BatchNormalization(axis=-1, scale=True)(x_pool_1)

    # Scale the residual features by the excitation weights, then merge
    scale = Lambda(self.multiply)([RES_1, excitation])
    res_1 = Concatenate(axis=1)([x_pool_1, scale])

    # ResNet block 2: same 1-3-1 bottleneck
    x = Conv1D(filters=128, kernel_size=1, activation='relu')(res_1)
    x = BatchNormalization(axis=-1, scale=True)(x)

    x = Conv1D(filters=128, kernel_size=3, activation='relu')(x)
    x = BatchNormalization(axis=-1, scale=True)(x)

    RES_2 = Conv1D(filters=256, kernel_size=1)(x)

    # SE block 2
    squeeze = GlobalAveragePooling1D()(RES_2)
    squeeze = Lambda(self.expand_dim_backend)(squeeze)

    excitation = Conv1D(filters=16, kernel_size=1, strides=1, padding='valid', activation='relu')(squeeze)
    excitation = Conv1D(filters=256, kernel_size=1, strides=1, padding='valid', activation='sigmoid')(excitation)

    scale = Lambda(self.multiply)([RES_2, excitation])
    x = Concatenate(axis=1)([res_1, scale])

    # Head: global max pooling + sigmoid classifier
    x = GlobalMaxPooling1D()(x)
    output = Dense(1, activation='sigmoid')(x)
    return output
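For readers who prefer built-in layers over `Lambda` helpers, the same SE-residual pattern can be assembled with `tf.keras` primitives (`Dense`, `Reshape`, `Multiply`). This is a hedged sketch, not the author's exact network: it shows one conv + SE block and the classifier head, and the input and layer sizes are illustrative:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def se_block(x, reduction=16):
    """Squeeze-and-excitation on a (batch, steps, channels) tensor."""
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling1D()(x)                    # squeeze
    s = layers.Dense(channels // reduction, activation='relu')(s)
    s = layers.Dense(channels, activation='sigmoid')(s)       # excitation
    s = layers.Reshape((1, channels))(s)                      # broadcast over steps
    return layers.Multiply()([x, s])                          # scale

# Illustrative shapes: 100 time steps, 64 input channels
inputs = layers.Input(shape=(100, 64))
x = layers.Conv1D(256, 3, padding='same', activation='relu')(inputs)
x = layers.BatchNormalization()(x)
x = se_block(x)
x = layers.GlobalMaxPooling1D()(x)
outputs = layers.Dense(1, activation='sigmoid')(x)
model = Model(inputs, outputs)
```

Using `Reshape` and `Multiply` avoids hardcoding the channel count (256) inside a backend reshape, so the block works at any width.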