
Machine Learning
A summary of JonyChan's technical learning process
STTN: Spatial-Temporal Transformer model code
1. GitHub code (preview):
# -*- coding: utf-8 -*-
"""
Created on Mon Sep 28 10:28:06 2020
@author: wb
"""
import torch
import torch.nn as nn
from GCN_models import GCN
from One_hot_encoder import One_hot_encoder

class SSelfAttention(nn.Module):
    def __init__(self, …
Original post · 2021-08-14 20:47:41 · 1192 reads · 1 comment
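The preview cuts off inside SSelfAttention, so here is a minimal, hedged sketch of self-attention applied over the spatial (node) dimension. The class name, single-head design, and tensor shapes are illustrative assumptions, not the STTN repository's actual code.

```python
import torch
import torch.nn as nn

class SpatialSelfAttention(nn.Module):
    """Illustrative single-head self-attention over the spatial (node) dimension."""
    def __init__(self, embed_size):
        super().__init__()
        self.query = nn.Linear(embed_size, embed_size)
        self.key = nn.Linear(embed_size, embed_size)
        self.value = nn.Linear(embed_size, embed_size)
        self.scale = embed_size ** 0.5

    def forward(self, x):
        # x: [batch, num_nodes, time_steps, embed_size]
        q = self.query(x).permute(0, 2, 1, 3)           # [batch, time, nodes, embed]
        k = self.key(x).permute(0, 2, 1, 3)
        v = self.value(x).permute(0, 2, 1, 3)
        scores = q @ k.transpose(-2, -1) / self.scale   # [batch, time, nodes, nodes]
        attn = torch.softmax(scores, dim=-1)            # each node attends to every node
        out = attn @ v                                   # [batch, time, nodes, embed]
        return out.permute(0, 2, 1, 3)                   # back to [batch, nodes, time, embed]

x = torch.randn(2, 207, 12, 64)            # e.g. 207 sensors, 12 time steps, 64-dim embedding
print(SpatialSelfAttention(64)(x).shape)   # torch.Size([2, 207, 12, 64])
```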
Utility functions (util) commonly used in graph neural networks
1. Normalization of sparse matrices: symmetric normalization of the adjacency matrix.
# [A * D^(-1/2)]^T * D^(-1/2) = D^(-1/2) * A * D^(-1/2)
def sym_adj(adj):
    """Symmetrically normalize adjacency matrix."""
    # compressed (sparse COO) adjacency matrix
    adj = sp.coo_matrix(adj)
    rowsum = np.array(adj.sum(1))
    # convert the n*1 matrix into a single vector …
Original post · 2021-07-13 14:15:12 · 1555 reads · 1 comment
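The preview truncates right after the row-sum step. A plausible completion, following the Graph WaveNet-style utilities this post appears to describe; the exact return form (float32 dense matrix) is an assumption:

```python
import numpy as np
import scipy.sparse as sp

def sym_adj(adj):
    """Symmetrically normalize an adjacency matrix: D^(-1/2) * A * D^(-1/2)."""
    adj = sp.coo_matrix(adj)                    # sparse COO representation
    rowsum = np.array(adj.sum(1)).flatten()     # node degrees as a 1-D vector
    d_inv_sqrt = np.power(rowsum, -0.5)
    d_inv_sqrt[np.isinf(d_inv_sqrt)] = 0.0      # isolated nodes: avoid dividing by zero
    d_mat_inv_sqrt = sp.diags(d_inv_sqrt)
    # [A * D^(-1/2)]^T * D^(-1/2) = D^(-1/2) * A * D^(-1/2) for symmetric A
    return adj.dot(d_mat_inv_sqrt).transpose().dot(d_mat_inv_sqrt).astype(np.float32).todense()
```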
Graph Wavenet: an introductory demo for training graph neural networks
1. Data preparation: the DCRNN GitHub project ships the complete dataset, while the data provided with Graph Wavenet is missing '/data/sensor_graph/adj_mx.pkl'.
Original post · 2021-07-13 14:08:25 · 2399 reads · 4 comments
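For reference, a small hedged sketch of how that pickle is usually loaded, assuming it follows the DCRNN release format (a tuple of sensor ids, an id-to-index map, and the adjacency matrix); the path simply mirrors the filename mentioned above:

```python
import pickle

def load_pickle(pickle_file):
    """Load a DCRNN-style pickle; latin1 encoding is needed when reading it under Python 3."""
    with open(pickle_file, 'rb') as f:
        return pickle.load(f, encoding='latin1')

# the DCRNN data release stores (sensor_ids, sensor_id_to_ind, adj_mx) in this file
sensor_ids, sensor_id_to_ind, adj_mx = load_pickle('data/sensor_graph/adj_mx.pkl')
print(adj_mx.shape)   # e.g. (207, 207) for the METR-LA sensor graph
```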
Why masked loss is used
Sequence models often use padding (nn.functional.pad()), and the values that padding inserts are 0.0. For example:
def train(self, input, real_val):
    self.model.train()
    self.optimizer.zero_grad()
    input = nn.functional.pad(input, (1, 0, 0, 0))
    output = self.model(input) …
Original post · 2021-07-13 13:51:57 · 3166 reads · 1 comment
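Because those padded zeros are not real observations, the loss should skip them. A minimal sketch of a masked MAE in the style of the masked metrics used by DCRNN/Graph WaveNet; the function name and the rescaling trick follow that common convention and are not necessarily the post's exact code:

```python
import torch

def masked_mae(preds, labels, null_val=0.0):
    """MAE that ignores entries equal to null_val (e.g. the 0.0 values introduced by padding)."""
    mask = (labels != null_val).float()
    mask = mask / torch.mean(mask)            # rescale so the loss keeps a comparable magnitude
    mask = torch.where(torch.isnan(mask), torch.zeros_like(mask), mask)
    loss = torch.abs(preds - labels) * mask   # padded positions contribute nothing
    loss = torch.where(torch.isnan(loss), torch.zeros_like(loss), loss)
    return torch.mean(loss)

preds = torch.tensor([1.0, 2.0, 3.0, 4.0])
labels = torch.tensor([1.5, 2.0, 0.0, 0.0])   # the last two entries are padding
print(masked_mae(preds, labels))              # 0.25: error averaged only over real entries
```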
From attention import Attention
from tensorflow.keras.layers import Dense, Lambda, Dot, Activation, Concatenate
from tensorflow.keras.layers import Layer

class Attention(Layer):
    def __init__(self, units=128, **kwargs):
        self.units = units
        super().__init__(**kwargs) …
Original post · 2021-04-27 11:34:29 · 1834 reads · 2 comments
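The preview stops after the constructor. A hedged sketch of how a Luong-style attention layer can be built from the imported Dot, Activation, and Concatenate layers; the class name, score function, and shapes here are illustrative assumptions rather than the `attention` package's actual implementation:

```python
import tensorflow as tf
from tensorflow.keras.layers import Dense, Dot, Activation, Concatenate, Layer

class LuongAttention(Layer):
    """Illustrative Luong-style attention over an RNN output sequence."""
    def __init__(self, units=128, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.attention_dense = Dense(units, use_bias=False, activation='tanh')

    def call(self, hidden_states):
        # hidden_states: [batch, time_steps, hidden_size]
        h_t = hidden_states[:, -1:, :]                               # last state as the query
        score = Dot(axes=[2, 2])([h_t, hidden_states])               # [batch, 1, time_steps]
        attention_weights = Activation('softmax')(score)             # weights over the time axis
        context = Dot(axes=[2, 1])([attention_weights, hidden_states])  # [batch, 1, hidden_size]
        pre_activation = Concatenate(axis=-1)([context, h_t])        # [batch, 1, 2*hidden_size]
        return self.attention_dense(pre_activation)[:, 0, :]         # [batch, units]

x = tf.random.normal([4, 10, 32])           # 4 sequences, 10 steps, 32-dim hidden states
print(LuongAttention(units=128)(x).shape)   # (4, 128)
```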
Wx+b vs. matmul(x, w) + b
Original post · 2021-04-16 21:14:54 · 229 reads · 0 comments
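The post body is not shown in the preview; the small sketch below only illustrates the relation the title refers to, assuming the usual convention difference: textbook notation multiplies a weight matrix by a column vector (Wx + b), while framework code multiplies a batch of row vectors by the transposed weight (matmul(x, w) + b).

```python
import numpy as np

# Math notation: y = Wx + b, with x a column vector and W of shape [out_features, in_features].
W = np.arange(6).reshape(2, 3).astype(float)   # [out=2, in=3]
b = np.ones(2)
x = np.array([1.0, 2.0, 3.0])
y_math = W @ x + b                             # Wx + b

# Framework convention: y = matmul(X, W') + b, with X a batch of row vectors
# and the weight stored transposed as [in_features, out_features].
X = x[None, :]                                 # batch of one row vector, shape [1, 3]
y_code = X @ W.T + b                           # matmul(x, w) + b

print(np.allclose(y_math, y_code[0]))          # True: the two conventions are transposes of each other
```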
Computing RMSE, MAE and MAPE
RMSE: $\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}(v_i - \hat{v}_i)^2}$
MAE: $\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\lvert v_i - \hat{v}_i\rvert$
MAPE: $\mathrm{MAPE} = \frac{1}{N}\sum_{i=1}^{N}\frac{\lvert v_i - \hat{v}_i\rvert}{v_i}$
where $v_i$ is the ground-truth value and $\hat{v}_i$ the prediction.
Original post · 2021-04-15 19:27:49 · 4944 reads · 0 comments
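A small numpy sketch of the three metrics; the variable names and sample values are illustrative, and this MAPE form assumes the ground truth contains no zeros:

```python
import numpy as np

def rmse(v, v_hat):
    return np.sqrt(np.mean((v - v_hat) ** 2))

def mae(v, v_hat):
    return np.mean(np.abs(v - v_hat))

def mape(v, v_hat):
    return np.mean(np.abs(v - v_hat) / v)   # assumes no zeros in the ground truth v

v = np.array([100.0, 200.0, 400.0])
v_hat = np.array([110.0, 190.0, 380.0])
print(rmse(v, v_hat), mae(v, v_hat), mape(v, v_hat))
# ~14.14, ~13.33, ~0.0667 (i.e. about 6.7%)
```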
Learning SVM through plotted examples
from sklearn.datasets._samples_generator import make_blobs
import matplotlib.pyplot as plt
import numpy as np

X, y = make_blobs(n_samples=200, centers=2, random_state=0, cluster_std=0.80)
plt.scatter(X[:, 0], X[:, 1], c=y, s=50, cmap="autumn")
plt…
Original post · 2021-04-12 22:03:00 · 267 reads · 0 comments
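The preview ends mid-call. One common way such an example continues is to fit a linear-kernel SVC and draw its decision boundary, margins, and support vectors; the sketch below assumes that pattern rather than reproducing the post's exact code, and it uses the public import path sklearn.datasets (._samples_generator is a private module):

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC
import matplotlib.pyplot as plt
import numpy as np

X, y = make_blobs(n_samples=200, centers=2, random_state=0, cluster_std=0.80)
clf = SVC(kernel='linear', C=1.0).fit(X, y)

# plot the data, the decision boundary (level 0) and the margins (levels -1 and 1)
plt.scatter(X[:, 0], X[:, 1], c=y, s=50, cmap="autumn")
ax = plt.gca()
xx = np.linspace(*ax.get_xlim(), 30)
yy = np.linspace(*ax.get_ylim(), 30)
XX, YY = np.meshgrid(xx, yy)
Z = clf.decision_function(np.c_[XX.ravel(), YY.ravel()]).reshape(XX.shape)
ax.contour(XX, YY, Z, levels=[-1, 0, 1], linestyles=['--', '-', '--'], colors='k')
ax.scatter(clf.support_vectors_[:, 0], clf.support_vectors_[:, 1],
           s=150, facecolors='none', edgecolors='k')   # highlight the support vectors
plt.show()
```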
Writing a decision tree algorithm in Python
main.py
import numpy as np
# pickle is used for serialization and deserialization.
# Serialization turns text information into a binary data stream, which makes it easy to store on disk;
# when the file needs to be read, the data is loaded from disk and deserialized to recover the original data.
import pickle
import os
import treePlotter

# create the training data
def CreateTrainingDataset():
    X = [[0, 2, 0, 0, 'N'],
         [0, 2, …
Original post · 2021-04-12 22:01:04 · 247 reads · 0 comments
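The heart of an ID3-style decision tree is choosing the split with the largest information gain. A self-contained sketch of that computation, assuming each row stores its features first and its class label last; the sample rows below are illustrative, not the post's full dataset:

```python
import numpy as np
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    counts = np.array(list(Counter(labels).values()), dtype=float)
    probs = counts / counts.sum()
    return -np.sum(probs * np.log2(probs))

def information_gain(rows, feature_index):
    """Entropy reduction obtained by splitting `rows` on the given feature column;
    the last column of each row is assumed to be the class label."""
    labels = [row[-1] for row in rows]
    base = entropy(labels)
    remainder = 0.0
    for value in set(row[feature_index] for row in rows):
        subset = [row for row in rows if row[feature_index] == value]
        remainder += len(subset) / len(rows) * entropy([row[-1] for row in subset])
    return base - remainder

# illustrative rows shaped like the post's training data ([features..., label])
X = [[0, 2, 0, 0, 'N'], [0, 2, 0, 1, 'N'], [1, 2, 0, 0, 'Y'], [2, 1, 0, 0, 'Y']]
print(information_gain(X, 0))   # 1.0: splitting on feature 0 separates the classes perfectly here
```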
Upstream task
Downstream task: downstream tasks are what the field calls the supervised-learning tasks that utilize a pre-trained model or component. In short, a downstream task measures how well a pre-trained model performs on the current dataset. Downstream task in CV: downstream tasks are the computer vision applications …
Original post · 2021-04-05 09:58:00 · 2266 reads · 2 comments
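To make the upstream/downstream split concrete, a hedged PyTorch sketch: the backbone is pre-trained on the upstream task (ImageNet classification), and its head is replaced for a hypothetical 10-class downstream dataset; the model choice and class count are illustrative assumptions, not from the post:

```python
import torch.nn as nn
from torchvision import models

# Load a model pre-trained on the upstream task (ImageNet classification);
# `pretrained=True` is the older torchvision API, matching the 2021-era posts.
backbone = models.resnet18(pretrained=True)

# Adapt it to a downstream task: here, a hypothetical 10-class classification dataset.
backbone.fc = nn.Linear(backbone.fc.in_features, 10)

# Typical choice: freeze the pre-trained layers and train only the new head at first.
for name, param in backbone.named_parameters():
    param.requires_grad = name.startswith('fc.')
```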
TensorFlow implementations of VAE and GANs
VAE
import tensorflow as tf

class VariationalAutoencoder(object):
    def __init__(self, n_input, n_hidden, optimizer=tf.train.AdamOptimizer()):
        self.n_input = n_input
        self.n_hidden = n_hidden
        network_weights = self._initialize_w…
Original post · 2021-03-11 08:23:26 · 365 reads · 0 comments
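The preview cuts off inside the constructor (the snippet uses the TensorFlow 1.x API, given tf.train.AdamOptimizer). As a hedged illustration of the two pieces every VAE needs, the reparameterization trick and the reconstruction-plus-KL loss, here is a compact TF1-style sketch; the layer sizes and function name are assumptions, not the post's class:

```python
import tensorflow as tf   # TensorFlow 1.x, matching the style of the snippet above

def vae_loss(x, n_input, n_hidden, n_z=20):
    """Illustrative encoder -> reparameterization -> decoder -> loss."""
    # encoder produces the mean and log-variance of q(z|x)
    h = tf.layers.dense(x, n_hidden, activation=tf.nn.softplus)
    z_mean = tf.layers.dense(h, n_z)
    z_log_sigma_sq = tf.layers.dense(h, n_z)

    # reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I)
    eps = tf.random_normal(tf.shape(z_mean))
    z = z_mean + tf.sqrt(tf.exp(z_log_sigma_sq)) * eps

    # decoder reconstructs the input from z
    h_dec = tf.layers.dense(z, n_hidden, activation=tf.nn.softplus)
    reconstruction = tf.layers.dense(h_dec, n_input)

    # loss = reconstruction error + KL divergence between q(z|x) and N(0, I)
    recon_loss = 0.5 * tf.reduce_sum(tf.square(reconstruction - x), 1)
    latent_loss = -0.5 * tf.reduce_sum(
        1 + z_log_sigma_sq - tf.square(z_mean) - tf.exp(z_log_sigma_sq), 1)
    return tf.reduce_mean(recon_loss + latent_loss)
```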