
琉璃神社
Script Works (脚本作品)
swy_swy_swy
Articles in this column
Nine daily posts, 2019.12.25 through 2019.12.33, share the same truncated preview (a whitespace tokenizer built on istringstream):

#include <vector> #include <string> #include <sstream> using namespace std; vector<string> example(string text) { istringstream iss(text); vector<string> cache; string tmp; while (iss >> tmp) { ca…

- 2019.12.33 · Original · 2021-07-21 22:35:52 · 50695 views · 0 comments
- 2019.12.32 · Original · 2021-07-21 22:33:45 · 6151 views · 0 comments
- 2019.12.31 · Original · 2021-07-21 22:31:45 · 3442 views · 0 comments
- 2019.12.30 · Original · 2021-07-21 22:30:30 · 2851 views · 0 comments
- 2019.12.29 · Original · 2021-07-21 22:29:44 · 2467 views · 0 comments
- 2019.12.28 · Original · 2021-07-21 22:28:45 · 2284 views · 0 comments
- 2019.12.27 · Original · 2021-07-21 22:27:07 · 2020 views · 0 comments
- 2019.12.26 · Original · 2021-07-21 22:26:11 · 1822 views · 0 comments
- 2019.12.25 · Original · 2021-07-21 22:24:41 · 1598 views · 0 comments
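The C++ preview cuts off mid-loop; presumably it pushes each token into `cache` and returns it, though the post body is truncated so that completion is an assumption. The same whitespace tokenization, as a runnable Python sketch:

```python
def tokenize(text):
    """Split text on runs of whitespace, mirroring `iss >> tmp` in the C++ preview."""
    cache = []
    for tmp in text.split():  # str.split() skips repeated whitespace, like operator>>
        cache.append(tmp)
    return cache

print(tokenize("the quick  brown fox"))  # → ['the', 'quick', 'brown', 'fox']
```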
Six daily posts, 2019.12.19 through 2019.12.24, share the same truncated preview (the fourth Inception stage of GoogLeNet):

b4 = nn.Sequential(Inception(480, 192, (96, 208), (16, 48), 64), Inception(512, 160, (112, 224), (24, 64), 64), Inception(512, 128, (128, 256), (24, 64), 64), Inception(512, 112, (144, 288…

- 2019.12.24 · Original · 2021-07-17 21:39:47 · 4937 views · 2 comments
- 2019.12.23 · Original · 2021-07-17 21:38:59 · 1949 views · 0 comments
- 2019.12.22 · Original · 2021-07-17 21:38:14 · 1415 views · 0 comments
- 2019.12.21 · Original · 2021-07-17 21:27:38 · 1344 views · 0 comments
- 2019.12.20 · Original · 2021-07-17 21:26:58 · 1172 views · 0 comments
- 2019.12.19 · Original · 2021-07-17 21:26:14 · 1023 views · 0 comments
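The b4 stage above matches d2l's GoogLeNet, where an Inception block takes (in_channels, c1, c2, c3, c4) and concatenates four parallel branches. A small torch-free check (an illustrative sketch, not the post's code) that each block's concatenated output width equals the next block's declared input:

```python
def inception_out_channels(c1, c2, c3, c4):
    # A d2l-style Inception block concatenates four branches:
    # 1x1 conv (c1), 1x1->3x3 (c2[1]), 1x1->5x5 (c3[1]), pool->1x1 (c4).
    return c1 + c2[1] + c3[1] + c4

# (in_channels, c1, c2, c3, c4) for the fully visible blocks of b4
blocks = [
    (480, 192, (96, 208), (16, 48), 64),
    (512, 160, (112, 224), (24, 64), 64),
    (512, 128, (128, 256), (24, 64), 64),
]
outs = [inception_out_channels(c1, c2, c3, c4) for _, c1, c2, c3, c4 in blocks]
# each block's output feeds the next block's in_channels: 512, 512, 512
```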
Thirteen daily posts, 2019.12.6 through 2019.12.18, share the same preview (LeetCode-style multiset comparison by sorting):

bool canBeEqual(vector<int>& target, vector<int>& arr) { if (target.size() != arr.size()) { return false; } sort(target.begin(), target.end()); sort(arr.begin(), arr.end()); return target == arr; }

- 2019.12.18 · Original · 2021-07-11 23:16:45 · 5112 views · 0 comments
- 2019.12.17 · Original · 2021-07-11 23:15:08 · 1419 views · 0 comments
- 2019.12.16 · Original · 2021-07-11 23:13:43 · 1035 views · 0 comments
- 2019.12.15 · Original · 2021-07-11 23:11:50 · 888 views · 0 comments
- 2019.12.14 · Original · 2021-07-11 23:10:20 · 906 views · 0 comments
- 2019.12.13 · Original · 2021-07-11 23:08:51 · 838 views · 0 comments
- 2019.12.12 · Original · 2021-07-11 23:07:01 · 809 views · 0 comments
- 2019.12.11 · Original · 2021-07-11 23:05:01 · 741 views · 0 comments
- 2019.12.10 · Original · 2021-07-10 23:39:42 · 1114 views · 0 comments
- 2019.12.9 · Original · 2021-07-09 23:04:23 · 1002 views · 0 comments
- 2019.12.8 · Original · 2021-07-09 23:02:26 · 653 views · 0 comments
- 2019.12.7 · Original · 2021-07-09 22:59:41 · 634 views · 0 comments
- 2019.12.6 · Original · 2021-07-09 22:50:05 · 523 views · 0 comments
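The C++ preview checks whether one array can be rearranged into another by sorting both and comparing. The same idea as a short Python sketch:

```python
def can_be_equal(target, arr):
    """arr can be rearranged into target iff both hold the same multiset of values."""
    if len(target) != len(arr):
        return False
    return sorted(target) == sorted(arr)  # equal after sorting <=> same multiset
```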
Four daily posts, 2019.12.2 through 2019.12.5, share the same preview (the classic sklearn ROC-curve example):

import numpy as np from sklearn import metrics y = np.array([1, 1, 2, 2]) scores = np.array([0.1, 0.4, 0.35, 0.8]) fpr, tpr, thresholds = metrics.roc_curve(y, scores, pos_label=2)

- 2019.12.5 · Original · 2021-07-08 23:22:25 · 813 views · 0 comments
- 2019.12.4 · Original · 2021-07-08 23:18:30 · 521 views · 0 comments
- 2019.12.3 · Original · 2021-07-08 23:13:06 · 477 views · 0 comments
- 2019.12.2 · Original · 2021-07-08 23:08:30 · 516 views · 0 comments
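For readers without sklearn, here is a dependency-free sketch of the (fpr, tpr) points roc_curve derives for this example, assuming all scores are distinct so each one acts as a threshold (sklearn additionally prepends the (0, 0) point):

```python
def roc_points(y, scores, pos_label):
    """(fpr, tpr) after lowering the threshold past each score, in descending
    score order: a hand-rolled stand-in for sklearn.metrics.roc_curve."""
    pos = sum(1 for label in y if label == pos_label)
    neg = len(y) - pos
    order = sorted(range(len(y)), key=lambda i: -scores[i])
    tp = fp = 0
    points = []
    for i in order:
        if y[i] == pos_label:
            tp += 1   # this sample is now predicted positive and is positive
        else:
            fp += 1   # predicted positive but actually negative
        points.append((fp / neg, tp / pos))
    return points

print(roc_points([1, 1, 2, 2], [0.1, 0.4, 0.35, 0.8], pos_label=2))
# → [(0.0, 0.5), (0.5, 0.5), (0.5, 1.0), (1.0, 1.0)]
```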
2019.12.1 · Original · 2021-05-06 15:16:15 · 33196 views · 0 comments

def Laplacian(im): laplacian_kernel = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]]) # mode options: {'reflect', 'constant', 'nearest', 'mirror', 'wrap'} conv = nd.convolve(im, laplacian_kernel, mode='constant') return conv…
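The post applies the 4-neighbour Laplacian kernel with scipy.ndimage-style mode='constant' (zero padding outside the image). A dependency-free sketch of that operation on nested lists; since the kernel is symmetric, convolution and correlation coincide:

```python
def laplacian(im):
    """Apply [[0,-1,0],[-1,4,-1],[0,-1,0]] with zero padding,
    mirroring nd.convolve(im, kernel, mode='constant')."""
    h, w = len(im), len(im[0])

    def px(r, c):
        # 'constant' mode: out-of-bounds pixels read as 0
        return im[r][c] if 0 <= r < h and 0 <= c < w else 0

    return [[4 * px(r, c) - px(r - 1, c) - px(r + 1, c) - px(r, c - 1) - px(r, c + 1)
             for c in range(w)] for r in range(h)]
```

On a flat image the interior response is zero; only the (zero-padded) borders respond, which is one reason 'constant' padding creates edge artifacts.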
2019.11.35 · Original · 2021-01-22 21:33:31 · 65097 views · 1 comment

def train(net, train_iter, loss, epochs, lr): trainer = torch.optim.Adam(net.parameters(), lr) for epoch in range(epochs): for X, y in train_iter: trainer.zero_grad() l = loss(net(X), y) l.back…
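The preview is a standard PyTorch loop (zero_grad, backward, step) truncated at l.back…. As a torch-free illustration of the same loop shape, here is plain gradient descent with a caller-supplied gradient function; the fixed-lr update stands in for torch.optim.Adam and all names are illustrative:

```python
def train(params, grad_fn, data, epochs, lr):
    """Skeleton of the post's loop with Adam replaced by a plain update."""
    for epoch in range(epochs):
        for x, y in data:
            grads = grad_fn(params, x, y)                         # stands in for l.backward()
            params = [p - lr * g for p, g in zip(params, grads)]  # optimizer step
    return params

# toy use: fit w in y = w * x on one repeated sample; loss = (w*x - y)**2
w, = train([0.0], lambda p, x, y: [2 * (p[0] * x - y) * x],
           data=[(1.0, 3.0)] * 100, epochs=5, lr=0.1)
```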
2019.11.34 · Original · 2021-01-21 23:06:42 · 9503 views · 0 comments

import java.util.Scanner; public class book { public static void main(String[] args) { int sum=0; for(int i=0;i<args.length;i++) sum+=Integer.parseInt(args[i]); System.out.println("The sum is "+sum)…
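The Java preview sums its command-line arguments (the Scanner import is unused, and the println's closing semicolon is cut off by the truncation). The same logic as a Python sketch:

```python
import sys

def sum_args(args):
    """Parse each argument string as an int and accumulate,
    mirroring the Java loop over args[]."""
    total = 0
    for s in args:
        total += int(s)
    return total

if __name__ == "__main__":
    print("The sum is", sum_args(sys.argv[1:]))
```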
2019.11.33 · Original · 2021-01-21 23:01:12 · 7780 views · 1 comment

Before applying Kalman filtering, false positives for the fall case were 60% of the total classified instances; after applying Kalman filtering they were reduced to 33%. For annotation purposes, the three movement types we…
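The excerpt reports fall false positives dropping from 60% to 33% after Kalman filtering. As a minimal illustration of the filter itself, here is a scalar Kalman filter with a random-walk state model; the noise variances q and r are hypothetical, not the configuration used in the post:

```python
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter: predict (state unchanged, uncertainty
    grows by q), then correct with each noisy measurement (variance r)."""
    x, p = x0, p0
    out = []
    for z in measurements:
        p = p + q                # predict step
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # correct with the measurement
        p = (1 - k) * p          # updated uncertainty
        out.append(x)
    return out
```

Fed a constant true value plus noise, the estimate converges toward that value while the gain settles, which is how the filter suppresses spurious spikes that would otherwise be classified as falls.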
2019.11.32 · Original · 2021-01-21 15:37:27 · 7309 views · 0 comments

b4 = nn.Sequential(Inception(480, 192, (96, 208), (16, 48), 64), Inception(512, 160, (112, 224), (24, 64), 64), Inception(512, 128, (128, 256), (24, 64), 64), Inception(512, 112, (144, 288…
2019.11.31 · Original · 2021-01-21 10:47:29 · 11876 views · 0 comments

from d2l import torch as d2l import torch from torch import nn def vgg_block(num_convs, in_channels, out_channels): layers = [] for _ in range(num_convs): layers.append(nn.Conv2d(in_channels, out_channels,…
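The preview truncates inside the first nn.Conv2d call. In d2l's vgg_block, each of the num_convs convolutions is 3x3 with padding 1 followed by ReLU, and the block ends with a 2x2 max pool. A torch-free sketch of that layer plan (descriptive tuples instead of nn modules):

```python
def vgg_block(num_convs, in_channels, out_channels):
    """Layer plan of a VGG block: num_convs (3x3 conv, ReLU) pairs,
    then one 2x2 max pool that halves the spatial size."""
    layers = []
    for _ in range(num_convs):
        layers.append(("conv3x3", in_channels, out_channels))
        layers.append(("relu",))
        in_channels = out_channels  # later convs keep the channel count
    layers.append(("maxpool2x2",))
    return layers
```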
2019.11.30 · Original · 2021-01-20 20:57:16 · 8528 views · 0 comments

from d2l import torch as d2l import torch from torch import nn class Reshape(torch.nn.Module): def forward(self, x): return x.view(-1, 1, 28, 28) net = torch.nn.Sequential( Reshape(), nn.Conv2d(1, 6, kernel_size=5, padding=…
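Only the Reshape to (-1, 1, 28, 28) and the start of the first conv survive the truncation; assuming the rest follows d2l's standard LeNet (padding=2 on the first 5x5 conv, 2x2 pools with stride 2, then a 5x5 conv without padding), the spatial sizes can be traced with the usual output-size formula:

```python
def conv_out(size, kernel, padding=0, stride=1):
    """Output spatial size of a conv or pool layer."""
    return (size + 2 * padding - kernel) // stride + 1

# trace a 28x28 MNIST image through the assumed LeNet stack
s = 28
s = conv_out(s, 5, padding=2)   # Conv2d(1, 6, kernel_size=5, padding=2) -> 28
s = conv_out(s, 2, stride=2)    # 2x2 pool, stride 2                     -> 14
s = conv_out(s, 5)              # Conv2d(6, 16, kernel_size=5)           -> 10
s = conv_out(s, 2, stride=2)    # 2x2 pool, stride 2                     -> 5
print(s)  # → 5
```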
2019.11.29 · Original · 2021-01-20 11:44:38 · 4668 views · 1 comment

def corr2d_multi_in_out_1x1(X, K): c_i, h, w = X.shape c_o = K.shape[0] X = X.reshape((c_i, h * w)) K = K.reshape((c_o, c_i)) Y = torch.matmul(K, X) # Matrix multiplication in the fully-connected layer…
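The preview shows the d2l trick of computing a 1x1 multi-channel cross-correlation as one matrix product over flattened pixels. A torch-free sketch computing the same result directly, per pixel, on nested lists:

```python
def corr2d_multi_in_out_1x1(X, K):
    """1x1 cross-correlation: X is c_i x h x w, K is c_o x c_i. Each output
    pixel is a weighted sum over input channels at that same location,
    which is exactly the matmul K @ X_flat in the post's version."""
    c_i, h, w = len(X), len(X[0]), len(X[0][0])
    c_o = len(K)
    return [[[sum(K[o][i] * X[i][r][c] for i in range(c_i))
              for c in range(w)] for r in range(h)] for o in range(c_o)]
```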