Skip the Class

Time Limit: 2000/1000 MS (Java/Others)    Memory Limit: 65536/65536 K (Java/Others)
Total Submission(s): 123    Accepted Submission(s): 73


Problem Description
The term finally begins. Luras loves school so much, since she can happily skip class again. (wtf?)

Luras will take n lessons in sequence (in other words, she has n chances to skip xDDDD).

Every lesson has its own type and a value earned by skipping it.

The only restriction is that Luras can't skip lessons of the same type more than twice.

That is, once she has skipped a type twice, she has to attend all other lessons of that type.

Now please answer: what is the highest value Luras can earn if she chooses optimally?
 

Input
The first line contains an integer T, the number of test cases.

For each case, the first line contains an integer n, the number of lessons Luras will take in sequence.

Then follow n lines; each contains a string of lowercase letters 'a' to 'z' of length at most 10,
and an integer, the value of that lesson.

The string indicates the lesson type; identical strings denote the same type.

It is guaranteed that:

T is about 1000

For 100% of cases, 1 <= n <= 100, 1 <= |s| <= 10, 1 <= v <= 1000
 

Output
For each case, output a single line containing one integer: the highest value Luras can earn if she chooses optimally.
 

Sample Input
2
5
english 1
english 2
english 3
math 10
cook 100
2
a 1
a 2
 

Sample Output
115
3
// Approach: group lessons of the same type, then take the largest and second-largest values in each group.
#include<stdio.h>
#include<iostream>
#include<algorithm>
#include<string.h>
using namespace std;
int s[105];   // values of all lessons sharing the current type
int vis[110]; // vis[i] == 1 once lesson i has been grouped
int b[1000];  // value of each lesson
int main()
{
    int t;
    scanf("%d",&t);
    while(t--)
    {
        int n;
        char a[105][15]; // n <= 100 type strings, each of length <= 10
        scanf("%d",&n);
        memset(vis,0,sizeof(vis));
        for(int i=0; i<n; i++)
        {
            scanf("%s%d",a[i],&b[i]);
        }
        int sum=0;
        for(int i=0; i<n; i++)
        {
            if(vis[i]) continue; // already counted in an earlier group
            int p=0;
            s[p++]=b[i];
            // collect every later lesson of the same type
            for(int j=i+1; j<n; j++)
            {
                if(vis[j]==0 && strcmp(a[i],a[j])==0)
                {
                    s[p++]=b[j];
                    vis[j]=1;
                }
            }
            vis[i]=1;
            // at most two skips per type: add the two largest values
            sort(s,s+p);
            sum+=s[p-1];
            if(p>1)          // guard fixes an out-of-bounds read (s[-1]) when a type occurs only once
                sum+=s[p-2];
        }
        printf("%d\n",sum);
    }
    return 0;
}
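
For comparison, here is a more idiomatic sketch of the same greedy idea: group values in a std::map keyed by the type string, sort each group descending, and sum the top two values per type. This is not the original author's code; the helper name solve_case is illustrative.

```cpp
#include <cstdio>
#include <algorithm>
#include <functional>
#include <map>
#include <string>
#include <vector>
using namespace std;

// Sum, over every lesson type, the two largest skip values of that type.
int solve_case(int n)
{
    map<string, vector<int>> groups;
    char name[16];
    int value;
    for (int i = 0; i < n; i++) {
        scanf("%s%d", name, &value);
        groups[name].push_back(value);
    }
    int sum = 0;
    for (auto &g : groups) {
        vector<int> &v = g.second;
        sort(v.begin(), v.end(), greater<int>());
        sum += v[0];                   // largest value of this type
        if (v.size() > 1) sum += v[1]; // second largest, if it exists
    }
    return sum;
}

int main()
{
    int t;
    scanf("%d", &t);
    while (t--) {
        int n;
        scanf("%d", &n);
        printf("%d\n", solve_case(n));
    }
    return 0;
}
```

The map-based version avoids the manual visited-array bookkeeping entirely, since grouping happens while reading the input.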

### Skip Connections in Neural Networks Explained

In neural networks, skip connections are additional pathways between different layers that allow the gradient to flow through an alternative route during backpropagation. This mechanism is particularly beneficial for training deep networks, where vanishing gradients pose significant challenges.

#### Motivation Behind Skip Connections

Traditional feedforward architectures, such as plain stacks of fully connected layers, face difficulties when scaled up significantly in depth[^1]. As networks deepen, vanishing or exploding gradients become more pronounced, leading to slower convergence and poorer task performance. To mitigate these issues, residual learning frameworks were introduced, which incorporate skip connections into the model design.

#### Implementation Details

A typical implementation adds direct links from one layer to another non-adjacent layer in the network. For instance, consider two convolutional blocks separated by several intermediate transformations; instead of passing data solely forward sequentially, part of it is routed straight across via a shortcut connection:

```python
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, in_channels, out_channels):
        super(ResidualBlock, self).__init__()
        # Main processing path including convolutions etc.
        self.main_path = nn.Sequential(
            nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(),
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_channels)
        )
        # Shortcut/skip connection
        self.shortcut = nn.Identity() if in_channels == out_channels else \
            nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x):
        identity = self.shortcut(x)
        output = self.main_path(x)
        return nn.functional.relu(output + identity)
```

This approach changes what subsequent layers need to learn: rather than approximating some desired mapping \( H(x) \) directly, each stacked transformation fits a residual function \( F(x) = H(x) - x \).

#### Benefits Provided by Skip Connections

Such architectural elements facilitate better optimization while enabling deeper models without sacrificing accuracy. Moreover, experiments show improved generalization, partly because features extracted in earlier layers are preserved through later stages thanks to these auxiliary paths.
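
As a quick sanity check, here is a minimal usage sketch for the ResidualBlock defined above (assuming PyTorch is installed; the batch and channel sizes are illustrative):

```python
import torch

# Instantiate the block and push a dummy batch through it.
block = ResidualBlock(in_channels=64, out_channels=128)
x = torch.randn(8, 64, 32, 32)  # batch of 8 feature maps: 64 channels, 32x32
y = block(x)
print(y.shape)  # torch.Size([8, 128, 32, 32]): spatial size kept, channels projected by the 1x1 shortcut
```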