1576: Good Sequence
Time limit: 1 Sec   Memory limit: 128 MB   Submitted: 12   Solved: 5
Description
This is the definition of a good sequence:
Let A1, A2, ..., An be a sequence of length n. If A[i-1] <= A[i] holds for every integer i (2 <= i <= n), then A1, A2, ..., An is a good sequence.
You are given a weighted directed graph. The vertices are numbered 1 to n and the edges are numbered 1 to m; edge i has length wi. A path from vertex 1 to vertex i (1 <= i <= n) is called a good path if the sequence of edge lengths along the path is a good sequence. Let di be the length of the shortest good path from vertex 1 to vertex i (1 <= i <= n); if no good path exists, di = 0. Your task is to find the sum of all di (1 <= i <= n). As the sum can be rather large, print the remainder of the sum divided by 1000000007.
Note: d1 = 0. The input guarantees that all edge lengths are distinct.
Input
Your program will be tested on one or more test cases. In each test case, the first line contains two integers n and m (1 <= n, m <= 10^5), where n is the number of vertices and m is the number of edges. Each of the following m lines contains three integers ai, bi and wi (1 <= ai, bi <= n, 1 <= wi <= 10^9), indicating a directed edge of length wi from vertex ai to vertex bi.
Output
For every test case, print the following line:
answer
where answer is the sum of all di (1 <= i <= n), modulo 10^9 + 7.
Sample Input
4 3
1 2 6
2 3 4
1 4 7
4 4
1 2 10
2 3 11
1 4 4
4 3 1
Sample Output
13
35
#include <iostream>
#include <cstdio>
#include <algorithm>
#define ll long long
using namespace std;
const int maxn = 1e5 + 100;
const int mod = 1e9 + 7;
const ll inf = 1e16;          // larger than any real distance (<= 1e5 * 1e9)
int n, m;
struct node
{
    int u;                    // edge from u ...
    int v;                    // ... to v
    ll w;                     // ... of length w
} edge[maxn];
int cnt;
ll d[maxn];                   // d[i]: shortest good path from vertex 1 to i
void add(int u, int v, int w)
{
    edge[cnt].u = u;
    edge[cnt].v = v;
    edge[cnt].w = w;
    cnt++;
}
bool cmp(node a, node b)
{
    return a.w < b.w;
}
int main()
{
    while (cin >> n >> m)
    {
        cnt = 1;              // edges are stored at indices 1..m
        for (int i = 1; i <= m; i++)
        {
            int a, b, c;
            scanf("%d%d%d", &a, &b, &c);
            add(a, b, c);
        }
        // Relax edges in increasing order of weight: when edge (u,v,w) is
        // processed, d[u] already holds the shortest good path to u built
        // from strictly smaller weights, so appending the edge keeps the
        // weight sequence non-decreasing.
        sort(edge + 1, edge + m + 1, cmp);
        for (int i = 1; i <= n; i++)
            d[i] = inf;
        d[1] = 0;
        int i, j, z;
        for (i = 1; i <= m; i = j + 1)
        {
            // [i, j] is a block of edges of equal weight; relax the block to
            // a fixed point. (The statement guarantees distinct weights, so
            // each block is in fact a single edge.)
            for (j = i; j <= m; j++)
                if (edge[i].w != edge[j].w)
                    break;
            j = j - 1;
            for (z = i; z <= j;)
            {
                if (d[edge[z].u] != inf && d[edge[z].u] + edge[z].w < d[edge[z].v])
                {
                    d[edge[z].v] = d[edge[z].u] + edge[z].w;
                    z = i;    // restart the block after a successful relaxation
                }
                else
                    z++;
            }
        }
        ll ans = 0;
        for (int i = 1; i <= n; i++)
        {
            if (d[i] == inf)  // no good path to i: di = 0, contributes nothing
                continue;
            ans += d[i] % mod;
        }
        printf("%lld\n", ans % mod);
    }
    return 0;
}