without base relocation /fixed:no

This post explains how to fix an error that can come up when profiling a program with Intel VTune, and ends with a short note on basic VTune usage. For a beginner, VTune is quite straightforward: you can start analyzing a program's performance with almost no configuration.


Original article: http://blog.sina.com.cn/s/blog_5f3759a00100e118.html

When profiling a program with Intel VTune, you may run into the following error:

The following modules were created without base relocation: F:\……\***.exe

Base relocations are required in order to obtain call graph data.
If your application is developed using Microsoft Visual C++ or Microsoft Visual Basic,
add the flag "/fixed:no" to the link command line
or set the environment variable "LINK=/fixed:no".
Restart the Development Environment and rebuild your application.
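As a concrete illustration of the two options the message offers, here is a minimal sketch of a test program and a command-line build that keeps base relocations. The file name, the /Zi and /DEBUG switches, and the exact commands are my own assumptions for the example, not part of the original post; the only essential piece is /FIXED:NO (or the LINK=/fixed:no environment variable).

// main.cpp -- hypothetical test program for the example above.
// Build from a Visual Studio command prompt, e.g.:
//     cl /Zi /c main.cpp
//     link /DEBUG /FIXED:NO main.obj
// or run "set LINK=/fixed:no" in the console first, so every
// subsequent link in that session picks up the flag automatically.
#include <cstdio>

static long work(long n) {
    long sum = 0;
    for (long i = 0; i < n; ++i)
        sum += i % 7;        // gives the profiler something to attribute time to
    return sum;
}

int main() {
    std::printf("%ld\n", work(100000000L));
    return 0;
}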

 

Since the message is already this detailed, just follow the steps it gives:

I am using VS2005. Go to Project -> Properties -> Configuration Properties -> Linker -> Command Line -> Additional Options, type /fixed:no, click Apply, rebuild the solution, and then reload the ***.exe file. That is all it takes.
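If you want to double-check that the relinked executable really kept its base relocations before loading it again, the PE file header records this in the IMAGE_FILE_RELOCS_STRIPPED characteristic (dumpbin /headers reports the same flag). The small checker below is only an illustrative sketch; the file name check_relocs.cpp and its usage are assumptions, not from the original post.

// check_relocs.cpp -- hypothetical helper, not part of the original post.
// Reports whether an .exe still carries base relocations, i.e. whether
// IMAGE_FILE_RELOCS_STRIPPED is clear in its PE file header.
// Build (assumed): cl check_relocs.cpp
#include <windows.h>
#include <cstdio>

int main(int argc, char* argv[]) {
    if (argc < 2) {
        std::printf("usage: check_relocs <path-to-exe>\n");
        return 1;
    }

    HANDLE file = CreateFileA(argv[1], GENERIC_READ, FILE_SHARE_READ, NULL,
                              OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (file == INVALID_HANDLE_VALUE) {
        std::printf("cannot open %s\n", argv[1]);
        return 1;
    }

    HANDLE mapping = CreateFileMappingA(file, NULL, PAGE_READONLY, 0, 0, NULL);
    const char* base = (const char*)MapViewOfFile(mapping, FILE_MAP_READ, 0, 0, 0);

    const IMAGE_DOS_HEADER* dos = (const IMAGE_DOS_HEADER*)base;
    const IMAGE_NT_HEADERS* nt  = (const IMAGE_NT_HEADERS*)(base + dos->e_lfanew);

    // If this bit is set, the image was linked /FIXED and VTune cannot
    // collect call graph data for it.
    int stripped = (nt->FileHeader.Characteristics & IMAGE_FILE_RELOCS_STRIPPED) != 0;
    std::printf("%s: base relocations %s\n", argv[1],
                stripped ? "stripped -- relink with /fixed:no"
                         : "present -- call graph collection should work");

    UnmapViewOfFile(base);
    CloseHandle(mapping);
    CloseHandle(file);
    return stripped ? 2 : 0;
}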


 

 

For a first-time user, VTune is pretty much foolproof and needs almost no setup: just create a new Quick Performance Analyzer project and load the executable **.exe. It then produces a pile of charts and data, which is exactly what we are after...

 
