hadoop-2.7.2 + hive-2.0.0: the LZO compression pitfalls run deep

This post shares the author's hands-on experience with Hadoop 2.7.2: a simpler way to set up passwordless SSH login, an easier way to install LZO and LZOP, the problems (and fixes) when using LZO compression with Hadoop, what has changed in hive-2.0.0, and a comparison of bzip2 versus LZO compression.

hadoop-2.7.2 is out, and supposedly ready for production. One round of hands-on testing later, I was ready to lose my mind!

Two years ago I wrote several posts on Hadoop-2.2.0 + QJM HA + hive + LZO, and to this day they still hold up under hadoop-2.7.2! Enough preamble.

First impressions of Hadoop 2.2.0

QJM-based HA on Hadoop 2.2.0 in practice

Hadoop 2.2.0 + hive with LZO compression

Items (1) and (2) above are essentially unchanged, and (3) hasn't changed much either, yet (3) is exactly where I stumbled badly. Three days in, I still could not figure out why Hive simply would not split LZO files. In hindsight I was just being dense, but I am writing it down here so it never happens again.


1. Passwordless SSH login

This used to be tedious: the steps are simple, but with many machines it gets maddening, since nothing is automated. Then I came up with a workaround: run ssh-keygen -t rsa on every machine, download each machine's generated id_rsa.pub file, merge them in a text editor on Windows, then upload the merged file back to every Hadoop node as authorized_keys with permission 644.

To avoid the annoying host-key prompt on the very first ssh to each node, you can also build a shared known_hosts file the same way.

Honestly, this trick is a genuine time saver and blood-pressure reducer.
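The merge step itself needs nothing more than cat. A minimal sketch, using throwaway demo keys in place of the real id_rsa.pub files pulled from each node (the file and host names here are made up):

```shell
# Demo stand-ins for the id_rsa.pub files downloaded from each node;
# in real use these come from running ssh-keygen -t rsa on every machine.
mkdir -p /tmp/keymerge && cd /tmp/keymerge
echo 'ssh-rsa AAAA...key1 hadoop@node1' > id_rsa.pub.node1
echo 'ssh-rsa AAAA...key2 hadoop@node2' > id_rsa.pub.node2

# Concatenate every node's key into one authorized_keys and give it the
# 644 permission sshd expects; this file then goes back to every node.
cat id_rsa.pub.* > authorized_keys
chmod 644 authorized_keys
wc -l < authorized_keys   # prints 2: one line per node
```

The same concatenation works for building a shared known_hosts file.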

2. Installing LZO and LZOP

The Hadoop installation guides floating around the web are practically identical, all along these lines:

Installing LZO

wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.06.tar.gz
tar -zxvf lzo-2.06.tar.gz
./configure --enable-shared --prefix=/usr/local/hadoop/lzo/
make && make test && make install

Installing LZOP
wget http://www.lzop.org/download/lzop-1.03.tar.gz
tar -zxvf lzop-1.03.tar.gz
./configure --enable-shared --prefix=/usr/local/hadoop/lzop
make && make install

Isn't that tedious? Every Linux distribution ships its own package manager, yum on CentOS for example, so why hunt down source code and compile it yourself?

A single yum install lzo lzop -y takes care of everything!


3. Using LZO compression with Hadoop

LZO really is different from bzip2 and friends: there is no easy way to tell whether your setup is sound. It can happen that compression and decompression work fine on the Linux command line, yet fail inside Hadoop.

In any case, you must first be able to compress and decompress with lzop directly on Linux; that is the baseline. Of course that is not the whole story, but I'll hold off on the why for now.

Then drop in hive-2.0.0, create a table, load data, and run a plain select count(*) from xxx; if you get a number back, LZO support on Hadoop is at least working.


4. hive-2.0.0 changed everything

This is where I fell flat. In one sentence: hive-2.0.0 would not split my LZO files; no matter how large the file, it was treated as a single piece and processed as such. As it turned out, hadoop-lzo.jar was not actually being used. Without it, job submission fails outright, of course, but even when the job and its tasks do run, the LZO files are simply not split.

hive-2.0.0 is a very different beast from earlier versions!
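For the record, the usual recipe for splittable LZO in Hive uses the twitter/hadoop-lzo project's classes; a sketch, assuming that jar is on both Hive's and the cluster's classpath (the table name and paths below are hypothetical):

```shell
# Declare the splittable LZO input format on the table.
hive -e "
CREATE TABLE access_log (line STRING)
STORED AS
  INPUTFORMAT 'com.hadoop.mapred.DeprecatedLzoTextInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';
"

# Build a .index file next to each .lzo file; without these, splits cannot
# start at LZO block boundaries and each file becomes one giant split.
hadoop jar /path/to/hadoop-lzo.jar \
  com.hadoop.compression.lzo.DistributedLzoIndexer \
  /user/hive/warehouse/access_log
```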


5. Hadoop compression

This round I compared just two codecs: bzip2 and LZO.

LZO compresses and decompresses remarkably fast; its drawback is that the output is still large. Measured on nginx access.log data, the compression ratio lands somewhere between 1/3 and 1/2: a 12 GB file shrinks to around 5 GB, in roughly 20 seconds.

bzip2's advantage is its ratio: the same 12 GB file that LZO brings to 5 GB, bzip2 brings to 2 GB. But despite the high ratio, and despite bzip2 being natively splittable, the time cost will drive you mad: compressing that 12 GB nginx access log takes about 25 minutes, and decompressing about 1.5 minutes. No contest against LZO.
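As a sanity check on those numbers, the ratios work out as follows (a throwaway awk one-liner; sizes in GB from the measurements above):

```shell
# Compute compressed size as a percentage of the original 12 GB log.
awk 'BEGIN {
  orig = 12; lzo = 5; bzip2 = 2
  printf "LZO:   %.1f%% of original\n", 100 * lzo   / orig
  printf "bzip2: %.1f%% of original\n", 100 * bzip2 / orig
}'
```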

A reasonable policy: recompress data older than six months with bzip2, since it is rarely touched and the ratio pays off; the occasional access just takes a bit longer. Keep LZO for the fresh, computation-heavy workloads.
