On my own virtual machine I use MySQL as the Hive metastore. The configuration changes are:
- <property>
- <name>javax.jdo.option.ConnectionURL</name>
- <value>jdbc:mysql://localhost:3306/metastore</value>
- <description>JDBC connect string for a JDBC metastore</description>
- </property>
- <property>
- <name>javax.jdo.option.ConnectionDriverName</name>
- <value>com.mysql.jdbc.Driver</value>
- <description>Driver class name for a JDBC metastore</description>
- </property>
- <property>
- <name>javax.jdo.option.ConnectionUserName</name>
- <value>hive</value>
- <description>username to use against metastore database</description>
- </property>
- <property>
- <name>javax.jdo.option.ConnectionPassword</name>
- <value>hive</value>
- <description>password to use against metastore database</description>
- </property>
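For this configuration to work, the `metastore` database and the `hive`/`hive` account referenced above must already exist in MySQL. A minimal setup sketch; these statements are my assumption of the required preparation, not part of the original post:

```sql
-- Create the metastore database and the hive account that
-- hive-site.xml above points at (names/password must match).
CREATE DATABASE metastore;
CREATE USER 'hive'@'localhost' IDENTIFIED BY 'hive';
GRANT ALL PRIVILEGES ON metastore.* TO 'hive'@'localhost';
FLUSH PRIVILEGES;
```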
Then copy the MySQL JDBC driver jar into HIVE_HOME/lib.
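The driver install is just a file copy. A self-contained sketch using stand-in paths; the jar name/version is hypothetical, and the temp directory stands in for the real HIVE_HOME:

```shell
# Stand-ins: in practice HIVE_HOME is your actual Hive install and the
# jar is the Connector/J release you downloaded (version here is hypothetical).
HIVE_HOME=$(mktemp -d)
mkdir -p "$HIVE_HOME/lib"
touch mysql-connector-java-5.1.18-bin.jar   # stand-in for the downloaded jar
cp mysql-connector-java-5.1.18-bin.jar "$HIVE_HOME/lib/"
ls "$HIVE_HOME/lib"
```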
Next I logged into the Hive CLI and ran the `show databases;` command, which threw an exception:
- [ruizhe@localhost ~]$ hive
- Hive history file=/tmp/ruize/hive_job_log_ruize_201204091822_467986476.txt
- hive> show databases;
- FAILED: Error in metadata: javax.jdo.JDOFatalInternalException: Unexpected exception caught.
- NestedThrowables:
- java.lang.reflect.InvocationTargetException
- FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
- hive>
This problem bothered me for a long time: the same configuration worked on my own laptop but failed on the company machine.
A web search finally turned up the following fix:
- delete $HADOOP_HOME/build and everything should be fine
Switch to the HADOOP_HOME directory:
- [hadoop@localhost build]$ pwd
- /opt/app/hadoop-0.20.2-cdh3u3/build
- [hadoop@localhost build]$ ls
- ant  c++  classes  contrib  examples  hadoop-core-0.20.2-cdh3u3.jar  hadoop-tools-0.20.2-cdh3u3.jar  ivy  src  test  tools  webapps
- [hadoop@localhost build]$
Sure enough, HADOOP_HOME/build contained freshly built output (I had rebuilt hadoop with ant). Presumably the hadoop launcher picks up the classes and jars under $HADOOP_HOME/build when that directory exists, so the rebuilt jars conflicted with the installed ones during Hive's metastore initialization.
Simply delete the build directory:
- [hadoop@localhost hadoop-0.20.2-cdh3u3]$ rm -rf build
Re-enter the Hive CLI:
- [ruize@localhost ~]$ hive
- Hive history file=/tmp/ruize/hive_job_log_ruize_201204091826_1452110220.txt
- hive> show databases;
- OK
- default
- Time taken: 1.786 seconds
- hive>
This time it works!