Hive Installation Problems and Fixes

2023-10-13 · Big Data · Hive

# Logging jar conflict

The warning tells us that there are multiple SLF4J bindings on the classpath. The fix is to keep just one binding and remove the other (a removal sketch follows the error log below). The two bindings involved are:

  1. log4j-slf4j-impl-2.17.1.jar: this binding sits in Hive's lib directory.
  2. slf4j-reload4j-1.7.35.jar: this binding sits in Hadoop's share/hadoop/common/lib directory.

```
[root@flume apache-hive-3.1.3-bin]# bin/schematool -dbType mysql -initSchema
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/soft/hive/apache-hive-3.1.3-bin/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/soft/hadoop/hadoop-3.2.4/share/hadoop/common/lib/slf4j-reload4j-1.7.35.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
        at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
        at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
        at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
        at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5144)
        at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5107)
        at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
        at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
```
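
A minimal sketch of removing one binding, assuming the jar paths shown in the log above (renaming instead of deleting makes it easy to roll back):

```bash
# Keep only one SLF4J binding: move Hive's copy out of lib/
# (paths taken from the log above; adjust to your install locations).
cd /usr/local/soft/hive/apache-hive-3.1.3-bin
mv lib/log4j-slf4j-impl-2.17.1.jar lib/log4j-slf4j-impl-2.17.1.jar.bak
```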

# Guava version conflict

Running the command again shows that the guava versions conflict as well (error below). The fix is to delete Hive's guava-19.0.jar and copy guava-27.0-jre.jar from /usr/local/soft/hadoop/hadoop-3.2.4/share/hadoop/common/lib/ into /usr/local/soft/hive/apache-hive-3.1.3-bin/lib/; a sketch of the swap follows the log.

```
[root@flume apache-hive-3.1.3-bin]# bin/schematool -dbType mysql -initSchema
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
        at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
        at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
        at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
        at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
        at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5144)
        at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5107)
        at org.apache.hive.beeline.HiveSchemaTool.<init>(HiveSchemaTool.java:96)
        at org.apache.hive.beeline.HiveSchemaTool.main(HiveSchemaTool.java:1473)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
```
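
A sketch of the jar swap, assuming the install paths shown in the logs (the exact guava versions may differ between releases):

```bash
# Replace Hive's bundled guava with the newer version shipped by Hadoop
# (paths and versions taken from this setup; adjust to yours).
rm /usr/local/soft/hive/apache-hive-3.1.3-bin/lib/guava-19.0.jar
cp /usr/local/soft/hadoop/hadoop-3.2.4/share/hadoop/common/lib/guava-27.0-jre.jar \
   /usr/local/soft/hive/apache-hive-3.1.3-bin/lib/
```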

# Configuration file error

This error is very common: the message points to line 3215 of the hive-site.xml configuration file, so open the file and jump to line 3215.

```
[root@flume apache-hive-3.1.3-bin]# bin/schematool -dbType mysql -initSchema
Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Illegal character entity: expansion character (code 0x8
 at [row,col,system-id]: [3215,96,"file:/usr/local/soft/hive/apache-hive-3.1.3-bin/conf/hive-site.xml"]
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3040)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2989)
        at org.apache.hadoop.conf.Configuration.loadProps(Configuration.java:2862)
        at org.apache.hadoop.conf.Configuration.addResourceObject(Configuration.java:1012)
        at org.apache.hadoop.conf.Configuration.addResource(Configuration.java:917)
        at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5154)
        at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5107)
```

The culprit is the illegal character entity `&#8;` embedded in the description text on that line; simply deleting the description (or just the entity) fixes the parse error:

```xml
<!-- before the change -->
<description>
    Ensures commands with OVERWRITE (such as INSERT OVERWRITE) acquire Exclusive locks for&#8;transactional tables.  This ensures that inserts (w/o overwrite) running concurrently
    are not hidden by the INSERT OVERWRITE.
</description>

<!-- after the change -->
<description></description>
```
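
If editing by hand is inconvenient, here is a quick sketch with sed, assuming the file path from the error message (back the file up first):

```bash
# Back up the file, show the offending line, then strip the illegal &#8; entity in place.
cp /usr/local/soft/hive/apache-hive-3.1.3-bin/conf/hive-site.xml{,.bak}
sed -n '3215p' /usr/local/soft/hive/apache-hive-3.1.3-bin/conf/hive-site.xml
sed -i 's/&#8;//g' /usr/local/soft/hive/apache-hive-3.1.3-bin/conf/hive-site.xml
```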

# Database connection failure

This usually means the username or password is wrong, or that MySQL does not accept remote connections from this host. Enable remote access for the connecting account on the MySQL server (the steps differ slightly between Linux and Windows); a sketch is given after the log below.

```
[root@flume apache-hive-3.1.3-bin]# bin/schematool -dbType mysql -initSchema
Metastore connection URL:        jdbc:mysql://192.168.95.112:3306/hive?serverTimezone=Asia/Shanghai
Metastore Connection Driver :    com.mysql.cj.jdbc.Driver
Metastore connection User:       root
org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version.
Underlying cause: java.sql.SQLException : Access denied for user 'root'@'LAPTOP-M8G7FNEL' (using password: YES)
SQL Error code: 1045
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
```
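
A hedged sketch of enabling remote access (MySQL 8.x syntax; the password and the '%' host wildcard are placeholders, tighten both for real deployments):

```bash
# Let the JDBC user from the connection URL log in from remote hosts
# (hypothetical password; restrict the host pattern in production).
mysql -uroot -p -e "CREATE USER IF NOT EXISTS 'root'@'%' IDENTIFIED BY 'YourPassword'; \
GRANT ALL PRIVILEGES ON hive.* TO 'root'@'%'; FLUSH PRIVILEGES;"
```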

# Unknown database

The configuration specifies hive as the metastore database name, so you need to create that database first; a minimal sketch follows.
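
A minimal sketch, assuming the database name hive from the JDBC URL above:

```bash
# Create the metastore database before re-running schematool
# (run on the MySQL host; the root account is just an example).
mysql -uroot -p -e "CREATE DATABASE IF NOT EXISTS hive;"
```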
