hadoop 3.3.0, flink 1.12, hive 3.1.2. I want to integrate Hive and Flink. After configuring the sql-client-defaults.yaml file as follows:
catalogs:
  - name: default_catalog
    type: hive
    hive-conf-dir: /cdc/apache-hive-3.1.2-bin/conf
I start the Flink SQL client, but it reports the following error:
[root@dhf4 bin]# ./sql-client.sh embedded
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/cdc/flink-1.12.0/lib/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/cdc/hadoop-3.3.0/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See https://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
No default environment specified.
Searching for '/cdc/flink-1.12.0/conf/sql-client-defaults.yaml'...found.
Reading default environment from: file:/cdc/flink-1.12.0/conf/sql-client-defaults.yaml
No session environment specified.
2021-01-20 10:12:38,179 INFO org.apache.hadoop.hive.conf.HiveConf [] - Found configuration file file:/cdc/apache-hive-3.1.2-bin/conf/hive-site.xml
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:208)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878)
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226)
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196)
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1361)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109)
at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:211)
at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:164)
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:89)
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:384)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634)
at java.util.HashMap.forEach(HashMap.java:1289)
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633)
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632)
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185)
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138)
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867)
... 3 more
The log contains the following:
[root@dhf4 bin]# cat ../log/flink-root-sql-client-dhf4.log
2021-01-20 10:12:36,246 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.rpc.address, localhost
2021-01-20 10:12:36,252 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.rpc.port, 6123
2021-01-20 10:12:36,252 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.memory.process.size, 1600m
2021-01-20 10:12:36,252 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: taskmanager.memory.process.size, 1728m
2021-01-20 10:12:36,252 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: taskmanager.numberOfTaskSlots, 1
2021-01-20 10:12:36,256 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: parallelism.default, 1
2021-01-20 10:12:36,256 INFO org.apache.flink.configuration.GlobalConfiguration [] - Loading configuration property: jobmanager.execution.failover-strategy, region
2021-01-20 10:12:36,394 INFO org.apache.flink.table.client.gateway.local.LocalExecutor [] - Using default environment file: file:/cdc/flink-1.12.0/conf/sql-client-defaults.yaml
2021-01-20 10:12:36,754 INFO org.apache.flink.table.client.config.entries.ExecutionEntry [] - Property 'execution.restart-strategy.type' not specified. Using default value: fallback
2021-01-20 10:12:38,179 INFO org.apache.hadoop.hive.conf.HiveConf [] - Found configuration file file:/cdc/apache-hive-3.1.2-bin/conf/hive-site.xml
2021-01-20 10:12:38,404 ERROR org.apache.flink.table.client.SqlClient [] - SQL Client must stop. Unexpected exception. This is a bug. Please consider filing an issue.
org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:878) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:226) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.SqlClient.main(SqlClient.java:196) [flink-sql-client_2.11-1.12.0.jar:1.12.0]
Caused by: java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1380) ~[hadoop-common-3.3.0.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1361) ~[hadoop-common-3.3.0.jar:?]
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448) ~[hadoop-mapreduce-client-core-3.3.0.jar:?]
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141) ~[flink-sql-connector-hive-3.1.2_2.11-1.12.0.jar:1.12.0]
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5109) ~[flink-sql-connector-hive-3.1.2_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.catalog.hive.HiveCatalog.createHiveConf(HiveCatalog.java:211) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
at org.apache.flink.table.catalog.hive.HiveCatalog.<init>(HiveCatalog.java:164) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
at org.apache.flink.table.catalog.hive.factories.HiveCatalogFactory.createCatalog(HiveCatalogFactory.java:89) ~[flink-connector-hive_2.12-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext.createCatalog(ExecutionContext.java:384) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$null$5(ExecutionContext.java:634) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at java.util.HashMap.forEach(HashMap.java:1289) ~[?:1.8.0_272]
at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$6(ExecutionContext.java:633) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext.wrapClassLoader(ExecutionContext.java:266) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:632) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:529) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:185) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:138) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:867) ~[flink-sql-client_2.11-1.12.0.jar:1.12.0]
... 3 more
I have tried many suggested fixes, such as aligning the Guava versions, but none of them work. Is there another solution?
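Since the `NoSuchMethodError` on `com.google.common.base.Preconditions.checkArgument` is a classic sign of conflicting Guava jars on the classpath (Hadoop 3.3.0 ships a newer Guava than the one Hive/Flink may load first), here is a small sketch of how I check which Guava jars each installation contributes. The `/cdc/...` paths match my layout above; adjust them for your machine.

```shell
#!/bin/sh
# List every Guava jar that can end up on the SQL client's classpath.
# Paths below are from my installation; change them to match yours.
for dir in /cdc/flink-1.12.0/lib \
           /cdc/hadoop-3.3.0/share/hadoop/common/lib \
           /cdc/apache-hive-3.1.2-bin/lib; do
  echo "== $dir =="
  # '|| true' keeps the loop going even if a directory is missing.
  find "$dir" -maxdepth 1 -name 'guava-*.jar' 2>/dev/null || true
done
```

If this prints more than one Guava version (e.g. an old `guava-19.0.jar` alongside Hadoop's `guava-27.x-jre.jar`), the older jar that loads first is the likely cause of the error above.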