In theory, per the official DataSphereStudio documentation, all the required environment variables were already set:
###HADOOP CONF DIR #/appcom/config/hadoop-config
HADOOP_CONF_DIR=/home/hadoop/hadoop-2.7.2/etc/hadoop
###HIVE CONF DIR #/appcom/config/hive-config
HIVE_CONF_DIR=/home/hadoop/apache-hive-2.3.3-bin/conf
###SPARK CONF DIR #/appcom/config/spark-config
SPARK_CONF_DIR=/home/hadoop/spark-2.4.8-bin-hadoop2.7/conf
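Before going further, a quick sanity script (my addition, not from the DSS docs) can confirm that each configured conf directory actually exists on the machine:

```shell
# Check that every configured conf dir is a real directory.
missing=0
for d in /home/hadoop/hadoop-2.7.2/etc/hadoop \
         /home/hadoop/apache-hive-2.3.3-bin/conf \
         /home/hadoop/spark-2.4.8-bin-hadoop2.7/conf; do
  if [ -d "$d" ]; then
    echo "ok:      $d"
  else
    echo "MISSING: $d"
    missing=$((missing + 1))
  fi
done
echo "$missing directory(ies) missing"
```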
But it still failed with the following error:
Caused by: org.apache.linkis.common.exception.ErrorException: errCode: 30000 ,desc: Necessary environment HIVE_CONF_DIR is not exists!(必须的环境变量 HIVE_CONF_DIR 不存在!) ,ip: hadoop0004 ,port: 9102 ,serviceKind: linkis-cg-engineconnmanager
at org.apache.linkis.ecm.core.launch.ProcessEngineConnLaunch$$anonfun$launch$1.apply(ProcessEngineConnLaunch.scala:115) ~[linkis-engineconn-manager-core-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.core.launch.ProcessEngineConnLaunch$$anonfun$launch$1.apply(ProcessEngineConnLaunch.scala:112) ~[linkis-engineconn-manager-core-1.1.1.jar:1.1.1]
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) ~[scala-library-2.11.12.jar:?]
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186) ~[scala-library-2.11.12.jar:?]
at org.apache.linkis.ecm.core.launch.ProcessEngineConnLaunch$class.launch(ProcessEngineConnLaunch.scala:112) ~[linkis-engineconn-manager-core-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.linux.launch.LinuxProcessEngineConnLaunch.launch(LinuxProcessEngineConnLaunch.scala:25) ~[linkis-engineconn-linux-launch-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.core.launch.EngineConnLaunchRunnerImpl.run(EngineConnLaunchRunner.scala:42) ~[linkis-engineconn-manager-core-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.server.service.impl.AbstractEngineConnLaunchService$$anonfun$launchEngineConn$1.apply$mcV$sp(AbstractEngineConnLaunchService.scala:88) ~[linkis-engineconn-manager-server-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.server.service.impl.AbstractEngineConnLaunchService$$anonfun$launchEngineConn$1.apply(AbstractEngineConnLaunchService.scala:86) ~[linkis-engineconn-manager-server-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.server.service.impl.AbstractEngineConnLaunchService$$anonfun$launchEngineConn$1.apply(AbstractEngineConnLaunchService.scala:86) ~[linkis-engineconn-manager-server-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.server.service.impl.AbstractEngineConnLaunchService.launchEngineConn(AbstractEngineConnLaunchService.scala:110) ~[linkis-engineconn-manager-server-1.1.1.jar:1.1.1]
at org.apache.linkis.ecm.server.service.impl.LinuxProcessEngineConnLaunchService.launchEngineConn(LinuxProcessEngineConnLaunchService.scala:36) ~[linkis-engineconn-manager-server-1.1.1.jar:1.1.1]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_262]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_262]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_262]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_262]
at org.apache.linkis.rpc.message.method.MessageExecutor.executeOneMethod(MessageExecutor.java:82) ~[linkis-rpc-1.1.1.jar:1.1.1]
... 111 more
Analyzing the source code:
ProcessEngineConnLaunch.scala:115
-- BDPConfiguration
-- val DEFAULT_PROPERTY_FILE_NAME = "linkis.properties"
In linkis.properties, the corresponding settings use lowercase, dot-separated keys:
#hadoop/hive/spark config
hadoop.config.dir=/home/hadoop/hadoop-2.7.2/etc/hadoop
hive.config.dir=/home/hadoop/apache-hive-2.3.3-bin/conf
spark.config.dir=/home/hadoop/spark-2.4.8-bin-hadoop2.7/conf
How the value is looked up:
val propsValue = sysProps.get(key).orElse(sys.props.get(key))
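That fallback order can be sketched in shell — a simplified model of the line above (the real code reads engine-launch sysProps first, then JVM system properties; environment variables reach the engine through the launch environment), using a throwaway properties file so the logic is easy to follow:

```shell
# Simplified model of the lookup: properties file first, then environment.
props=$(mktemp)
cat > "$props" <<'EOF'
hive.config.dir=/home/hadoop/apache-hive-2.3.3-bin/conf
EOF
# First branch: read the key from the properties file (the sysProps.get(key) side).
val=$(grep '^hive.config.dir=' "$props" | cut -d= -f2-)
# Fallback branch: the process environment (a stand-in for sys.props.get(key)).
val=${val:-$HIVE_CONF_DIR}
echo "resolved: $val"
rm -f "$props"
```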
The fix: put HIVE_CONF_DIR into the environment.
vi /etc/profile
Append the following at the end of the file:
export HIVE_CONF_DIR=/home/hadoop/apache-hive-2.3.3-bin/conf
After saving, reload the profile:
source /etc/profile
Check that it took effect:
echo $HIVE_CONF_DIR
It should print /home/hadoop/apache-hive-2.3.3-bin/conf.
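An exported-but-wrong path fails the same way, so it is also worth checking that the directory really contains a Hive config file (my own extra check; demonstrated on a scratch directory — on a real machine substitute "$HIVE_CONF_DIR" for conf_dir):

```shell
# A scratch directory stands in for $HIVE_CONF_DIR so the check is reproducible.
conf_dir=$(mktemp -d)
touch "$conf_dir/hive-site.xml"   # a real Hive conf dir carries this file
if [ -d "$conf_dir" ] && [ -f "$conf_dir/hive-site.xml" ]; then
  result="valid"
else
  result="missing hive-site.xml"
fi
echo "conf dir: $result"
rm -rf "$conf_dir"
```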
Restart the services:
cd /home/hadoop/dss_linkis/bin
sh stop-all.sh
sh start-all.sh
Next, a new error:
Caused by: java.io.IOException: no permission to mkdir path /home/hadoop/appcom/tmp/hadoop/20230815/hive/20e15726-e575-43d5-a4ee-52f20390e110
chmod -R 777 /home/hadoop/appcom/tmp
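chmod -R 777 works but is blunt. A narrower alternative (an assumption on my part: the EngineConn process runs as the submitting user, hadoop here) is to pre-create the tmp root with group-writable permissions instead, sketched on a scratch path:

```shell
# Demo path; the real directory in this article is /home/hadoop/appcom/tmp.
tmp_root=$(mktemp -d)/appcom/tmp
mkdir -p "$tmp_root"
chmod 775 "$tmp_root"               # owner+group rwx, others r-x: tighter than 777
perms=$(stat -c '%a' "$tmp_root")   # GNU stat; use `stat -f '%Lp'` on BSD/macOS
echo "mode: $perms"
```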
That produced a different error:
21304, Task is Failed,errorMsg: FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
The log file linkis-cg-entrance.log showed a warning:
2023-08-15 14:47:36.464 [WARN ] [Orchestrator-Listener-Asyn-Thread-Thread-0] o.a.l.e.l.HDFSCacheLogWriter (93) [apply] - hdfs:///tmp/linkis/log/2023-08-15/IDE/hadoop/20.log error when write query log to outputStream. java.lang.NullPointerException: null
Following the 20.log file mentioned there, the underlying error is:
NestedThrowablesStackTrace:
java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
The error is unambiguous: the class cannot be found, so it is not on the classpath in use. Under $HIVE_HOME/lib the jar datanucleus-api-jdo-4.2.4.jar does exist, yet the error insists the class is missing — so it is clearly not being loaded from there. The stack trace below shows where the lookup actually happens:
2023-08-16 17:15:28.043 WARN [Linkis-Default-Scheduler-Thread-15] org.apache.hadoop.hive.metastore.HiveMetaStore 653 createDefaultDB - Retrying creating default database after error: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found. javax.jdo.JDOFatalUserException: Class org.datanucleus.api.jdo.JDOPersistenceManagerFactory was not found.
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1175) ~[hive-exec-2.3.3.jar:2.3.3]
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) ~[hive-exec-2.3.3.jar:2.3.3]
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:519) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:548) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:403) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:340) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:301) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76) ~[hadoop-common-2.7.2.jar:?]
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136) ~[hadoop-common-2.7.2.jar:?]
at org.apache.hadoop.hive.metastore.RawStoreProxy.(RawStoreProxy.java:58) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStoreForConf(HiveMetaStore.java:624) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMSForConf(HiveMetaStore.java:590) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:584) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:651) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:427) ~[hive-exec-2.3.3.jar:2.3.3]
at sun.reflect.GeneratedMethodAccessor80.invoke(Unknown Source) ~[?:?]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_262]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_262]
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invokeInternal(RetryingHMSHandler.java:148) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.(RetryingHMSHandler.java:79) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:92) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6893) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.(HiveMetaStoreClient.java:164) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.(SessionHiveMetaStoreClient.java:70) ~[hive-exec-2.3.3.jar:2.3.3]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_262]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[?:1.8.0_262]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_262]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_262]
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1699) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.(RetryingMetaStoreClient.java:83) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3600) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3652) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3632) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3894) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:248) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.(Hive.java:388) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.createHiveDB(BaseSemanticAnalyzer.java:236) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.(BaseSemanticAnalyzer.java:215) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.(SemanticAnalyzer.java:362) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.parse.CalcitePlanner.(CalcitePlanner.java:267) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory.get(SemanticAnalyzerFactory.java:318) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:484) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:377) ~[hive-exec-2.3.3.jar:2.3.3]
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:366) ~[hive-exec-2.3.3.jar:2.3.3]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_262]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_262]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_262]
at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_262]
at org.apache.linkis.engineplugin.hive.executor.HiveDriverProxy.compile(HiveEngineConnExecutor.scala:496) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anonfun$org$apache$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL$1.apply$mcV$sp(HiveEngineConnExecutor.scala:175) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anonfun$org$apache$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL$1.apply(HiveEngineConnExecutor.scala:174) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anonfun$org$apache$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL$1.apply(HiveEngineConnExecutor.scala:174) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.org$apache$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL(HiveEngineConnExecutor.scala:183) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:139) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:132) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_262]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_262]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) ~[hadoop-common-2.7.2.jar:?]
at org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.executeLine(HiveEngineConnExecutor.scala:132) ~[linkis-engineplugin-hive-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10$$anonfun$apply$11.apply(ComputationExecutor.scala:181) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10$$anonfun$apply$11.apply(ComputationExecutor.scala:180) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10.apply(ComputationExecutor.scala:182) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2$$anonfun$apply$10.apply(ComputationExecutor.scala:176) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at scala.collection.immutable.Range.foreach(Range.scala:160) ~[scala-library-2.11.12.jar:?]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:175) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$toExecuteTask$2.apply(ComputationExecutor.scala:151) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.toExecuteTask(ComputationExecutor.scala:228) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:243) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor$$anonfun$3.apply(ComputationExecutor.scala:243) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryFinally(Utils.scala:61) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:55) ~[linkis-accessible-executor-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:49) ~[linkis-accessible-executor-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.ensureOp(ComputationExecutor.scala:135) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.execute.ComputationExecutor.execute(ComputationExecutor.scala:242) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl.org$apache$linkis$engineconn$computation$executor$service$TaskExecutionServiceImpl$$executeTask(TaskExecutionServiceImpl.scala:288) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2$$anonfun$run$2.apply$mcV$sp(TaskExecutionServiceImpl.scala:221) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2$$anonfun$run$2.apply(TaskExecutionServiceImpl.scala:219) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2$$anonfun$run$2.apply(TaskExecutionServiceImpl.scala:219) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryCatch(Utils.scala:40) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:69) ~[linkis-common-1.1.1.jar:1.1.1]
at org.apache.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$2.run(TaskExecutionServiceImpl.scala:219) ~[linkis-computation-engineconn-1.1.1.jar:1.1.1]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_262]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_262]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_262]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) ~[?:1.8.0_262]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_262]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_262]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_262]
Caused by: java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:382) ~[?:1.8.0_262]
at java.lang.ClassLoader.loadClass(ClassLoader.java:418) ~[?:1.8.0_262]
at java.lang.ClassLoader.loadClass(ClassLoader.java:351) ~[?:1.8.0_262]
at java.lang.Class.forName0(Native Method) ~[?:1.8.0_262]
at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_262]
at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018) ~[hive-exec-2.3.3.jar:2.3.3]
at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016) ~[hive-exec-2.3.3.jar:2.3.3]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_262]
at javax.jdo.JDOHelper.forName(JDOHelper.java:2015) ~[hive-exec-2.3.3.jar:2.3.3]
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162) ~[hive-exec-2.3.3.jar:2.3.3]
... 99 more
The call originates from:
org.apache.linkis.engineplugin.hive.executor.HiveEngineConnExecutor
whose jar is linkis-engineplugin-hive-1.1.1.jar. Locate it:
cd /home/hadoop/dss_linkis/linkis/lib
find . -name linkis-engineplugin-hive-1.1.1.jar
The file turns up in two places:
./linkis-engineconn-plugins/hive/dist/v2.3.3/lib/linkis-engineplugin-hive-1.1.1.jar
./linkis-engineconn-plugins/hive/plugin/2.3.3/linkis-engineplugin-hive-1.1.1.jar
In the directory ./linkis-engineconn-plugins/hive/dist/v2.3.3/lib/ we also see hive-exec-2.3.3.jar. No need to dig through the code any further — a safe guess is that the missing datanucleus-api-jdo-4.2.4.jar should sit alongside hive-exec-2.3.3.jar.
So, from that lib directory:
cp $HIVE_HOME/lib/datanucleus-api-jdo-4.2.4.jar ./
Re-running produced a different error — expected, since only one of the missing jars had been copied over:
2023-08-17 15:07:25.069 ERROR [Linkis-Default-Scheduler-Thread-10] DataNucleus.Datastore 125 error - Exception thrown creating StoreManager. See the nested exception org.datanucleus.exceptions.NucleusUserException: There is no available StoreManager of type "rdbms". Make sure that you have put the relevant DataNucleus store plugin in your CLASSPATH and if defining a connection via JNDI or DataSource you also need to provide persistence property "datanucleus.storeManagerType"
Copy the next one:
cp $HIVE_HOME/lib/datanucleus-rdbms-4.1.19.jar ./
The error changed yet again; see the follow-up article for the rest of the fix.
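Since each missing DataNucleus jar only surfaces after the previous one is fixed, a shortcut is to copy the whole family in one pass. The jar names and versions below are assumptions based on a Hive 2.3.3 distribution — list your own with `ls $HIVE_HOME/lib/datanucleus-*.jar` first. The copy logic is demonstrated with scratch directories:

```shell
# Scratch dirs stand in for $HIVE_HOME/lib and the plugin's dist lib directory.
src=$(mktemp -d); dst=$(mktemp -d)
touch "$src/datanucleus-api-jdo-4.2.4.jar" \
      "$src/datanucleus-core-4.1.17.jar" \
      "$src/datanucleus-rdbms-4.1.19.jar"
# One glob copies every DataNucleus jar instead of chasing them one at a time.
cp "$src"/datanucleus-*.jar "$dst"/
copied=$(ls "$dst" | wc -l)
echo "copied $copied jar(s)"
rm -rf "$src" "$dst"
```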