Linkis 1.0.2 Hand-Holding Installation Guide & Pitfall Diary (Part 2)

💡 After a whole day of stepping into pits yesterday, I was starting to doubt my own skills... why does everyone else's install go so smoothly 😂 Nothing for it but to grit my teeth and keep going.

The pitfall tour continues

Recompiling

Changing the dependencies

Lying in bed that night, I kept turning over where the problem could be. Then it hit me: we run the CDH builds of Hadoop and Hive. Could they behave differently from the Apache versions? So I quickly changed the dependency in linkis-engineconn-plugins/engineconn-plugins/hive and tried again:

<properties>
    <hive.version>1.1.0-cdh5.8.3</hive.version>
</properties>

But that alone caused a problem:

Could not find artifact org.apache.hive:hive-common:pom:1.1.0-cdh5.8.3 in central (https://repo.maven.apache.org/maven2)

A quick look at Maven showed that the Cloudera repository needs to be configured; I simply added it to the project itself:

<repositories>
    <!-- add the Aliyun repo as well, in case artifacts can't be resolved from Cloudera alone -->
    <repository>
        <id>aliyun</id>
        <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
        <releases>
            <enabled>true</enabled>
        </releases>
    </repository>
    <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
        <releases>
            <enabled>true</enabled>
        </releases>
    </repository>
</repositories>
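
As a quick sanity check before kicking off a full rebuild (this isn't one of the original steps, just a convenience), you can ask Maven to fetch the exact artifact from the error message directly from the Cloudera repo:

# one-off fetch of the artifact that failed to resolve, pulled straight from the Cloudera repo
mvn dependency:get \
    -Dartifact=org.apache.hive:hive-common:1.1.0-cdh5.8.3 \
    -DremoteRepositories=cloudera::default::https://repository.cloudera.com/artifactory/cloudera-repos/

If that downloads cleanly, the repository configuration is good.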

Dependency changes done.

Upgrading to Scala 2.12

Change the following places:

<properties>
    ...
    <scala.version>2.12.8</scala.version>
    ...
    <scala.binary.version>2.12</scala.binary.version>
    ...
</properties>

Starting the build
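
For anyone following along, the build itself is just a plain Maven build run from the project root; something along these lines should work (the exact flags may differ in your environment):

# install the parent pom first, then build the whole project without running tests
mvn -N install
mvn clean install -DskipTests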

The build itself was genuinely rocky and threw error after error at me.

In the end they basically all came down to scaladoc generation failing after the Scala 2.12 upgrade.

So let's cut straight to the fix:

<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.2.2</version>
    <executions>
        <execution>
            <id>eclipse-add-source</id>
            <goals>
                <goal>add-source</goal>
            </goals>
        </execution>
        <execution>
            <id>scala-compile-first</id>
            <phase>process-resources</phase>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
        <execution>
            <id>scala-test-compile-first</id>
            <phase>process-test-resources</phase>
            <goals>
                <goal>testCompile</goal>
            </goals>
        </execution>
        <!-- just comment these out -->
<!--                        <execution>-->
<!--                            <id>attach-scaladocs</id>-->
<!--                            <phase>verify</phase>-->
<!--                            <goals>-->
<!--                                <goal>doc-jar</goal>-->
<!--                            </goals>-->
<!--                        </execution>-->
    </executions>
    <configuration>
        <scalaVersion>${scala.version}</scalaVersion>
        <recompileMode>incremental</recompileMode>
        <useZincServer>true</useZincServer>
    </configuration>
</plugin>

Just comment out the scaladoc-related execution inside scala-maven-plugin and you're good 😅

Build complete, and it ran almost twice as fast as before... presumably because scaladoc is no longer being generated.

Installation and deployment

I won't repeat the process here; just follow Linkis 1.0.2 Hand-Holding Installation Guide & Pitfall Diary (Part 1).

Verification

Hive engine

[codeweaver@bd15-21-32-217 bin]$ ./linkis-cli-hive -code "SELECT * from mob_bg_devops.servers_exps_weekly_with_wh;" -submitUser codeweaver -proxyUser codeweaver
[INFO] LogFile path: /home/codeweaver/linkis_scala_2.12/logs/linkis-cli//linkis-client.codeweaver.log.20211230144640746919669
[INFO] User does not provide usr-configuration file. Will use default config
[INFO] connecting to linkis gateway:http://127.0.0.1:9001
JobId:9
TaskId:9
ExecId:exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_hive_0
[INFO] Job is successfully submitted!

2021-12-30 14:46:44.046 INFO Program is substituting variables for you
2021-12-30 14:46:44.046 INFO Variables substitution ended successfully
2021-12-30 14:46:44.046 WARN You submitted a sql without limit, DSS will add limit 5000 to your sql
2021-12-30 14:46:44.046 INFO SQL code check has passed
job is scheduled.
2021-12-30 14:46:45.046 INFO Your job is Scheduled. Please wait it to run.
Job with jobId : LINKISCLI_codeweaver_hive_0 and execID : LINKISCLI_codeweaver_hive_0 submitted
Your job is being scheduled by orchestrator.
2021-12-30 14:46:46.046 INFO You have submitted a new job, script code (after variable substitution) is
************************************SCRIPT CODE************************************
SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
************************************SCRIPT CODE************************************
2021-12-30 14:46:46.046 INFO Your job is accepted,  jobID is LINKISCLI_codeweaver_hive_0 and taskID is 9 in ServiceInstance(linkis-cg-entrance, bd15-21-32-217:9104). Please wait it to be scheduled
2021-12-30 14:46:46.046 INFO job is running.
2021-12-30 14:46:46.046 INFO Your job is Running now. Please wait it to complete.
Job with jobGroupId : 9 and subJobId : 9 was submitted to Orchestrator.
2021-12-30 14:46:46.046 INFO Background is starting a new engine for you, it may take several seconds, please wait
2021-12-30 14:47:11.047 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, bd15-21-32-217:23469) /tmp/codeweaver/linkis_dev/codeweaver/workDir/dd14b81d-f4ed-48f4-939e-696324aed4fe/logs
HiveEngineExecutor_0 >> SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
2021-12-30 14:47:13.144 ERROR [Linkis-Default-Scheduler-Thread-3] com.webank.wedatasphere.linkis.engineplugin.hive.executor.HiveEngineConnExecutor 200 com$webank$wedatasphere$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL - query failed, reason : java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
	at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.8.jar:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) [scala-library-2.12.8.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException$NoNodeException
	at java.lang.Class.forName0(Native Method) ~[?:1.8.0_181]
	at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_181]
	... 39 more
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_181]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_181]
	at java.lang.Class.forName0(Native Method) ~[?:1.8.0_181]
	at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_181]
	... 39 more
2021-12-30 14:47:13.171 ERROR [Linkis-Default-Scheduler-Thread-3] com.webank.wedatasphere.linkis.engineplugin.hive.executor.HiveEngineConnExecutor 57 error - execute code failed! java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
	at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.8.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:39) [linkis-common-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:68) [linkis-common-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1.run(TaskExecutionServiceImpl.scala:170) [linkis-computation-engineconn-1.0.2.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException$NoNodeException
	at java.lang.Class.forName0(Native Method) ~[?:1.8.0_181]
	at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_181]
	at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2013) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1978) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.getLockManager(DummyTxnManager.java:70) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.acquireLocks(DummyTxnManager.java:101) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.acquireLocksAndOpenTxn(Driver.java:1032) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1308) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1127) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1115) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	... 39 more
Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException$NoNodeException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_181]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_181]
	at java.lang.Class.forName0(Native Method) ~[?:1.8.0_181]
	at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_181]
	at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2013) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1978) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.getLockManager(DummyTxnManager.java:70) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.acquireLocks(DummyTxnManager.java:101) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.acquireLocksAndOpenTxn(Driver.java:1032) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1308) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1127) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1115) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	... 39 more
2021-12-30 14:47:13.189 ERROR [Linkis-Default-Scheduler-Thread-3] com.webank.wedatasphere.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl 57 error - null java.lang.reflect.InvocationTargetException: null
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
	at com.webank.wedatasphere.linkis.engineplugin.hive.executor.HiveDriverProxy.run(HiveEngineConnExecutor.scala:456) ~[linkis-engineplugin-hive-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.com$webank$wedatasphere$linkis$engineplugin$hive$executor$HiveEngineConnExecutor$$executeHQL(HiveEngineConnExecutor.scala:163) ~[linkis-engineplugin-hive-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:127) ~[linkis-engineplugin-hive-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineplugin.hive.executor.HiveEngineConnExecutor$$anon$1.run(HiveEngineConnExecutor.scala:120) ~[linkis-engineplugin-hive-1.0.2.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628) ~[hadoop-common-2.6.0.jar:?]
	at com.webank.wedatasphere.linkis.engineplugin.hive.executor.HiveEngineConnExecutor.executeLine(HiveEngineConnExecutor.scala:120) ~[linkis-engineplugin-hive-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$14(ComputationExecutor.scala:179) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:39) ~[linkis-common-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$13(ComputationExecutor.scala:181) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$13$adapted(ComputationExecutor.scala:174) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.8.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$toExecuteTask$1(ComputationExecutor.scala:174) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryFinally(Utils.scala:60) ~[linkis-common-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.toExecuteTask(ComputationExecutor.scala:222) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.$anonfun$execute$2(ComputationExecutor.scala:237) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryFinally(Utils.scala:60) ~[linkis-common-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:54) ~[linkis-accessible-executor-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.acessible.executor.entity.AccessibleExecutor.ensureIdle(AccessibleExecutor.scala:48) ~[linkis-accessible-executor-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.ensureOp(ComputationExecutor.scala:133) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.execute.ComputationExecutor.execute(ComputationExecutor.scala:237) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl.com$webank$wedatasphere$linkis$engineconn$computation$executor$service$TaskExecutionServiceImpl$$executeTask(TaskExecutionServiceImpl.scala:239) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1.$anonfun$run$1(TaskExecutionServiceImpl.scala:172) ~[linkis-computation-engineconn-1.0.2.jar:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) [scala-library-2.12.8.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryCatch(Utils.scala:39) [linkis-common-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.common.utils.Utils$.tryAndWarn(Utils.scala:68) [linkis-common-1.0.2.jar:?]
	at com.webank.wedatasphere.linkis.engineconn.computation.executor.service.TaskExecutionServiceImpl$$anon$1.run(TaskExecutionServiceImpl.scala:170) [linkis-computation-engineconn-1.0.2.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.lang.NoClassDefFoundError: org/apache/zookeeper/KeeperException$NoNodeException
	at java.lang.Class.forName0(Native Method) ~[?:1.8.0_181]
	at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_181]
	at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2013) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1978) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.getLockManager(DummyTxnManager.java:70) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.acquireLocks(DummyTxnManager.java:101) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.acquireLocksAndOpenTxn(Driver.java:1032) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1308) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1127) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1115) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	... 39 more
Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException$NoNodeException
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_181]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_181]
	at java.lang.Class.forName0(Native Method) ~[?:1.8.0_181]
	at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_181]
	at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2013) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1978) ~[hadoop-common-2.6.0.jar:?]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.getLockManager(DummyTxnManager.java:70) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.lockmgr.DummyTxnManager.acquireLocks(DummyTxnManager.java:101) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.acquireLocksAndOpenTxn(Driver.java:1032) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1308) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1127) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1115) ~[hive-exec-1.1.0-cdh5.8.3.jar:1.1.0-cdh5.8.3]
	... 39 more
2021-12-30 14:47:13.047 ERROR Task is Failed,errorMsg: null
2021-12-30 14:47:13.047 INFO job is completed.
2021-12-30 14:47:13.047 INFO Task creation time(任务创建时间): 2021-12-30 14:46:43, Task scheduling time(任务调度时间): 2021-12-30 14:46:45, Task start time(任务开始时间): 2021-12-30 14:46:46, Mission end time(任务结束时间): 2021-12-30 14:47:13
2021-12-30 14:47:13.047 INFO Your mission(您的任务) 9 The total time spent is(总耗时时间为): 29.8
2021-12-30 14:47:13.047 INFO Sorry. Your job completed with a status Failed. You can view logs for the reason.

[INFO] Job failed! Will not try get execute result.
============Result:================
TaskId:9
ExecId: exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_hive_0
User:codeweaver
Current job status:FAILED
extraMsg:
errDesc: 21304, Task is Failed,errorMsg: null

############Execute Error!!!########

Still a missing ZooKeeper dependency, so it looks like it has to be added by hand. I dug zookeeper-3.4.5-cdh5.8.3.jar out of my local Maven repository, dropped it into the Hive engine's lib directory, deleted the local zip, and restarted.
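
Roughly what that looked like on my machine. The paths and the service name are assumptions from my own install (adjust LINKIS_HOME and the engine version directory to match yours), and the restart relies on the stock linkis-daemon.sh script:

# copy the CDH zookeeper jar from the local Maven repo into the hive engine's lib dir
cp ~/.m2/repository/org/apache/zookeeper/zookeeper/3.4.5-cdh5.8.3/zookeeper-3.4.5-cdh5.8.3.jar \
   $LINKIS_HOME/lib/linkis-engineconn-plugins/hive/dist/v1.1.0-cdh5.8.3/lib/
# remove the cached dist zip so it gets repackaged with the new jar, then restart the engineplugin service
rm -f $LINKIS_HOME/lib/linkis-engineconn-plugins/hive/dist/v1.1.0-cdh5.8.3/*.zip
sh $LINKIS_HOME/sbin/linkis-daemon.sh restart cg-engineplugin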

[codeweaver@bd15-21-32-217 bin]$ ./linkis-cli-hive -code "SELECT * from mob_bg_devops.servers_exps_weekly_with_wh;" -submitUser codeweaver -proxyUser codeweaver
[INFO] LogFile path: /home/codeweaver/linkis_scala_2.12/logs/linkis-cli//linkis-client.codeweaver.log.20211230145402457036213
[INFO] User does not provide usr-configuration file. Will use default config
[INFO] connecting to linkis gateway:http://127.0.0.1:9001
JobId:10
TaskId:10
ExecId:exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_hive_0
[INFO] Job is successfully submitted!

2021-12-30 14:54:06.054 INFO Program is substituting variables for you
2021-12-30 14:54:06.054 INFO Variables substitution ended successfully
2021-12-30 14:54:06.054 WARN You submitted a sql without limit, DSS will add limit 5000 to your sql
2021-12-30 14:54:06.054 INFO SQL code check has passed
job is scheduled.
2021-12-30 14:54:07.054 INFO Your job is Scheduled. Please wait it to run.
Job with jobId : LINKISCLI_codeweaver_hive_0 and execID : LINKISCLI_codeweaver_hive_0 submitted
Your job is being scheduled by orchestrator.
2021-12-30 14:54:07.054 INFO You have submitted a new job, script code (after variable substitution) is
************************************SCRIPT CODE************************************
SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
************************************SCRIPT CODE************************************
2021-12-30 14:54:07.054 INFO Your job is accepted,  jobID is LINKISCLI_codeweaver_hive_0 and taskID is 10 in ServiceInstance(linkis-cg-entrance, bd15-21-32-217:9104). Please wait it to be scheduled
2021-12-30 14:54:07.054 INFO job is running.
2021-12-30 14:54:07.054 INFO Your job is Running now. Please wait it to complete.
Job with jobGroupId : 10 and subJobId : 10 was submitted to Orchestrator.
2021-12-30 14:54:07.054 INFO Background is starting a new engine for you, it may take several seconds, please wait
2021-12-30 14:54:33.054 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, bd15-21-32-217:15049) /tmp/codeweaver/linkis_dev/codeweaver/workDir/66bd0bf8-234a-4f24-af8f-af7e837b4df1/logs
HiveEngineExecutor_0 >> SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
Time taken: 2.8 秒, begin to fetch results.
Fetched  6 col(s) : 0 row(s) in hive
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23) ~[scala-library-2.12.8.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
	at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.8.jar:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) [scala-library-2.12.8.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.io.IOException: java.lang.RuntimeException: Error in configuring object
	... 44 more
Caused by: java.lang.RuntimeException: Error in configuring object
	... 44 more
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
	... 44 more
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzopCodec not found.
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzopCodec not found
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
	... 44 more
Your subjob : 10 execue with state succeed, has 1 resultsets.
Congratuaions! Your job : LINKISCLI_codeweaver_hive_0 executed with status succeed and 0 results.
2021-12-30 14:54:37.054 INFO job is completed.
2021-12-30 14:54:37.054 INFO Task creation time(任务创建时间): 2021-12-30 14:54:05, Task scheduling time(任务调度时间): 2021-12-30 14:54:07, Task start time(任务开始时间): 2021-12-30 14:54:07, Mission end time(任务结束时间): 2021-12-30 14:54:37
2021-12-30 14:54:37.054 INFO Your mission(您的任务) 10 The total time spent is(总耗时时间为): 32.0
2021-12-30 14:54:37.054 INFO Congratulations. Your job completed with status Success.

Victory is in sight. This time the missing piece is hadoop-lzo. I grabbed the CDH hadoop-lzo jar from the cluster, put it into the Hive engine's lib directory, and restarted.
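
If you are not sure where your cluster keeps the hadoop-lzo jar, a search over the usual CDH locations can help; the paths below are just the common spots and may differ on your nodes:

# look for hadoop-lzo in the typical CDH parcel/package locations
find /opt/cloudera/parcels /usr/lib/hadoop -name 'hadoop-lzo*.jar' 2>/dev/null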

2021-12-30 15:23:41.501 ERROR [Linkis-Default-Scheduler-Thread-3] com.hadoop.compression.lzo.GPLNativeCodeLoader 36 <clinit> - Could not load native gpl library java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
 at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867) ~[?:1.8.0_181]
 at java.lang.Runtime.loadLibrary0(Runtime.java:870) ~[?:1.8.0_181]
 at java.lang.System.loadLibrary(System.java:1122) ~[?:1.8.0_181]
 at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:32) ~[hadoop-lzo-0.4.15-cdh5.7.6.jar:?]
 at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:71) ~[hadoop-lzo-0.4.15-cdh5.7.6.jar:?]
 at java.lang.Class.forName0(Native Method) ~[?:1.8.0_181]
 at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_181]
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
 at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
 at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
 at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23) ~[scala-library-2.12.8.jar:?]
 at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
 at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
 at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.8.jar:?]
 at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) [scala-library-2.12.8.jar:?]
 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
 at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
 at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
2021-12-30 15:23:41.510 ERROR [Linkis-Default-Scheduler-Thread-3] com.hadoop.compression.lzo.LzoCodec 81 <clinit> - Cannot load native-lzo without native-hadoop
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
 at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
 at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23) ~[scala-library-2.12.8.jar:?]
 at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
 at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
 at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.8.jar:?]
 at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) [scala-library-2.12.8.jar:?]
 at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
 at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
 at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
 at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
 at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
 at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/util/StopWatch
 ... 44 more
 at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_181]
 at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_181]
 at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_181]
 at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_181]
 ... 44 more
Your subjob : 14 execue with state succeed, has 1 resultsets.

Hmm... that looks like a version mismatch. I downloaded the stock hadoop-lzo-0.4.20.jar from Maven and tried again.

[codeweaver@bd15-21-32-217 bin]$ ./linkis-cli-hive -code "SELECT * from mob_bg_devops.servers_exps_weekly_with_wh;" -submitUser codeweaver -proxyUser codeweaver
[INFO] LogFile path: /home/codeweaver/linkis_scala_2.12/logs/linkis-cli//linkis-client.codeweaver.log.20211230153437223797106
[INFO] User does not provide usr-configuration file. Will use default config
[INFO] connecting to linkis gateway:http://127.0.0.1:9001
JobId:16
TaskId:16
ExecId:exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_hive_0
[INFO] Job is successfully submitted!

2021-12-30 15:34:40.034 INFO Program is substituting variables for you
2021-12-30 15:34:40.034 INFO Variables substitution ended successfully
2021-12-30 15:34:41.034 WARN You submitted a sql without limit, DSS will add limit 5000 to your sql
2021-12-30 15:34:41.034 INFO SQL code check has passed
job is scheduled.
2021-12-30 15:34:42.034 INFO Your job is Scheduled. Please wait it to run.
Job with jobId : LINKISCLI_codeweaver_hive_0 and execID : LINKISCLI_codeweaver_hive_0 submitted
Your job is being scheduled by orchestrator.
2021-12-30 15:34:42.034 INFO You have submitted a new job, script code (after variable substitution) is
************************************SCRIPT CODE************************************
SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
************************************SCRIPT CODE************************************
2021-12-30 15:34:42.034 INFO Your job is accepted,  jobID is LINKISCLI_codeweaver_hive_0 and taskID is 16 in ServiceInstance(linkis-cg-entrance, bd15-21-32-217:9104). Please wait it to be scheduled
2021-12-30 15:34:42.034 INFO job is running.
2021-12-30 15:34:42.034 INFO Your job is Running now. Please wait it to complete.
Job with jobGroupId : 16 and subJobId : 16 was submitted to Orchestrator.
2021-12-30 15:34:42.034 INFO Background is starting a new engine for you, it may take several seconds, please wait
2021-12-30 15:35:04.035 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, bd15-21-32-217:35271) /tmp/codeweaver/linkis_dev/codeweaver/workDir/2ab2fb15-0aef-48e1-a36d-c42a7ab15b7e/logs
HiveEngineExecutor_0 >> SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
Time taken: 2.6 秒, begin to fetch results.
Fetched  6 col(s) : 0 row(s) in hive
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_181]
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_181]
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_181]
	at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_181]
	at scala.runtime.java8.JFunction0$mcZ$sp.apply(JFunction0$mcZ$sp.java:23) ~[scala-library-2.12.8.jar:?]
	at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_181]
	at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_181]
	at scala.collection.immutable.Range.foreach(Range.scala:158) ~[scala-library-2.12.8.jar:?]
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) [scala-library-2.12.8.jar:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_181]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [?:1.8.0_181]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_181]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_181]
	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_181]
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/util/StopWatch
	... 44 more
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[?:1.8.0_181]
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349) ~[?:1.8.0_181]
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[?:1.8.0_181]
	... 44 more
Congratuaions! Your job : LINKISCLI_codeweaver_hive_0 executed with status succeed and 0 results.
2021-12-30 15:35:08.035 INFO job is completed.
2021-12-30 15:35:08.035 INFO Task creation time(任务创建时间): 2021-12-30 15:34:40, Task scheduling time(任务调度时间): 2021-12-30 15:34:42, Task start time(任务开始时间): 2021-12-30 15:34:42, Mission end time(任务结束时间): 2021-12-30 15:35:08
2021-12-30 15:35:08.035 INFO Your mission(您的任务) 16 The total time spent is(总耗时时间为): 27.9
2021-12-30 15:35:08.035 INFO Congratulations. Your job completed with status Success.

[INFO] Job execute successfully! Will try get execute result
============Result:================
TaskId:16
ExecId: exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_hive_0
User:codeweaver
Current job status:SUCCEED
extraMsg:
result:

-----------RESULT SET META DATA------------
(has data, omitted)
------------END OF RESULT SET META DATA------------

============END OF RESULT SET============

############Execute Success!!!########

Promising! The schema came back, but no data. Checking the error, a hadoop-common jar was missing... fine, keep adding. Once that was in, the next error was:

java.lang.NoClassDefFoundError: org/apache/htrace/core/Tracer$Builder

Now it's the htrace jar that's missing, so keep going (note that it has to be the Apache htrace-core4-4.0.1-incubating.jar).
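
For reference, that Apache artifact is on Maven Central, so it can be pulled directly; the URL simply follows the standard repository layout for org.apache.htrace:htrace-core4:4.0.1-incubating:

# download htrace-core4 from Maven Central, then drop it into the hive engine lib dir as before
wget https://repo1.maven.org/maven2/org/apache/htrace/htrace-core4/4.0.1-incubating/htrace-core4-4.0.1-incubating.jar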

[codeweaver@bd15-21-32-217 bin]$ ./linkis-cli-hive -code "SELECT * from mob_bg_devops.servers_exps_weekly_with_wh;" -submitUser codeweaver -proxyUser codeweaver
[INFO] LogFile path: /home/codeweaver/linkis_scala_2.12/logs/linkis-cli//linkis-client.codeweaver.log.20211230161311279901774
[INFO] User does not provide usr-configuration file. Will use default config
[INFO] connecting to linkis gateway:http://127.0.0.1:9001
JobId:22
TaskId:22
ExecId:exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_hive_0
[INFO] Job is successfully submitted!

2021-12-30 16:13:14.013 INFO Program is substituting variables for you
2021-12-30 16:13:15.013 INFO Variables substitution ended successfully
2021-12-30 16:13:15.013 WARN You submitted a sql without limit, DSS will add limit 5000 to your sql
2021-12-30 16:13:15.013 INFO SQL code check has passed
job is scheduled.
2021-12-30 16:13:16.013 INFO Your job is Scheduled. Please wait it to run.
Job with jobId : LINKISCLI_codeweaver_hive_0 and execID : LINKISCLI_codeweaver_hive_0 submitted
Your job is being scheduled by orchestrator.
2021-12-30 16:13:16.013 INFO You have submitted a new job, script code (after variable substitution) is
************************************SCRIPT CODE************************************
SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
************************************SCRIPT CODE************************************
2021-12-30 16:13:16.013 INFO Your job is accepted,  jobID is LINKISCLI_codeweaver_hive_0 and taskID is 22 in ServiceInstance(linkis-cg-entrance, bd15-21-32-217:9104). Please wait it to be scheduled
2021-12-30 16:13:16.013 INFO job is running.
2021-12-30 16:13:16.013 INFO Your job is Running now. Please wait it to complete.
Job with jobGroupId : 22 and subJobId : 22 was submitted to Orchestrator.
2021-12-30 16:13:16.013 INFO Background is starting a new engine for you, it may take several seconds, please wait
2021-12-30 16:13:38.013 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, bd15-21-32-217:26133) /tmp/codeweaver/linkis_dev/codeweaver/workDir/6f62ac6c-440a-4fbf-9918-f65a0d3cd09c/logs
HiveEngineExecutor_0 >> SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
Time taken: 2.2 秒, begin to fetch results.
Fetched  6 col(s) : 22 row(s) in hive
Your subjob : 22 execue with state succeed, has 1 resultsets.
Congratuaions! Your job : LINKISCLI_codeweaver_hive_0 executed with status succeed and 0 results.
2021-12-30 16:13:41.013 INFO job is completed.
2021-12-30 16:13:41.013 INFO Task creation time(任务创建时间): 2021-12-30 16:13:14, Task scheduling time(任务调度时间): 2021-12-30 16:13:16, Task start time(任务开始时间): 2021-12-30 16:13:16, Mission end time(任务结束时间): 2021-12-30 16:13:41
2021-12-30 16:13:41.013 INFO Your mission(您的任务) 22 The total time spent is(总耗时时间为): 27.3
2021-12-30 16:13:41.013 INFO Congratulations. Your job completed with status Success.

[INFO] Job execute successfully! Will try get execute result
============Result:================
TaskId:22
ExecId: exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_hive_0
User:codeweaver
Current job status:SUCCEED
extraMsg:
result:

-----------RESULT SET META DATA------------
(omitted)
------------END OF RESULT SET META DATA------------

============RESULT SET============
(omitted)
============END OF RESULT SET============

############Execute Success!!!########

It works! Awesome!

Spark engine

[codeweaver@bd15-21-32-217 bin]$ ./linkis-cli-spark-sql -code "SELECT * from mob_bg_devops.servers_exps_weekly_with_wh;"  -submitUser codeweaver -proxyUser codeweaver --queue default
[INFO] LogFile path: /home/codeweaver/linkis_scala_2.12/logs/linkis-cli//linkis-client.codeweaver.log.20211230121322013002420
[INFO] User does not provide usr-configuration file. Will use default config
[INFO] connecting to linkis gateway:http://127.0.0.1:9001
JobId:62
TaskId:62
ExecId:exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_spark_3
[INFO] Job is successfully submitted!

2021-12-30 12:13:23.013 INFO Program is substituting variables for you
2021-12-30 12:13:23.013 INFO Variables substitution ended successfully
2021-12-30 12:13:24.013 WARN You submitted a sql without limit, DSS will add limit 5000 to your sql
2021-12-30 12:13:24.013 INFO SQL code check has passed
job is scheduled.
2021-12-30 12:13:24.013 INFO Your job is Scheduled. Please wait it to run.
Your job is being scheduled by orchestrator.
Job with jobId : LINKISCLI_codeweaver_spark_3 and execID : LINKISCLI_codeweaver_spark_3 submitted
2021-12-30 12:13:24.013 INFO You have submitted a new job, script code (after variable substitution) is
************************************SCRIPT CODE************************************
SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
************************************SCRIPT CODE************************************
2021-12-30 12:13:24.013 INFO Your job is accepted,  jobID is LINKISCLI_codeweaver_spark_3 and taskID is 62 in ServiceInstance(linkis-cg-entrance, bd15-21-32-217:9104). Please wait it to be scheduled
2021-12-30 12:13:24.013 INFO job is running.
2021-12-30 12:13:24.013 INFO Your job is Running now. Please wait it to complete.
Job with jobGroupId : 62 and subJobId : 63 was submitted to Orchestrator.
2021-12-30 12:13:24.013 INFO Background is starting a new engine for you, it may take several seconds, please wait
2021-12-30 12:13:59.013 INFO EngineConn local log path: ServiceInstance(linkis-cg-engineconn, bd15-21-32-217:29584) /tmp/codeweaver/linkis_dev/codeweaver/workDir/9870f167-940a-4164-bed5-57fac2059bb1/logs
2021-12-30 12:13:59.013 INFO yarn application id: application_1639740855307_145806
bd15-21-32-217:29584 >> SELECT * from mob_bg_devops.servers_exps_weekly_with_wh limit 5000
bd15-21-32-217:29584 >> Time taken: 8466, Fetched 22 row(s).
Your subjob : 63 execue with state succeed, has 1 resultsets.
Congratuaions! Your job : LINKISCLI_codeweaver_spark_3 executed with status succeed and 0 results.
2021-12-30 12:14:13.014 INFO job is completed.
2021-12-30 12:14:13.014 INFO Task creation time(任务创建时间): 2021-12-30 12:13:23, Task scheduling time(任务调度时间): 2021-12-30 12:13:24, Task start time(任务开始时间): 2021-12-30 12:13:24, Mission end time(任务结束时间): 2021-12-30 12:14:13
2021-12-30 12:14:13.014 INFO Your mission(您的任务) 62 The total time spent is(总耗时时间为): 49.2
2021-12-30 12:14:13.014 INFO Congratulations. Your job completed with status Success.

[INFO] Job execute successfully! Will try get execute result
============Result:================
TaskId:62
ExecId: exec_id018019linkis-cg-entrancebd15-21-32-217:9104LINKISCLI_codeweaver_spark_3
User:codeweaver
Current job status:SUCCEED
extraMsg:
result:

-----------RESULT SET META DATA------------
(omitted)
------------END OF RESULT SET META DATA------------

============RESULT SET============
(omitted)
============END OF RESULT SET============

############Execute Success!!!########

Success! Words can't quite capture the excitement.

Closing remarks

After three days of testing (read: pit-stepping), Linkis 1.0.2 is finally deployed. I did run into plenty of problems along the way, so here's a quick summary:

  • First, a gripe: the component versions my company runs are really, really old, which caused all sorts of problems
  • Confirm the version of every component; on CDH, use the Cloudera builds of the dependencies wherever possible
  • Keep the Scala version consistent with the one your Spark build uses
  • Before starting each engine, make sure the version in the configuration, the directory name, and the database all match
  • A freshly built engine may be missing dependencies; add the jars by hand, delete the .zip, and restart

The official Linkis 1.0.3 release is just around the corner, and I'll keep following along (and stepping in the new pits). I also hope to keep the conversation going with the folks at WeBank and the Linkis community. Here's hoping Linkis graduates from the Apache incubator soon! See you in the next installment!