Since the CDH environment is version 6.3, choose the apache-kylin-3.0.1-bin-cdh60 package.
Download it from a mirror in China:
wget https://mirrors.tuna.tsinghua.edu.cn/apache/kylin/apache-kylin-3.0.1/apache-kylin-3.0.1-bin-cdh60.tar.gz
Extract it; here it is extracted to /opt/kylin/apache-kylin-3.0.1-bin-cdh60.
Configure the global environment variables:
# Quoting 'EOF' keeps $JAVA_HOME and friends literal, so they expand when
# /etc/profile is sourced instead of being expanded by the current shell
cat << 'EOF' | sudo tee -a /etc/profile
#kylin config
export JAVA_HOME=/usr/java/jdk1.8.0_181-cloudera
export CLASSPATH=.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib:$CLASSPATH
export KYLIN_HOME=/opt/kylin/apache-kylin-3.0.1-bin-cdh60
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH
export CDH_HOME=/opt/cloudera/parcels/CDH
export HBASE_HOME=${CDH_HOME}/lib/hbase
export HBASE_CLASSPATH=${HBASE_HOME}/lib/hbase-common-2.1.0-cdh6.3.2.jar
export HADOOP_USER_NAME=hdfs
EOF
source /etc/profile
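A quick sanity check that the variables took effect in the current shell (just echoes; empty values mean /etc/profile was not sourced or the export lines were not appended):

```shell
# Print the freshly sourced variables; the paths should match the
# install locations used above
echo "JAVA_HOME       = $JAVA_HOME"
echo "KYLIN_HOME      = $KYLIN_HOME"
echo "HBASE_HOME      = $HBASE_HOME"
echo "HBASE_CLASSPATH = $HBASE_CLASSPATH"
```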
Check the path of the HBase common jar:
[root@cdht1 lib]# ls -l hbase-common-*
lrwxrwxrwx 1 root root 45 Nov 9 00:09 hbase-common-2.1.0-cdh6.3.2.jar -> ../../../jars/hbase-common-2.1.0-cdh6.3.2.jar
lrwxrwxrwx 1 root root 51 Nov 9 00:10 hbase-common-2.1.0-cdh6.3.2-tests.jar -> ../../../jars/hbase-common-2.1.0-cdh6.3.2-tests.jar
[root@cdht1 lib]# pwd
/opt/cloudera/parcels/CDH/lib/hbase/lib
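Since HBASE_CLASSPATH above reaches this jar through a symlink, it is worth verifying that the link actually resolves (a quick check using the paths from this install; adjust the jar version to your CDH release):

```shell
# [ -e ] follows symlinks, so this detects a dangling link as well as
# a missing file
jar=/opt/cloudera/parcels/CDH/lib/hbase/lib/hbase-common-2.1.0-cdh6.3.2.jar
if [ -e "$jar" ]; then
  echo "hbase-common jar found"
else
  echo "hbase-common jar missing or dangling symlink - fix HBASE_CLASSPATH"
fi
```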
Run the environment check:
[root@cdht1 apache-kylin-3.0.1-bin-cdh60]# $KYLIN_HOME/bin/check-env.sh
Retrieving hadoop conf dir...
Error: Could not find or load main class org.apache.hadoop.hbase.util.GetJavaProperty
KYLIN_HOME is set to /opt/kylin/apache-kylin-3.0.1-bin-cdh60
The GetJavaProperty error above is printed by the hbase launcher script shipped with CDH 6; in this run it did not stop the check, and the installation continues.
Start Kylin:
$KYLIN_HOME/bin/kylin.sh start
Retrieving Spark dependency...
spark not found, set SPARK_HOME, or run bin/download-spark.sh
Either install Spark or point the SPARK_HOME environment variable at the CDH Spark client:
export SPARK_HOME=/opt/cloudera/parcels/CDH/lib/spark
Startup succeeds:
A new Kylin instance is started by root. To stop it, run 'kylin.sh stop'
Check the log at /opt/kylin/apache-kylin-3.0.1-bin-cdh60/logs/kylin.log
Web UI is at http://localhost:7070/kylin. The default username/password is ADMIN/KYLIN.
The page fails to open, and the log reports an error:
2020-03-25 16:46:54,880 INFO [localhost-startStop-1] metrics.MetricsManager:142 : Kylin metrics monitor is not enabled
java.lang.NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1628)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:483)
This error is related to a known issue: starting with Kylin 2.6.1, Kylin no longer bundles a Spark client by default, and the Spark client provided by the cluster (CDH here) turns out to be incompatible with Kylin. The fix is to use Kylin's own download-spark script to download a fresh Spark into $KYLIN_HOME/spark, and point SPARK_HOME at that directory.
export SPARK_HOME=${KYLIN_HOME}/spark
Copy the spark/jars/commons-configuration-1.6.jar file into tomcat/lib under $KYLIN_HOME, and the startup succeeds.
Re-downloading Spark is optional, but the old commons-configuration-1.6.jar is required either way; without it the error java.lang.NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException never goes away.
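Putting the workaround together, a sketch of the full sequence (paths assume the install locations above; download-spark.sh is the script Kylin itself suggests in the "spark not found" message):

```shell
# Download the Spark build matching this Kylin release into $KYLIN_HOME/spark
$KYLIN_HOME/bin/download-spark.sh
export SPARK_HOME=${KYLIN_HOME}/spark

# The web UI needs the old commons-configuration on Tomcat's classpath
cp $SPARK_HOME/jars/commons-configuration-1.6.jar $KYLIN_HOME/tomcat/lib/

# Restart Kylin and confirm the UI answers (ADMIN/KYLIN by default)
$KYLIN_HOME/bin/kylin.sh stop
$KYLIN_HOME/bin/kylin.sh start
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:7070/kylin
```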