ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
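The chmod matters: sshd silently ignores authorized_keys when it is group- or world-writable. A quick sketch of checking the mode (done on a scratch file so it runs anywhere; in practice the file to check is ~/.ssh/authorized_keys):

```shell
# Demo: confirm a file carries the 0600 mode sshd expects for authorized_keys.
f=$(mktemp)
chmod 0600 "$f"
# GNU stat (Linux) uses -c; BSD stat (macOS) uses -f.
mode=$(stat -c '%a' "$f" 2>/dev/null || stat -f '%Lp' "$f")
echo "mode=$mode"
rm -f "$f"
```

With the key appended, `ssh localhost` should now log in without a password prompt.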
brew install hadoop
After the download completes, Hadoop lives under /usr/local/Cellar/hadoop/.
The configuration files are in /usr/local/Cellar/hadoop/${version}/libexec/etc/hadoop.
Configure JAVA_HOME in hadoop-env.sh, for example:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_211.jdk/Contents/Home
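On macOS, `/usr/libexec/java_home -v 1.8` prints the JDK 8 home, so you don't have to type the path by hand. The edit itself can be scripted idempotently; a sketch using a temp file as a stand-in for the real hadoop-env.sh:

```shell
# Append the JAVA_HOME export only if hadoop-env.sh doesn't already set one.
# (env_sh is a temp stand-in; point it at the real hadoop-env.sh in practice.)
env_sh=$(mktemp)
line='export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_211.jdk/Contents/Home'
grep -q '^export JAVA_HOME=' "$env_sh" || echo "$line" >> "$env_sh"
count=$(grep -c '^export JAVA_HOME=' "$env_sh")
echo "JAVA_HOME lines: $count"
```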
In core-site.xml, set the default filesystem:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

In hdfs-site.xml, set replication to 1 for a single-node setup:

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

In mapred-site.xml, run MapReduce on YARN:

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

In yarn-site.xml:

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.env-whitelist</name>
    <value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
  </property>
</configuration>
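These edits can also be scripted with heredocs; a minimal sketch for core-site.xml (the HADOOP_CONF_DIR default below is a demo directory, an assumption; point it at the real libexec/etc/hadoop):

```shell
# Write core-site.xml from a heredoc; the other three files follow the same pattern.
HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-/tmp/hadoop-conf-demo}
mkdir -p "$HADOOP_CONF_DIR"
cat > "$HADOOP_CONF_DIR/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF
grep -q 'hdfs://localhost:9000' "$HADOOP_CONF_DIR/core-site.xml" && echo "core-site.xml written"
```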
Format the filesystem and start HDFS:

bin/hdfs namenode -format
sbin/start-dfs.sh

Visit http://localhost:9870 to see the NameNode web UI. Then start YARN:

sbin/start-yarn.sh

Visit http://localhost:8088 to see the application management (ResourceManager) page.
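If a page does not load right away, the daemon may still be starting. A small polling helper (a sketch; 9870 is the NameNode UI, 8088 the ResourceManager):

```shell
# Poll a URL until it answers or we give up; curl -sf fails on connection errors.
wait_for_ui() {
  url=$1; tries=${2:-30}
  i=0
  while [ "$i" -lt "$tries" ]; do
    if curl -sf -o /dev/null "$url"; then echo "up: $url"; return 0; fi
    i=$((i + 1)); sleep 1
  done
  echo "timed out: $url"; return 1
}
# Usage:
# wait_for_ui http://localhost:9870
# wait_for_ui http://localhost:8088
```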
brew install hive
hive
If the hive> prompt appears, the installation succeeded.
I ran into two exceptions during installation:
1. Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)
Fix: this is caused by a version mismatch between the guava-xxx.jar shipped with Hadoop and the one shipped with Hive. Copy the higher-version guava jar from Hadoop's share/hadoop/common/lib directory into Hive's lib directory, then delete Hive's original guava jar.
2. Exception in thread "main" java.lang.ClassCastException: java.base/jdk.internal.loader.ClassLoaders$AppClassLoader cannot be cast to java.base/java.net.URLClassLoader
Fix: this happens when Hadoop runs on JDK 9 or later. Configure JAVA_HOME in hadoop-env.sh to point at JDK 8, as shown above.
To exit the Hive shell, type:
quit;
Hive stores its metadata in a relational database. By default it uses the embedded Derby database; here we use MySQL instead.
Download and install MySQL, then configure the environment variables:
export PATH=$PATH:/usr/local/mysql/bin
export PATH=$PATH:/usr/local/mysql/support-files
Log in to MySQL and create the metastore database:
mysql -u root -p
create database metastore;
Create a hive-site.xml file in Hive's conf directory:
<configuration>
  <property>
    <name>hive.metastore.local</name>
    <value>true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost/metastore</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
  <property>
    <name>hive.exec.local.scratchdir</name>
    <value>/tmp/hive</value>
  </property>
  <property>
    <name>hive.querylog.location</name>
    <value>/tmp/hive</value>
  </property>
  <property>
    <name>hive.downloaded.resources.dir</name>
    <value>/tmp/hive</value>
  </property>
  <property>
    <name>hive.server2.logging.operation.log.location</name>
    <value>/tmp/hive</value>
  </property>
</configuration>
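The four scratch and log properties above all point at /tmp/hive; make sure that directory exists and is writable before starting Hive (a quick sketch):

```shell
# Create Hive's local scratch/log directory and verify we can write to it.
scratch=/tmp/hive
mkdir -p "$scratch"
[ -w "$scratch" ] && echo "writable: $scratch"
```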
Hive also needs the MySQL JDBC driver (Connector/J). Download it from https://dev.mysql.com/downloads/connector/j/, unpack it, and copy the jar into Hive's lib directory.
Initialize the metastore schema. Run this directly in the terminal; schematool is a Hive command:
schematool -initSchema -dbType mysql
Now start Hive and try it out:
hive
hive> show databases;