Setting up a single-node Hive environment
This guide assumes everything is installed under **/opt/sh** and that the Hive and Hadoop packages have already been downloaded:
- https://downloads.apache.org/hive/hive-3.1.3/apache-hive-3.1.3-bin.tar.gz
- https://downloads.apache.org/hadoop/common/hadoop-3.3.4/hadoop-3.3.4.tar.gz
Download and extract the packages

```shell
wget https://downloads.apache.org/hadoop/common/hadoop-3.3.4/hadoop-3.3.4.tar.gz
mkdir -p /opt/sh/hadoop
tar -xf hadoop-3.3.4.tar.gz -C /opt/sh/hadoop --strip-components 1

wget https://downloads.apache.org/hive/hive-3.1.3/apache-hive-3.1.3-bin.tar.gz
mkdir -p /opt/sh/hive
tar -xf apache-hive-3.1.3-bin.tar.gz -C /opt/sh/hive --strip-components 1
```
Export the environment variables

```shell
export HADOOP_HOME=/opt/sh/hadoop
export HIVE_HOME=/opt/sh/hive
```
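It is also convenient to put the Hadoop and Hive `bin`/`sbin` directories on `PATH` so the later commands can be typed without full paths. This line is not in the original; it assumes the two variables above are already exported:

```shell
# Assumes HADOOP_HOME and HIVE_HOME are exported as above
export PATH="$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin:$PATH"
```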
Place the following two files at the corresponding paths

$HADOOP_HOME/etc/hadoop/core-site.xml

```xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

$HADOOP_HOME/etc/hadoop/hdfs-site.xml

```xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```
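The two files can also be written straight from the shell with heredocs. This sketch is not in the original; it defaults `$HADOOP_HOME` to /opt/sh/hadoop if the variable is not yet exported:

```shell
# Write both site files under $HADOOP_HOME/etc/hadoop (contents as shown above)
mkdir -p "${HADOOP_HOME:=/opt/sh/hadoop}/etc/hadoop"

cat > "$HADOOP_HOME/etc/hadoop/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > "$HADOOP_HOME/etc/hadoop/hdfs-site.xml" <<'EOF'
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF
```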
Configure single-node SSH

```shell
ssh-keygen -t rsa -f /etc/ssh/ssh_host_rsa_key -P ""
ssh-keygen -t ecdsa -f /etc/ssh/ssh_host_ecdsa_key -P ""
ssh-keygen -t ed25519 -f /etc/ssh/ssh_host_ed25519_key -P ""
mkdir -p /root/.ssh
ssh-keygen -t rsa -f /root/.ssh/id_rsa -P ""
cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
```
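A precaution not in the original: sshd rejects public-key login when the key directory or `authorized_keys` is too permissive, so it is worth tightening the permissions explicitly:

```shell
# sshd refuses keys when these permissions are too open
mkdir -p /root/.ssh
chmod 700 /root/.ssh
touch /root/.ssh/authorized_keys
chmod 600 /root/.ssh/authorized_keys
```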
Write the following to /etc/ssh/ssh_config

```
Host *
    CheckHostIP no
    PasswordAuthentication no
    IdentityFile ~/.ssh/id_rsa
    StrictHostKeyChecking no
    Port 22
Include /etc/ssh/ssh_config.d/*.conf
```
Start the SSH server

```shell
/usr/sbin/sshd
```
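To confirm the daemon came up and that key-based loopback login works (a check not in the original; both commands are standard OpenSSH/procps tooling and should succeed once sshd is running):

```shell
# Should print at least one PID
pgrep -x sshd
# Should print "ok" with no password prompt
ssh -o BatchMode=yes -o ConnectTimeout=5 localhost echo ok
```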
To start the Hadoop daemons as root, export the following environment variables first

```shell
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
```
Start Hadoop

```shell
$HADOOP_HOME/bin/hdfs namenode -format
$HADOOP_HOME/sbin/start-dfs.sh
```
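After `start-dfs.sh` you can verify the daemons and pre-create the HDFS directories that Hive's default configuration expects (the verification and the directory step are additions here; the paths follow Hive's Getting Started guide):

```shell
# jps should list NameNode, DataNode and SecondaryNameNode
jps
# Pre-create the HDFS paths Hive's default configuration uses
$HADOOP_HOME/bin/hdfs dfs -mkdir -p /tmp /user/hive/warehouse
$HADOOP_HOME/bin/hdfs dfs -chmod g+w /tmp /user/hive/warehouse
```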
If you hit the error `ERROR: JAVA_HOME is not set and could not be found`, declare the variable explicitly in hadoop-env.sh (adjust the path to your own JDK installation)

```shell
echo "export JAVA_HOME=/etc/alternatives/jre_11_openjdk" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
```
Once startup completes, open localhost:9870 to reach the Hadoop web portal.
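A quick reachability check from the shell (an addition here; the curl flags are standard):

```shell
# 200 means the NameNode web UI is up and serving
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:9870/
```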
Start Hive

```shell
$HIVE_HOME/bin/hive
```
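On Hive 3.x the first start typically requires initialising the metastore schema (embedded Derby by default). This step is not in the original; `schematool` and the `-e` flag are standard Hive CLI tools, and `smoke_test` is just an illustrative table name:

```shell
# One-time metastore init (creates metastore_db/ in the current directory)
$HIVE_HOME/bin/schematool -dbType derby -initSchema
# Non-interactive smoke test
$HIVE_HOME/bin/hive -e "CREATE TABLE IF NOT EXISTS smoke_test (id INT); SHOW TABLES;"
```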
Key ports

- 9870: Hadoop web UI
- 39303