Overview
Installing and configuring Spark requires Scala to be installed first. Download the Scala package from the Scala official website and the Spark package from the Spark official website.
Scala Installation
# Extract the archive
[root@hadoop01 src]# tar -zxvf scala-2.11.12.tgz
# Move it to the target directory
[root@hadoop01 src]# mv scala-2.11.12 ../soft/
# Update environment variables
[root@hadoop01 scala-2.11.12]# vim /etc/profile
SCALA_HOME=/usr/local/soft/scala-2.11.12
PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$ZOOKEEPER_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin
export JAVA_HOME JRE_HOME ZOOKEEPER_HOME HADOOP_HOME HIVE_HOME SCALA_HOME CLASSPATH PATH
# Reload environment variables
[root@hadoop01 ~]# source /etc/profile
# Verify the installation
[root@hadoop01 ~]# scala -version
# Copy the installation to hadoop02 and hadoop03
[root@hadoop01 scala-2.11.12]# scp -r /usr/local/soft/scala-2.11.12 root@hadoop02:/usr/local/soft/
[root@hadoop01 scala-2.11.12]# scp -r /usr/local/soft/scala-2.11.12 root@hadoop03:/usr/local/soft/
# Add the same SCALA_HOME entries to /etc/profile on hadoop02 and hadoop03, then reload the environment variables on each machine
[root@hadoop02 ~]# source /etc/profile
[root@hadoop03 ~]# source /etc/profile
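As an extra check beyond scala -version, you can open the Scala REPL on hadoop01 and evaluate a trivial expression; a minimal sketch (any short expression works):
[root@hadoop01 ~]# scala
scala> 1 + 1
res0: Int = 2
scala> util.Properties.versionString
res1: String = version 2.11.12
scala> :quit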
Spark Installation
Extract Spark into /usr/local/soft/, change into the spark-2.4.8-bin-hadoop2.7/conf directory, and edit the Spark configuration files:
[root@hadoop01 conf]# mv spark-env.sh.template spark-env.sh
[root@hadoop01 conf]# vim spark-env.sh
# Add the following configuration
export SCALA_HOME=/usr/local/soft/scala-2.11.12
export JAVA_HOME=/usr/local/soft/jdk1.8.0_333
export HADOOP_HOME=/usr/local/soft/hadoop-2.10.1
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_MASTER_IP=hadoop01
export SPARK_LOCAL_DIRS=/usr/local/soft/spark-2.4.8-bin-hadoop2.7
export SPARK_DRIVER_MEMORY=1G
[root@hadoop01 conf]# mv slaves.template slaves
[root@hadoop01 conf]# vim slaves
# Remove localhost and add the slave nodes
hadoop02
hadoop03
# Distribute the configured Spark installation to the slave nodes
[root@hadoop01 conf]# scp -r /usr/local/soft/spark-2.4.8-bin-hadoop2.7 root@hadoop02:/usr/local/soft/
[root@hadoop01 conf]# scp -r /usr/local/soft/spark-2.4.8-bin-hadoop2.7 root@hadoop03:/usr/local/soft/
# Update environment variables (make the same change on hadoop01, hadoop02, and hadoop03)
[root@hadoop01 conf]# vim /etc/profile
SPARK_HOME=/usr/local/soft/spark-2.4.8-bin-hadoop2.7
PATH=$PATH:$JAVA_HOME/bin:$JRE_HOME/bin:$ZOOKEEPER_HOME/bin:$HADOOP_HOME/bin:$HIVE_HOME/bin:$SCALA_HOME/bin:$SPARK_HOME/bin
export JAVA_HOME JRE_HOME ZOOKEEPER_HOME HADOOP_HOME HIVE_HOME SCALA_HOME SPARK_HOME CLASSPATH PATH
# Reload after editing
[root@hadoop01 conf]# source /etc/profile
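Before starting the cluster, a quick way to confirm the new PATH took effect on each node is to query the Spark version (spark-submit ships with the Spark distribution and should report version 2.4.8):
[root@hadoop01 ~]# spark-submit --version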
Start Spark, then run jps to check that the Master process is running on the master node and the Worker processes are running on the slave nodes:
[root@hadoop01 ~]# cd /usr/local/soft/spark-2.4.8-bin-hadoop2.7/sbin
[root@hadoop01 sbin]# ./start-all.sh
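Once the Master and Worker processes show up in jps, you can attach a spark-shell to the cluster and run a small job to confirm it works end to end. This is a minimal sketch; it assumes the standalone master listens on the default port 7077 on hadoop01 (per SPARK_MASTER_IP above):
[root@hadoop01 sbin]# spark-shell --master spark://hadoop01:7077
scala> val nums = sc.parallelize(1 to 100)   // distribute the numbers 1..100 across the workers
scala> nums.sum()                            // expected result: 5050.0
scala> :quit
The standalone Master also serves a web UI, by default at http://hadoop01:8080, where the registered Workers and running applications can be checked.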