Configuring the Spark and Zeppelin environment

The system environment is Ubuntu 16.04.

1. Install Java. Download the JDK package from the official site, place it under /opt, extract it, and add the following environment variables to ~/.zshrc:

export JAVA_HOME=/opt/jdk1.8.0_144
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib
export PATH="/opt/jdk1.8.0_144/bin:$PATH"

2. Install Scala. Download the package, extract it to /opt/, and add the following environment variables to ~/.zshrc:

export SCALA_HOME=/opt/scala-2.11.11
export PATH="/opt/scala-2.11.11/bin:$PATH"

3. Download Spark (http://spark.apache.org/downloads.html), extract it to /opt/, and add the following environment variables to ~/.zshrc:

export SPARK_HOME=/opt/spark-2.1.0-bin-hadoop2.7
export PATH="/opt/spark-2.1.0-bin-hadoop2.7/bin:$PATH"
export PYSPARK_PYTHON=python3          # use Python 3 for PySpark
export PYSPARK_DRIVER_PYTHON=ipython3  # use IPython as the PySpark driver shell
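
To smoke-test the Spark installation (a sketch, assuming python3 and ipython3 are already installed on the system):

source ~/.zshrc
pyspark                       # should start an IPython shell with a SparkSession named `spark`
# inside the shell, a trivial job:
#   spark.range(10).count()   # should return 10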

4. Download Zeppelin, extract it to /opt/, and add the following environment variable to ~/.zshrc:

export PATH="/opt/zeppelin-0.7.2-bin-all/bin:$PATH"

5. Run Zeppelin. First give your user (here hys) ownership of the Zeppelin directory, then start the daemon:

sudo chown -R hys:hys zeppelin-0.7.2-bin-all

zeppelin-daemon.sh start
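
The same script also manages the running service (a sketch of commonly used subcommands; run zeppelin-daemon.sh without arguments to see the full list):

zeppelin-daemon.sh status     # check whether Zeppelin is running
zeppelin-daemon.sh restart    # restart, e.g. after editing zeppelin-env.sh
zeppelin-daemon.sh stop       # shut the service down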

6. Visit http://127.0.0.1:8080. If the Zeppelin welcome page appears, the installation was successful. (screenshot: Zeppelin welcome page)