Hadoop + Spark environment -- single-instance version
1. Set the hostname and add the hostname-to-IP mapping.

2. Disable the firewall and create the working directories:

mkdir -p /hadoop/tmp
mkdir -p /hadoop/dfs/name
mkdir -p /hadoop/dfs/data
mkdir -p /hadoop/var

(-p also creates the missing parent directories, such as /hadoop/dfs.)

3. Configure the Scala environment:

[root@hadoop conf]# vim /etc/profile
export SCALA_HOME=/opt/scala2.11.12
export PATH=.:${JAVA_HOME}/bin:${SCALA_HOME}/bin:$PATH

[root@hadoop conf]# scala -version    # verify the installation

4. Configure the Spark environment (the JDK on this host is installed at /usr/java/jdk1.8.0_201-amd64):

[root@hadoop conf]# vim /etc/profile
export SPARK_HOME=/opt/spark2.2.3
export PATH=.:${JAVA_HOME}/bin:${SCALA_HOME}/bin:${SPARK_HOME}/bin:$PATH

[root@hadoop conf]# vim /opt/spark2.2.3/conf/spark-env.sh
export SCALA_HOME=/opt/scala2.11.12
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/
export HADOOP_HOME=/opt/hadoop2.7.6
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_HOME=/opt/spark2.2.3
export SPARK_MASTER_IP=hadoop.master       # master hostname
export SPARK_EXECUTOR_MEMORY=1G            # executor memory

5. Configure the Hadoop environment:

[root@hadoop hadoop2.7.6]# vim /etc/profile
export HADOOP_HOME=/opt/hadoop2.7.6
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:${HADOOP_HOME}/bin       # append to PATH; do not overwrite it

5.1 Edit core-site.xml:

[root@hadoop hadoop]# vim core-site.xml
<property>
  <name>hadoop.tmp.dir</name>
  <value>/hadoop/tmp</value>
  <description>A base for other temporary directories.</description>
</property>
<property>
  <name>fs.default.name</name>
  <value>hdfs://hadoop.master:9000</value>
</property>

5.2 Edit hadoop-env.sh (JDK installed at /usr/java/jdk1.8.0_201-amd64):

[root@hadoop hadoop]# vim hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.9.x86_64/

5.3 Edit hdfs-site.xml:

[root@hadoop hadoop]# vim hdfs-site.xml
<property>
  <name>dfs.name.dir</name>
  <value>/hadoop/dfs/name</value>
  <description>Path on the local filesystem where the NameNode stores the namespace and transaction logs persistently.</description>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>/hadoop/dfs/data</value>
  <description>Comma-separated list of paths on the local filesystem of a DataNode where it should store its blocks.</description>
</property>
<property>
  <name>dfs.replication</name>
  <value>2</value>
</property>
<property>
  <name>dfs.permissions</name>
  <value>false</value>
  <description>Permission checking is not needed.</description>
</property>

5.4 Edit mapred-site.xml:

[root@hadoop hadoop]# vim mapred-site.xml
<property>
  <name>mapred.job.tracker</name>
  <value>hadoop.master:9001</value>
</property>
<property>
  <name>mapred.local.dir</name>
  <value>/hadoop/var</value>
</property>
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>

6. Start Hadoop:

cd /opt/hadoop2.7.6/bin
./hadoop namenode -format
cd /opt/hadoop2.7.6/sbin

Start HDFS and YARN:

./start-dfs.sh
./start-yarn.sh

Verify: open 192.168.47.45:8088 and 192.168.47.45:50070 in a browser and check that both web UIs load.

7. Start Spark:

cd /opt/spark2.2.3/sbin
./start-all.sh

Reposted from: https://blog.51cto.com/14074978/2373818
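As a quick sanity check on the /etc/profile edits from steps 3 through 5, the exports can be replayed in a fresh shell and the resulting PATH inspected. This is a minimal sketch using the install locations from this guide; the sbin entry is an extra assumption beyond the guide, added so the start-dfs.sh/start-all.sh scripts resolve without cd'ing into sbin.

```shell
# Replay the profile exports from this guide (paths are the guide's install locations).
export SCALA_HOME=/opt/scala2.11.12
export SPARK_HOME=/opt/spark2.2.3
export HADOOP_HOME=/opt/hadoop2.7.6
# Append (never overwrite) PATH; sbin is an assumption, not in the original profile.
export PATH=$PATH:${SCALA_HOME}/bin:${SPARK_HOME}/bin:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin

# Each tool directory should now appear in PATH exactly once:
echo ":$PATH:" | tr ':' '\n' | grep -c '^/opt/spark2.2.3/bin$'
```

After sourcing the real /etc/profile, `scala -version` and `hadoop version` should both resolve from any directory.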