Set Up a Gateway to Submit Jobs to an E-MapReduce Cluster
Gateway
Some customers need to set up their own Gateway to submit jobs to an E-MapReduce cluster. At present, E-MapReduce does not support purchasing a Gateway on the product page; in a later release you will be able to purchase a Gateway directly, with the Hadoop environment prepared for you.
Network
First, make sure the Gateway machine is in the security group of the corresponding EMR cluster, so that the Gateway node can reach the EMR cluster. For configuring a machine's security group, see the ECS security group documentation.
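A quick connectivity check from the gateway before going further (a sketch: masterip is a placeholder for your master node's private IP, and 8020 is the default HDFS NameNode RPC port, which may differ on your cluster):

# Replace masterip with the master node's private IP.
ping -c 3 masterip
# 8020 is the default NameNode RPC port; adjust if your cluster uses another.
nc -zv masterip 8020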
Environment
Java environment
Install JDK 1.7 or later.
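A minimal installation sketch, assuming a yum-based image (the OpenJDK package name is an assumption and may differ on your OS; any JDK 1.7+ works):

# Install OpenJDK 1.8 on a yum-based system.
sudo yum install -y java-1.8.0-openjdk-devel
# Verify the installed version reports 1.7 or later.
java -version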
- Versions earlier than EMR-3.2.0
Copy all required Hadoop dependency packages to the gateway
scp -r root@masterip:/opt/apps/extra-jars /opt/apps/
scp -r root@masterip:/opt/apps/hadoop-2.7.2 /opt/apps/
scp -r root@masterip:/opt/apps/apache-hive-2.0.1-bin /opt/apps/
scp -r root@masterip:/opt/apps/spark-2.1.1-bin-hadoop2.7 /opt/apps/
ln -s /opt/apps/hadoop-2.7.2 /usr/lib/hadoop-current
ln -s /opt/apps/apache-hive-2.0.1-bin /usr/lib/hive-current
ln -s /opt/apps/spark-2.1.1-bin-hadoop2.7 /usr/lib/spark-current
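After copying, a quick check that the three symlinks resolve to the copied packages (paths match the versions used above):

# Each entry should point at the corresponding directory under /opt/apps.
ls -l /usr/lib/hadoop-current /usr/lib/hive-current /usr/lib/spark-current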
Copy the configuration files to the gateway
scp -r root@masterip:/etc/emr/hadoop-conf-2.7.2 /etc/emr/hadoop-conf-2.7.2
ln -s /etc/emr/hadoop-conf-2.7.2 /etc/emr/hadoop-conf
scp -r root@masterip:/etc/emr/hive-conf-2.0.1 /etc/emr/hive-conf-2.0.1/
ln -s /etc/emr/hive-conf-2.0.1 /etc/emr/hive-conf
Copy the environment variables to the gateway
scp root@masterip:/etc/profile.d/hadoop.sh /etc/profile.d/
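For the variables to take effect in the current shell, source the script (or log out and back in):

# Apply the copied environment variables to the current session.
source /etc/profile.d/hadoop.sh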
- EMR-3.2.0 and later versions
Copy all required Hadoop dependency packages to the gateway
scp -r root@masterip:/opt/apps/extra-jars /opt/apps/
scp -r root@masterip:/opt/apps/ecm/service/hadoop/2.7.2/package/hadoop-2.7.2/ /opt/apps/
scp -r root@masterip:/opt/apps/ecm/service/hive/2.0.1/package/apache-hive-2.0.1-bin/ /opt/apps/
scp -r root@masterip:/opt/apps/ecm/service/spark/2.1.1/package/spark-2.1.1-bin-hadoop2.7/ /opt/apps/
ln -s /opt/apps/hadoop-2.7.2 /usr/lib/hadoop-current
ln -s /opt/apps/apache-hive-2.0.1-bin /usr/lib/hive-current
ln -s /opt/apps/spark-2.1.1-bin-hadoop2.7 /usr/lib/spark-current
Copy the configuration files to the gateway

scp -r root@masterip:/etc/ecm/hadoop-conf-2.7.2 /etc/ecm/hadoop-conf-2.7.2
ln -s /etc/ecm/hadoop-conf-2.7.2 /etc/ecm/hadoop-conf
scp -r root@masterip:/etc/ecm/hive-conf-2.0.1 /etc/ecm/hive-conf-2.0.1/
ln -s /etc/ecm/hive-conf-2.0.1 /etc/ecm/hive-conf
Copy the environment variables to the gateway
scp root@masterip:/etc/profile.d/hdfs.sh /etc/profile.d/
scp root@masterip:/etc/profile.d/yarn.sh /etc/profile.d/
scp root@masterip:/etc/profile.d/hive.sh /etc/profile.d/
scp root@masterip:/etc/profile.d/spark.sh /etc/profile.d/
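As above, source the scripts (or log out and back in) so the variables take effect in the current shell:

# Apply all four copied environment scripts to the current session.
for f in hdfs yarn hive spark; do source /etc/profile.d/$f.sh; done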
Hosts
Copy the cluster host entries from the master node's hosts file into /etc/hosts on the gateway, for example:
#start add cluster host of cluster 22663,Mon May 30 19:21:51 CST 2016
xx.yy.zz.tt emr-header-1.cluster-1212 emr-header-1 xxxxxxxx1
xx.yy.zz.tt1 emr-worker-2.cluster-1212 emr-worker-2 emr-header-3 xxxxxxx2
xx.yy.zz.tt2 emr-worker-1.cluster-1212 emr-worker-1 emr-header-2 xxxxxxxx3
#end add cluster host
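Instead of copying by hand, the block can be pulled over SSH; a sketch assuming the start/end markers shown above delimit the cluster entries in the master's /etc/hosts and that you run it as root on the gateway:

# Append the cluster host block from the master to the gateway's /etc/hosts.
ssh root@masterip "sed -n '/#start add cluster host/,/#end add cluster host/p' /etc/hosts" >> /etc/hosts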
Once the steps above are finished, the configuration is complete.
Test
- Hive
[hadoop@iZ23bc05hrvZ ~]$ hive
hive> show databases;
OK
default
Time taken: 1.124 seconds, Fetched: 1 row(s)
hive> create database school;
OK
Time taken: 0.362 seconds
hive>
- Run a Hadoop job
[hadoop@iZ23bc05hrvZ ~]$ hadoop jar /usr/lib/hadoop-current/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar pi 10 10
Number of Maps  = 10
Samples per Map = 10
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Wrote input for Map #5
Wrote input for Map #6
Wrote input for Map #7
Wrote input for Map #8
Wrote input for Map #9
File Input Format Counters
    Bytes Read=1180
File Output Format Counters
    Bytes Written=97
Job Finished in 29.798 seconds
Estimated value of Pi is 3.20000000000000000000
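The Spark client was copied as well; a quick smoke test, assuming the examples jar that ships with Spark 2.1.1 sits at the path below (verify the exact jar name on your gateway):

# Run SparkPi on YARN using the bundled examples jar (path is an assumption).
spark-submit --master yarn --class org.apache.spark.examples.SparkPi \
  /usr/lib/spark-current/examples/jars/spark-examples_2.11-2.1.1.jar 10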