Set Up a Gateway to Submit Jobs to an E-MapReduce Cluster
Gateway
Some customers need to set up their own Gateway node to submit jobs to an E-MapReduce cluster. Currently, E-MapReduce does not support purchasing a Gateway on the product page; in the future you will be able to purchase a Gateway directly in the product, with the Hadoop environment prepared for you.
網絡
First, make sure the Gateway machine is in the security group of the corresponding EMR cluster, so that the Gateway node can access the EMR cluster without obstruction. To configure a machine's security group, refer to the ECS security group documentation.
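A quick way to verify connectivity from the Gateway is a TCP probe of the master's service ports. The snippet below is a minimal sketch; the host name `emr-header-1` and the port numbers are assumptions, so substitute your cluster's actual master host and ports:

```shell
# Probe a TCP port using bash's built-in /dev/tcp; prints "open" or "closed".
check_port() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null && echo "open: $1:$2" || echo "closed: $1:$2"
}

check_port emr-header-1 9000   # HDFS NameNode RPC port (assumed default)
check_port emr-header-1 8032   # YARN ResourceManager port (assumed default)
```

If a port reports closed, revisit the security group rules before continuing.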
環境
Java環境
Install JDK 1.7 or later.
- Versions below EMR-3.2.0
Copy all required Hadoop packages to the Gateway:
    scp -r root@masterip:/opt/apps/extra-jars /opt/apps/
    scp -r root@masterip:/opt/apps/hadoop-2.7.2 /opt/apps/
    scp -r root@masterip:/opt/apps/apache-hive-2.0.1-bin /opt/apps/
    scp -r root@masterip:/opt/apps/spark-2.1.1-bin-hadoop2.7 /opt/apps/
    ln -s /opt/apps/hadoop-2.7.2 /usr/lib/hadoop-current
    ln -s /opt/apps/apache-hive-2.0.1-bin /usr/lib/hive-current
    ln -s /opt/apps/spark-2.1.1-bin-hadoop2.7 /usr/lib/spark-current
Copy the configuration files to the Gateway:
    scp -r root@masterip:/etc/emr/hadoop-conf-2.7.2 /etc/emr/hadoop-conf-2.7.2
    ln -s /etc/emr/hadoop-conf-2.7.2 /etc/emr/hadoop-conf
    scp -r root@masterip:/etc/emr/hive-conf-2.0.1 /etc/emr/hive-conf-2.0.1
    ln -s /etc/emr/hive-conf-2.0.1 /etc/emr/hive-conf
Copy the environment variables to the Gateway:
    scp root@masterip:/etc/profile.d/hadoop.sh /etc/profile.d/
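After copying, a quick sanity check of the symlinks can catch a missed step. This is a small sketch that only inspects the Hadoop and Spark links created above:

```shell
# Report each expected symlink's target, or flag it as missing.
report=""
for link in /usr/lib/hadoop-current /usr/lib/spark-current; do
  if [ -L "$link" ]; then
    report="${report}${link} -> $(readlink "$link")\n"
  else
    report="${report}missing symlink: ${link}\n"
  fi
done
printf "%b" "$report"
```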
- Versions EMR-3.2.0 and above
Copy all required Hadoop packages to the Gateway:
    scp -r root@masterip:/opt/apps/extra-jars /opt/apps/
    scp -r root@masterip:/opt/apps/ecm/service/hadoop/2.7.2/package/hadoop-2.7.2/ /opt/apps/
    scp -r root@masterip:/opt/apps/ecm/service/hive/2.0.1/package/apache-hive-2.0.1-bin/ /opt/apps/
    scp -r root@masterip:/opt/apps/ecm/service/spark/2.1.1/package/spark-2.1.1-bin-hadoop2.7/ /opt/apps/
    ln -s /opt/apps/hadoop-2.7.2 /usr/lib/hadoop-current
    ln -s /opt/apps/apache-hive-2.0.1-bin /usr/lib/hive-current
    ln -s /opt/apps/spark-2.1.1-bin-hadoop2.7 /usr/lib/spark-current
Copy the configuration files to the Gateway:
    scp -r root@masterip:/etc/ecm/hadoop-conf-2.7.2 /etc/ecm/hadoop-conf-2.7.2
    ln -s /etc/ecm/hadoop-conf-2.7.2 /etc/ecm/hadoop-conf
    scp -r root@masterip:/etc/ecm/hive-conf-2.0.1 /etc/ecm/hive-conf-2.0.1
    ln -s /etc/ecm/hive-conf-2.0.1 /etc/ecm/hive-conf
Copy the environment variables to the Gateway:
    scp root@masterip:/etc/profile.d/hdfs.sh /etc/profile.d/
    scp root@masterip:/etc/profile.d/yarn.sh /etc/profile.d/
    scp root@masterip:/etc/profile.d/hive.sh /etc/profile.d/
    scp root@masterip:/etc/profile.d/spark.sh /etc/profile.d/
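To confirm the environment is usable, you can load the copied profile scripts and check that the client commands resolve. A hedged sketch, using the paths copied above:

```shell
# Source the copied profile scripts if they exist, then report which
# Hadoop-ecosystem client commands are on the PATH.
for f in /etc/profile.d/hdfs.sh /etc/profile.d/yarn.sh /etc/profile.d/hive.sh /etc/profile.d/spark.sh; do
  if [ -f "$f" ]; then . "$f"; fi
done

status=""
for cmd in hadoop hive spark-submit; do
  if command -v "$cmd" >/dev/null 2>&1; then
    status="${status}found: ${cmd}\n"
  else
    status="${status}missing: ${cmd}\n"
  fi
done
printf "%b" "$status"
```

Any `missing` line means the corresponding package copy or profile script step above was skipped.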
Modify hosts
Modify the hosts configuration: copy the host entries from the cluster's master node into /etc/hosts on the Gateway, for example:
    #start add cluster host of cluster 22663,Mon May 30 19:21:51 CST 2016
    xx.yy.zz.tt emr-header-1.cluster-1212 emr-header-1 xxxxxxxx1
    xx.yy.zz.tt1 emr-worker-2.cluster-1212 emr-worker-2 emr-header-3 xxxxxxx2
    xx.yy.zz.tt2 emr-worker-1.cluster-1212 emr-worker-1 emr-header-2 xxxxxxxx3
    #end add cluster host
Once the steps above are done, the configuration is complete.
Test
- Hive
    [hadoop@iZ23bc05hrvZ ~]$ hive
    hive> show databases;
    OK
    default
    Time taken: 1.124 seconds, Fetched: 1 row(s)
    hive> create database school;
    OK
    Time taken: 0.362 seconds
    hive>
- Run a Hadoop job
    [hadoop@iZ23bc05hrvZ ~]$ hadoop jar /usr/lib/hadoop-current/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar pi 10 10
    Number of Maps  = 10
    Samples per Map = 10
    Wrote input for Map #0
    Wrote input for Map #1
    Wrote input for Map #2
    Wrote input for Map #3
    Wrote input for Map #4
    Wrote input for Map #5
    Wrote input for Map #6
    Wrote input for Map #7
    Wrote input for Map #8
    Wrote input for Map #9
    File Input Format Counters
        Bytes Read=1180
    File Output Format Counters
        Bytes Written=97
    Job Finished in 29.798 seconds
    Estimated value of Pi is 3.20000000000000000000
Last updated: 2017-06-14 21:06:14