
Commit

change: 1. Update the README with instructions for submitting the Spark job to run on YARN. 2. Improve code comments.
Kyofin committed Jun 26, 2019
1 parent e636d39 commit 97b1f0d
Showing 2 changed files with 20 additions and 3 deletions.
18 changes: 17 additions & 1 deletion README.md
@@ -151,4 +151,20 @@ SparkSession spark = SparkSession
4. Open the Spark server UI to see the completed Spark job.
![](https://raw.githubusercontent.com/huzekang/picbed/master/20190626112849.png)

### Submitting the job to YARN
1. Do not specify a master for the SparkSession defined in the code:
```java
SparkSession spark = SparkSession
.builder()
.appName("Java Spark SQL Starter !!")
.enableHiveSupport()
.config("spark.some.config.option", "some-value")
.getOrCreate();
```

2. Package the job with `mvn clean package`, then submit it using the locally installed Spark distribution:
```
~/opt/spark-2.4.0-bin-hadoop2.7 » bin/spark-submit --master yarn --deploy-mode cluster --class "com.wugui.sparkstarter.SparkHiveNewVersion" /Users/huzekang/study/spark-starter/target/spark-starter-1.0-SNAPSHOT.jar
```
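
In practice you will usually also size the job for the YARN cluster. The flags below are standard `spark-submit` options; the values are only illustrative and not taken from this repository:
```
bin/spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class "com.wugui.sparkstarter.SparkHiveNewVersion" \
  --driver-memory 1g \
  --executor-memory 2g \
  --executor-cores 2 \
  --num-executors 4 \
  --queue default \
  /Users/huzekang/study/spark-starter/target/spark-starter-1.0-SNAPSHOT.jar
```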
3. Open the YARN UI and observe that the job has completed (a command-line check is sketched below the screenshot).
![](https://raw.githubusercontent.com/huzekang/picbed/master/20190626133707.png)
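
As an alternative to the web UI, the YARN CLI can confirm the same result; `<applicationId>` is a placeholder for the id reported by `spark-submit` or `yarn application -list`:
```
# list finished applications
yarn application -list -appStates FINISHED

# check the final status of a specific application
yarn application -status <applicationId>

# fetch the aggregated logs of the job
yarn logs -applicationId <applicationId>
```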
5 changes: 3 additions & 2 deletions src/main/java/com/wugui/sparkstarter/SparkHiveNewVersion.java
@@ -15,9 +15,10 @@ public static void main(String[] args) {
// Define the context
SparkSession spark = SparkSession
.builder()
// To submit to a remote Spark cluster, use spark://host:port
// If the job will be submitted as a jar to a remote Spark cluster, use spark://host:port
// .master("spark://10.0.0.50:7077")
// To submit to remote spark, use local
// Use local when testing in IDEA.
// Do not set a master when the job will be submitted to YARN as a jar.
.master("local")
.appName("Java Spark SQL Starter !!")
.enableHiveSupport()
