When Spark Job Server starts, it keeps a resident SparkSubmit process; creating a context makes use of this process, so there is no need to submit jobs manually with spark-submit.
1. This post mainly covers submitting and scheduling jobs with Spark Job Server
(1) Create a context
curl -d "" 'localhost:8090/contexts/test-context?num-cpu-cores=4&memory-per-node=512m'
The POST request above can also be sent to http://localhost:8090/contexts/test-context (e.g. simulated with Postman)
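The context endpoint is a plain POST with the resource settings passed as query parameters. A minimal sketch of assembling that URL (the helper name and base address are illustrative, not part of the Job Server API):

```python
from urllib.parse import urlencode

# Hypothetical helper: builds the context-creation URL used by the
# curl call above (POST /contexts/<name>?<resource params>).
def context_url(base, name, params=None):
    query = urlencode(params or {})
    return f"{base}/contexts/{name}" + (f"?{query}" if query else "")

print(context_url("http://localhost:8090", "test-context",
                  {"num-cpu-cores": 4, "memory-per-node": "512m"}))
# → http://localhost:8090/contexts/test-context?num-cpu-cores=4&memory-per-node=512m
```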
(2) Upload a jar to the Job Server
curl --data-binary @/Users/Desktop/SparkTest.jar localhost:8090/jars/sparkTest
*localhost:8090/jars/appName — the jar must be uploaded under an alias (appName, here sparkTest)
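The upload is just a raw POST of the jar bytes to /jars/&lt;alias&gt;. A small sketch that assembles the equivalent curl invocation (the helper is hypothetical; the path and alias are the ones from the example above):

```python
# Hypothetical helper: builds the curl command for uploading a jar
# under a given alias (appName), matching the call shown above.
def jar_upload_cmd(base, app_name, jar_path):
    return f"curl --data-binary @{jar_path} {base}/jars/{app_name}"

print(jar_upload_cmd("localhost:8090", "sparkTest", "/Users/Desktop/SparkTest.jar"))
# → curl --data-binary @/Users/Desktop/SparkTest.jar localhost:8090/jars/sparkTest
```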
(3) Run the jar
curl -d "input.string = a b c a b see" 'localhost:8090/jobs?appName=sparkTest&classPath=spark.jobserver.WordCountExample&context=test-context&sync=true'
{
  "result": {
    "a": 2,
    "b": 2,
    "c": 1,
    "see": 1
  }
}
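With sync=true the job result comes back directly in the response body. A quick sketch of reading it (the JSON is copied from the response above):

```python
import json

# Response body returned by the synchronous word-count call above
body = '{"result": {"a": 2, "b": 2, "c": 1, "see": 1}}'

counts = json.loads(body)["result"]
# "a b c a b see" contains 6 tokens in total
print(sum(counts.values()))  # → 6
print(counts["a"])           # → 2
```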
2. For installing Spark Job Server, refer to the guide on GitHub
https://github.com/spark-jobserver/spark-jobserver/blob/master/doc/chinese/job-server.md
There are two deployment scripts:
server_deploy.sh — deploys the job server to a directory on a remote host.
server_package.sh — packages the job server into a local directory, from which you can deploy the directory yourself, or create a .tar.gz for Mesos or YARN deployment.
(1) Copy conf/local.sh.template to local.sh and set its SPARK_HOME parameter according to your local environment.
(2) Copy config/local.conf.template to <environment>.conf and set the spark home parameter ($SPARK_HOME) there as well.
(3) Run the packaging command: bin/server_package.sh local