This article walks through how a Spark program is submitted in Standalone client mode. The explanation is short and straightforward; read on and we will work through the relevant source code step by step.
In standalone client mode, the Spark program is submitted through ClientApp.
This class is defined in deploy/Client.scala.
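For context, a submission against a standalone master might be issued roughly like this (the host, class, and jar names below are placeholders, and the exact code path that ends up being taken also depends on the deploy mode and REST-submission settings):

    spark-submit \
      --master spark://master-host:7077 \
      --class com.example.MyApp \
      /path/to/my-app.jar arg1 arg2

ClientApp itself looks like this: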
private[spark] class ClientApp extends SparkApplication {

  override def start(args: Array[String], conf: SparkConf): Unit = {
    val driverArgs = new ClientArguments(args)

    val rpcEnv =
      RpcEnv.create("driverClient", Utils.localHostName(), 0, conf, new SecurityManager(conf))

    val masterEndpoints = driverArgs.masters.map(RpcAddress.fromSparkURL).
      map(rpcEnv.setupEndpointRef(_, Master.ENDPOINT_NAME))
    rpcEnv.setupEndpoint("client",
      new ClientEndpoint(rpcEnv, driverArgs, masterEndpoints, conf))

    rpcEnv.awaitTermination()
  }
}
The code is simple: start parses the arguments, creates a ClientEndpoint, and then communicates with the Master over RPC.
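ClientArguments parses the command line into one of two commands, launch and kill, which drive the match in onStart below. Paraphrasing its usage message (the bracketed names are descriptive placeholders, not verbatim output):

    launch [options] <active-master> <jar-url> <main-class> [driver options]
    kill   [options] <active-master> <driver-id>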
The main functionality of ClientEndpoint lives in its onStart method:
override def onStart(): Unit = {
  driverArgs.cmd match {
    case "launch" =>
      // TODO: We could add an env variable here and intercept it in `sc.addJar` that would
      //       truncate filesystem paths similar to what YARN does. For now, we just require
      //       people call `addJar` assuming the jar is in the same directory.
      val mainClass = "org.apache.spark.deploy.worker.DriverWrapper"

      val classPathConf = config.DRIVER_CLASS_PATH.key
      val classPathEntries = getProperty(classPathConf, conf).toSeq.flatMap { cp =>
        cp.split(java.io.File.pathSeparator)
      }

      val libraryPathConf = config.DRIVER_LIBRARY_PATH.key
      val libraryPathEntries = getProperty(libraryPathConf, conf).toSeq.flatMap { cp =>
        cp.split(java.io.File.pathSeparator)
      }

      val extraJavaOptsConf = config.DRIVER_JAVA_OPTIONS.key
      val extraJavaOpts = getProperty(extraJavaOptsConf, conf)
        .map(Utils.splitCommandString).getOrElse(Seq.empty)

      val sparkJavaOpts = Utils.sparkJavaOpts(conf)
      val javaOpts = sparkJavaOpts ++ extraJavaOpts
      val command = new Command(mainClass,
        Seq("{{WORKER_URL}}", "{{USER_JAR}}", driverArgs.mainClass) ++ driverArgs.driverOptions,
        sys.env, classPathEntries, libraryPathEntries, javaOpts)
      val driverResourceReqs = ResourceUtils.parseResourceRequirements(conf,
        config.SPARK_DRIVER_PREFIX)
      val driverDescription = new DriverDescription(
        driverArgs.jarUrl,
        driverArgs.memory,
        driverArgs.cores,
        driverArgs.supervise,
        command,
        driverResourceReqs)
      asyncSendToMasterAndForwardReply[SubmitDriverResponse](
        RequestSubmitDriver(driverDescription))

    case "kill" =>
      val driverId = driverArgs.driverId
      asyncSendToMasterAndForwardReply[KillDriverResponse](RequestKillDriver(driverId))
  }
}
The launch path wraps the user program in a Command around org.apache.spark.deploy.worker.DriverWrapper, packs it into a DriverDescription, and sends a RequestSubmitDriver message to the Master; the Master then schedules this DriverWrapper to be launched on a Worker. DriverWrapper itself is quite simple and won't be covered in detail here: its job is to start a thread that invokes the main method of our Spark program.
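As a rough illustration only, not the actual Spark source, the essence of DriverWrapper can be sketched as "load the user jar, then invoke the user class's main method on a dedicated thread". The names userJarUrl, userMainClass, and userArgs below are illustrative placeholders:

    object DriverWrapperSketch {
      // A deliberately simplified sketch of what DriverWrapper effectively does.
      // The real class also creates an RpcEnv, monitors its Worker, and handles
      // dependency downloads before running the user code.
      def main(args: Array[String]): Unit = {
        val Array(userJarUrl, userMainClass, userArgs @ _*) = args

        // Make the user jar visible to the JVM (the real code uses Spark's
        // MutableURLClassLoader instead of a plain URLClassLoader).
        val loader = new java.net.URLClassLoader(
          Array(new java.net.URL(userJarUrl)), getClass.getClassLoader)

        // Locate the user's main(Array[String]) by reflection...
        val clazz = Class.forName(userMainClass, true, loader)
        val mainMethod = clazz.getMethod("main", classOf[Array[String]])

        // ...and run it on a dedicated thread; this is essentially how the
        // user's driver code ends up executing on the chosen Worker.
        val driverThread = new Thread("driver-main") {
          override def run(): Unit = mainMethod.invoke(null, userArgs.toArray)
        }
        driverThread.start()
        driverThread.join()
      }
    }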
Thanks for reading. That covers how a Spark program is submitted in Standalone client mode. Hopefully the walkthrough above gives you a deeper feel for the process; the details are best confirmed by trying it out in practice.