hadoop - How to get Spark job status from a program?
I am aware that Hadoop's REST API provides access to job status from a program.
Is there a similar way to get the status of a Spark job from within a program?
It is not quite a REST API, but you can track the status of jobs from inside your application by registering a SparkListener via
SparkContext.addSparkListener
. It goes something like this:
import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}

sc.addSparkListener(new SparkListener {
  // Fires whenever a stage finishes; myStage is the ID of the stage you want to watch.
  override def onStageCompleted(event: SparkListenerStageCompleted): Unit = {
    if (event.stageInfo.stageId == myStage) {
      println(s"Stage $myStage done.")
    }
  }
})
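Since the question asks about job status specifically, here is a minimal self-contained sketch (the object name and the sample RDD computation are illustrative, not from the original answer) that listens at the job level instead of a single stage, using the onJobStart and onJobEnd callbacks of SparkListener:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.scheduler.{SparkListener, SparkListenerJobEnd, SparkListenerJobStart}

object JobStatusExample {
  def main(args: Array[String]): Unit = {
    // Local master is just for demonstration; use your real cluster config.
    val sc = new SparkContext(new SparkConf().setAppName("job-status").setMaster("local[*]"))

    sc.addSparkListener(new SparkListener {
      // Called when the scheduler submits a job.
      override def onJobStart(event: SparkListenerJobStart): Unit =
        println(s"Job ${event.jobId} started with ${event.stageInfos.size} stages.")

      // Called when a job finishes; jobResult indicates success or failure.
      override def onJobEnd(event: SparkListenerJobEnd): Unit =
        println(s"Job ${event.jobId} finished: ${event.jobResult}")
    })

    // Any action triggers a job, which fires the callbacks above.
    sc.parallelize(1 to 100).map(_ * 2).count()
    sc.stop()
  }
}

Listener callbacks run on the driver's event bus, so keep them lightweight; do any heavy processing elsewhere.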