hadoop - How to get Spark job status from a program?


I am aware that the Hadoop REST API provides programmatic access to job status.

Similarly, is there a way to get Spark job status from within a program?

It is not a REST API, but you can track the status of jobs inside your application by registering a SparkListener via SparkContext.addSparkListener. It goes like this:

sc.addSparkListener(new SparkListener {
  override def onStageCompleted(event: SparkListenerStageCompleted) = {
    if (event.stageInfo.stageId == myStage) {
      println(s"Stage $myStage done.")
    }
  }
})
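If you prefer polling over callbacks, SparkContext also exposes a SparkStatusTracker (sc.statusTracker) that you can query for job and stage progress. Below is a minimal sketch; the hard-coded job id 0 and the app/master settings are assumptions for illustration — in a real application you would obtain ids from getActiveJobIds or getJobIdsForGroup:

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setAppName("status-demo").setMaster("local[*]"))

// Query the status of a job by id (0 here is a hypothetical id).
sc.statusTracker.getJobInfo(0).foreach { job =>
  // status is one of RUNNING, SUCCEEDED, FAILED, UNKNOWN
  println(s"Job 0 status: ${job.status}")
  // Drill down into per-stage task progress.
  job.stageIds.flatMap(sc.statusTracker.getStageInfo(_)).foreach { stage =>
    println(s"  stage ${stage.stageId}: " +
      s"${stage.numCompletedTasks}/${stage.numTasks} tasks done")
  }
}

If you do want something REST-like, note that the Spark driver also serves a monitoring REST API from its web UI (port 4040 by default, under /api/v1), which exposes application, job, and stage status as JSON.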
