java - Submitting a Spark Streaming app through code
I am trying to submit a Spark Streaming application through code:
SparkConf sparkConf = new SparkConf();
JavaStreamingContext jssc = new JavaStreamingContext(master, appName, new Duration(60 * 1000), sparkHome, sparkJar);
I have given absolute paths for sparkJar and sparkHome, and the master is spark://xyz:7077.
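For reference, a minimal end-to-end sketch of this submission pattern looks roughly as follows; the class name, application name, and both paths are placeholder assumptions, not the values from my actual setup:

import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamingSubmit {
    public static void main(String[] args) {
        // Placeholder assumptions -- replace with real cluster values.
        String master = "spark://xyz:7077";
        String appName = "streaming-app";
        String sparkHome = "/opt/spark";                             // assumed Spark home
        String sparkJar = "/path/to/app-jar-with-dependencies.jar";  // assumed absolute path to the assembly jar

        // Spark 1.x constructor: (master, appName, batchDuration, sparkHome, jarFile);
        // the jar is shipped to the executors via the embedded HTTP file server.
        JavaStreamingContext jssc = new JavaStreamingContext(
                master, appName, new Duration(60 * 1000), sparkHome, sparkJar);

        // ... define input DStreams and at least one output operation here,
        // otherwise start() rejects the context as having nothing to execute ...

        jssc.start();
        jssc.awaitTermination();
    }
}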
I tried submitting a batch-processing job in the same way and it worked, but streaming does not work and fails with the following error:
14/11/26 17:42:25 INFO spark.HttpFileServer: HTTP File server directory is /var/folders/3j/9hjkw0890sx_qg9yvzlvg64cf5626b/t/spark-cd7b30cd-cf95-4e52-8eb4-1c1dccc2d58f
14/11/26 17:42:25 INFO spark.HttpServer: Starting HTTP Server
14/11/26 17:42:25 INFO server.Server: jetty-8.1.14.v20131031
14/11/26 17:42:25 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:50016
14/11/26 17:42:25 INFO server.Server: jetty-8.1.14.v20131031
14/11/26 17:42:25 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
14/11/26 17:42:25 INFO ui.SparkUI: Started SparkUI at http://xxx.xx.xxx.xx:4040
14/11/26 17:42:30 INFO spark.SparkContext: Added JAR /Volumes/official/workspace/zbi/target/zbi-0.0.1-SNAPSHOT-jar-with-dependencies.jar at http://xxx.xx.xxx.xx:50016/jars/zbi-0.0.1-SNAPSHOT-jar-with-dependencies.jar with timestamp 1417003949988
Exception in thread "main" java.lang.NoClassDefFoundError: **org/apache/spark/ui/SparkUITab**
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
I am using Maven, with the following pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>betatesttool</groupId>
    <artifactId>testtool</artifactId>
    <packaging>jar</packaging>
    <version>0.0.1-SNAPSHOT</version>
    <description></description>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.5</source>
                    <target>1.5</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-war-plugin</artifactId>
                <version>2.0.1</version>
            </plugin>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.3</version>
                <configuration>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
                <executions>
                    <execution>
                        <id>make-assembly</id> <!-- this is used for inheritance merges -->
                        <phase>package</phase> <!-- bind to the packaging phase -->
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        **<dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
            <version>2.5</version>
        </dependency>**
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.0.2</version>
        </dependency>
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.0.2</version>
        </dependency>
        <!-- <dependency> Spark dependency
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.0.2</version>
        </dependency> -->
        <dependency> <!-- Spark dependency -->
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming-kafka_2.10</artifactId>
            <version>1.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.5</version>
        </dependency>
    </dependencies>
</project>
With this pom.xml I got the following exception:
14/11/27 10:43:13 INFO spark.HttpFileServer: HTTP File server directory is /var/folders/3j/9hjkw0890sx_qg9yvzlvg64cf5626b/t/spark-b162a8c1-0d77-48db-b559-2b242449db3e
14/11/27 10:43:13 INFO spark.HttpServer: Starting HTTP Server
14/11/27 10:43:13 INFO server.Server: jetty-8.1.14.v20131031
14/11/27 10:43:13 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:62675
Exception in thread "main" java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
Then I commented out the javax.servlet dependency, and after that I got the first error mentioned above. Please suggest how to exclude the dependency; I tried setting the scope to compile and to provided, but neither worked.
Any help is appreciated.
My pom dependency tree is as follows:
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ zbi ---
[INFO] betabi:zbi:jar:0.0.1-SNAPSHOT
[INFO] \- org.apache.spark:spark-core_2.10:jar:1.0.2:compile
[INFO]    \- org.apache.hadoop:hadoop-client:jar:1.0.4:compile
[INFO]       \- org.apache.hadoop:hadoop-core:jar:1.0.4:compile
[INFO]          \- commons-configuration:commons-configuration:jar:1.6:compile
[INFO]             \- commons-collections:commons-collections:jar:3.2.1:compile
How do I exclude the javax.servlet classes that come in through the Hadoop dependency inside spark-core?
It seems the Spark Streaming dependency is missing from your pom.xml:
<dependency> <!-- Spark Streaming dependency -->
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.0.2</version>
</dependency>
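As for the javax.servlet signer conflict: the usual Maven fix is an exclusion on the dependency that drags the servlet classes in, rather than a scope change. Below is a sketch that assumes, per the dependency tree in the question, that the conflicting classes arrive transitively through spark-core's hadoop-client/hadoop-core chain; verify the actual path first with mvn dependency:tree -Dincludes=javax.servlet, which narrows the tree to just those artifacts.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.2</version>
    <exclusions>
        <!-- Assumption: the conflicting javax.servlet classes come in
             transitively via hadoop-client/hadoop-core (see the tree above). -->
        <exclusion>
            <groupId>javax.servlet</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>

From Maven 3.2.1 onward a wildcard exclusion (artifactId set to *) is also allowed, which helps if several servlet artifacts leak in from the same group.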