Re: Spark Context not getting initialized in local mode

2016-01-08 Thread Dean Wampler
ClassNotFoundException usually means one of a few problems:

1. Your app assembly is missing the jar files with those classes.
2. You mixed jar files from incompatible versions in your assembly.
3. You built with one version of Spark and deployed to another.
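
A quick way to rule the first case in or out is a classpath probe such as the rough sketch below. It is only an illustration (the helper class and its messages are made up; only the missing class name comes from the stack trace in this thread). Run it through the same context classloader that loads your plugin:

public final class RpcFactoryCheck {

    public static void main(String[] args) {
        // The class Spark's RpcEnv factory lookup failed to find in the trace.
        String className = "org.apache.spark.rpc.akka.AkkaRpcEnvFactory";
        ClassLoader loader = Thread.currentThread().getContextClassLoader();
        try {
            // Load without initializing, just to test visibility.
            Class.forName(className, false, loader);
            System.out.println(className + " is visible to " + loader);
        } catch (ClassNotFoundException e) {
            System.out.println(className + " is NOT visible to " + loader
                    + " -- the assembly is probably missing the Spark core classes");
        }
    }
}

If it reports the class as missing, that points at case 1 (or case 3, if the jar actually on the classpath comes from a different Spark version).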


Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition (O'Reilly)
Typesafe
@deanwampler
http://polyglotprogramming.com

On Fri, Jan 8, 2016 at 1:24 AM, Rahul Kumar wrote:

> Hi all,
> I am trying to start Solr with a custom plugin that uses the Spark library. I
> am trying to initialize a SparkContext in local mode. I have built a fat jar
> for this plugin using the Maven Shade plugin and put it in the lib directory.
> *While starting Solr it is not able to initialize the SparkContext.* It throws
> a ClassNotFoundException for AkkaRpcEnvFactory. Can anyone please help?

Spark Context not getting initialized in local mode

2016-01-07 Thread Rahul Kumar
Hi all,
I am trying to start Solr with a custom plugin that uses the Spark library. I
am trying to initialize a SparkContext in local mode. I have built a fat jar
for this plugin using the Maven Shade plugin and put it in the lib directory.
*While starting Solr it is not able to initialize the SparkContext.* It throws
a ClassNotFoundException for AkkaRpcEnvFactory. Can anyone please help?
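
The initialization that fails is in SparkLoadModel.loadModel (line 11 in the trace below). Roughly, it does something like the following simplified sketch; the master URL and app name here are placeholders, not the exact values from my code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkLoadModel {

    public static JavaSparkContext loadModel() {
        // Local mode: run the Spark driver inside the Solr JVM.
        SparkConf conf = new SparkConf()
                .setMaster("local[*]")
                .setAppName("ranking-model-plugin"); // placeholder name
        // Creating the context is what triggers the RpcEnv factory lookup
        // that fails with the ClassNotFoundException shown below.
        return new JavaSparkContext(conf);
    }
}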

*It gives the following error:*

3870 [coreLoadExecutor-4-thread-1] ERROR org.apache.spark.SparkContext  – Error initializing SparkContext.
java.lang.ClassNotFoundException: org.apache.spark.rpc.akka.AkkaRpcEnvFactory

*Here is the detailed error*

java -jar start.jar

0    [main] INFO  org.eclipse.jetty.server.Server  – jetty-8.1.10.v20130312
27   [main] INFO  org.eclipse.jetty.deploy.providers.ScanningAppProvider  – Deployment monitor /home/rahul/solr-4.7.2/example/contexts at interval 0
40   [main] INFO  org.eclipse.jetty.deploy.DeploymentManager  – Deployable added: /home/rahul/solr-4.7.2/example/contexts/solr-jetty-context.xml
1095 [main] INFO  org.eclipse.jetty.webapp.StandardDescriptorProcessor  – NO JSP Support for /solr, did not find org.apache.jasper.servlet.JspServlet
1155 [main] INFO  org.apache.solr.servlet.SolrDispatchFilter  – SolrDispatchFilter.init()
1189 [main] INFO  org.apache.solr.core.SolrResourceLoader  – JNDI not configured for solr (NoInitialContextEx)
1190 [main] INFO  org.apache.solr.core.SolrResourceLoader  – solr home defaulted to 'solr/' (could not find system property or JNDI)
1190 [main] INFO  org.apache.solr.core.SolrResourceLoader  – new SolrResourceLoader for directory: 'solr/'
1280 [main] INFO  org.apache.solr.core.ConfigSolr  – Loading container configuration from /home/rahul/solr-4.7.2/example/solr/solr.xml
1458 [main] INFO  org.apache.solr.core.CoresLocator  – Config-defined core root directory: /home/rahul/solr-4.7.2/example/solr
1465 [main] INFO  org.apache.solr.core.CoreContainer  – New CoreContainer 602710225
...
3870 [coreLoadExecutor-4-thread-1] ERROR org.apache.spark.SparkContext  – Error initializing SparkContext.
java.lang.ClassNotFoundException: org.apache.spark.rpc.akka.AkkaRpcEnvFactory
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:430)
	at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:383)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:274)
	at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
	at org.apache.spark.rpc.RpcEnv$.getRpcEnvFactory(RpcEnv.scala:42)
	at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:53)
	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)
	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:193)
	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:276)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:441)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
	at com.snapdeal.search.spark.SparkLoadModel.loadModel(SparkLoadModel.java:11)
	at com.snapdeal.search.valuesource.parser.RankingModelValueSourceParser.init(RankingModelValueSourceParser.java:29)
	at org.apache.solr.core.SolrCore.createInitInstance(SolrCore.java:591)
	at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:2191)
	at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:2185)
	at org.apache.solr.core.SolrCore.initPlugins(SolrCore.java:2218)
	at org.apache.solr.core.SolrCore.initValueSourceParsers(SolrCore.java:2130)
	at org.apache.solr.core.SolrCore.<init>(SolrCore.java:765)
	at org.apache.solr.core.SolrCore.<init>(SolrCore.java:630)
	at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:562)
	at org.apache.solr.core.CoreContainer.create(CoreContainer.java:597)
	at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:258)
	at org.apache.solr.core.CoreContainer$1.call(CoreContainer.java:250)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
3880 [coreLoadExecutor-4-thread-1] INFO  org.apache.spark.SparkContext  – Successfully stopped SparkContext





Rahul Kumar
*Software Engineer- I (Search Snapdeal)*

*M*: +91 9023542950  *EXT*: 14226
362-363, ASF CENTRE , UDYOG VIHAR , PHASE - IV , GURGAON 122 016 ,