In Java, javaSparkContext would have to be declared final in order to
be accessed inside an anonymous inner class like this. But even then it
would not work: the function passed to foreach() is serialized and
shipped to the executors, and the SparkContext is not serializable. You
should rewrite this so that you are not attempting to use the Spark
context inside an RDD operation.
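
For example, a minimal sketch that reuses the connector call from your
snippet (assuming the DataStax spark-cassandra-connector Java API; the
cartesian step is just one illustrative way to combine the two RDDs
afterwards):

    // Build the Cassandra RDD once, on the driver, outside any
    // RDD operation -- this is where the context is available.
    JavaRDD<String> vals = javaFunctions(javaSparkContext)
            .cassandraTable("schema", "table", String.class)
            .select("val");

    // Then combine the two RDDs with ordinary transformations
    // (join, cartesian, etc.) instead of querying Cassandra from
    // inside foreach(). Note cartesian() can be expensive on
    // large RDDs; pick the transformation that fits your data.
    JavaPairRDD<String, String> pairs = stringRdd.cartesian(vals);

If the Cassandra table is small, another option is to collect it to the
driver and broadcast the values to the workers.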

On Thu, Oct 23, 2014 at 8:46 AM, Localhost shell
<universal.localh...@gmail.com> wrote:
> Hey All,
>
> I am unable to access objects declared and initialized outside the call()
> method of JavaRDD.
>
> In the code snippet below, the call() method makes a fetch call to C*,
> but since javaSparkContext is defined outside the scope of the call()
> method, the compiler gives a compilation error.
>
> stringRdd.foreach(new VoidFunction<String>() {
>     @Override
>     public void call(String str) throws Exception {
>         JavaRDD<String> vals = javaFunctions(javaSparkContext)
>                 .cassandraTable("schema", "table", String.class)
>                 .select("val");
>     }
> });
>
> In other languages I have used closures for this, but I am not able to
> achieve the same here.
>
> Can someone suggest how to achieve this in the current code context?
>
>
> --Unilocal
>
