Thank you!
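
For the archives: the :cp command replays the whole session on a fresh
classpath and drops the sc binding in the process (see Aaron's SI-6502 link
below), so one workaround is to put the jars on the classpath before the
shell starts rather than afterwards. If sc has already been wiped, it can
also be rebuilt by hand. A rough sketch, assuming the spark.SparkContext
constructor of this era; the master URL and job name are placeholders:

    // Recreate the context the shell normally provides.
    // "local[2]" and "repl-recovery" are placeholders -- point the
    // master at your actual cluster instead.
    import spark.SparkContext
    val sc = new SparkContext("local[2]", "repl-recovery")
    sc.parallelize(List(1, 2, 3)).count()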

On Tue, Apr 15, 2014 at 11:29 AM, Aaron Davidson <ilike...@gmail.com> wrote:

> This is probably related to the known Scala REPL bug where :cp does not work:
> https://issues.scala-lang.org/browse/SI-6502
>
>
> On Tue, Apr 15, 2014 at 11:21 AM, Walrus theCat <walrusthe...@gmail.com> wrote:
>
>> Actually, altering the classpath in the REPL causes the provided
>> SparkContext to disappear:
>>
>> scala> sc.parallelize(List(1,2,3))
>> res0: spark.RDD[Int] = ParallelCollectionRDD[0] at parallelize at
>> <console>:13
>>
>> scala> :cp /root
>> Added '/root'.  Your new classpath is:
>>
>> ":/root/jars/aspectjrt.jar:/root/jars/aspectjweaver.jar:/root/jars/aws-java-sdk-1.4.5.jar:/root/jars/aws-java-sdk-1.4.5-javadoc.jar:/root/jars/aws-java-sdk-1.4.5-sources.jar:/root/jars/aws-java-sdk-flow-build-tools-1.4.5.jar:/root/jars/commons-codec-1.3.jar:/root/jars/commons-logging-1.1.1.jar:/root/jars/freemarker-2.3.18.jar:/root/jars/httpclient-4.1.1.jar:/root/jars/httpcore-4.1.jar:/root/jars/jackson-core-asl-1.8.7.jar:/root/jars/mail-1.4.3.jar:/root/jars/spring-beans-3.0.7.jar:/root/jars/spring-context-3.0.7.jar:/root/jars/spring-core-3.0.7.jar:/root/jars/stax-1.2.0.jar:/root/jars/stax-api-1.0.1.jar:/root/spark/conf:/root/spark/core/target/scala-2.9.3/classes:/root/spark/core/src/main/resources:/root/spark/repl/target/scala-2.9.3/classes:/root/spark/examples/target/scala-2.9.3/classes:/root/spark/streaming/target/scala-2.9.3/classes:/root/spark/streaming/lib/org/apache/kafka/kafka/0.7.2-spark/*:/root/spark/lib_managed/jars/*:/root/spark/lib_managed/bundles/*:/root/spark/repl/lib/*:/root/spark/bagel/target/scala-2.9.3/classes:/root/spark/python/lib/py4j0.7.jar:/root"
>> 14/04/15 18:19:37 INFO server.Server: jetty-7.6.8.v20121106
>> 14/04/15 18:19:37 INFO server.AbstractConnector: Started
>> SocketConnector@0.0.0.0:48978
>> Replaying: sc.parallelize(List(1,2,3))
>> <console>:8: error: not found: value sc
>>        sc.parallelize(List(1,2,3))
>>
>>
>>
>> On Mon, Apr 14, 2014 at 7:51 PM, Walrus theCat <walrusthe...@gmail.com> wrote:
>>
>>> Never mind -- I'm about 90% sure the problem is that I'm importing something
>>> that declares its own SparkContext as sc, shadowing the shell's. If it's not,
>>> I'll report back.
>>>
>>>
>>> On Mon, Apr 14, 2014 at 2:55 PM, Walrus theCat <walrusthe...@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> Using the spark-shell, I can't call sc.parallelize to get an RDD.
>>>>
>>>> Looks like a bug.
>>>>
>>>> scala> sc.parallelize(Array("a","s","d"))
>>>> java.lang.NullPointerException
>>>>     at <init>(<console>:17)
>>>>     at <init>(<console>:22)
>>>>     at <init>(<console>:24)
>>>>     at <init>(<console>:26)
>>>>     at <init>(<console>:28)
>>>>     at <init>(<console>:30)
>>>>     at <init>(<console>:32)
>>>>     at <init>(<console>:34)
>>>>     at <init>(<console>:36)
>>>>     at .<init>(<console>:40)
>>>>     at .<clinit>(<console>)
>>>>     at .<init>(<console>:11)
>>>>     at .<clinit>(<console>)
>>>>     at $export(<console>)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>     at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>>     at spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:629)
>>>>     at
>>>> spark.repl.SparkIMain$Request$$anonfun$10.apply(SparkIMain.scala:890)
>>>>     at
>>>> scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
>>>>     at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
>>>>     at java.lang.Thread.run(Thread.java:744)
>>>>
>>>
>>>
>>
>
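
For completeness, the earlier NullPointerException is consistent with the
shadowing theory: a later import that brings in another binding named sc
hides the shell's working one, and if that binding is null, any call on it
throws. A hypothetical reproduction (MyImports and its null sc are invented
for illustration):

    // Invented object whose null sc shadows the shell's binding once imported.
    object MyImports {
      var sc: spark.SparkContext = null
    }
    import MyImports._
    sc.parallelize(Array("a", "s", "d"))  // NPE: this sc is null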
