Hi

I have added the spark-streaming-kafka assembly jar to SPARK_CLASSPATH:

>>> print os.environ['SPARK_CLASSPATH']
D:\sw\spark-streaming-kafka-assembly_2.10-1.3.0.jar
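For completeness, I set the variable from the Windows command prompt before starting the shell, roughly as below (the exact pyspark launch line is reconstructed, only the jar path above is verbatim):

set SPARK_CLASSPATH=D:\sw\spark-streaming-kafka-assembly_2.10-1.3.0.jar
D:\sw\spark-1.5.0-bin-hadoop2.6\spark-1.5.0-bin-hadoop2.6\bin\pyspark.cmd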


Now I am facing the issue below with a test topic:

>>> ssc = StreamingContext(sc, 2)
>>> kvs =
KafkaUtils.createDirectStream(ssc,['spark'],{"metadata.broker.list":'l
ocalhost:9092'})
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "D:\sw\spark-1.5.0-bin-hadoop2.6\spark-1.5.0-bin-hadoop2.6\python\pyspark\streaming\kafka.py", line 126, in createDirectStream
    jstream = helper.createDirectStream(ssc._jssc, kafkaParams, set(topics), jfromOffsets)
  File "D:\sw\spark-1.5.0-bin-hadoop2.6\spark-1.5.0-bin-hadoop2.6\python\lib\py4j-0.8.2.1-src.zip\py4j\java_gateway.py", line 538, in __call__
  File "D:\sw\spark-1.5.0-bin-hadoop2.6\spark-1.5.0-bin-hadoop2.6\python\pyspark\sql\utils.py", line 36, in deco
    return f(*a, **kw)
  File "D:\sw\spark-1.5.0-bin-hadoop2.6\spark-1.5.0-bin-hadoop2.6\python\lib\py4j-0.8.2.1-src.zip\py4j\protocol.py", line 304, in get_return_value
py4j.protocol.Py4JError: An error occurred while calling o22.createDirectStream. Trace:
py4j.Py4JException: Method createDirectStream([class org.apache.spark.streaming.api.java.JavaStreamingContext, class java.util.HashMap, class java.util.HashSet, class java.util.HashMap]) does not exist
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:333)
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:342)
        at py4j.Gateway.invoke(Gateway.java:252)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:207)
        at java.lang.Thread.run(Unknown Source)


>>>

Am I doing something wrong?
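For reference, the full sequence I run in the pyspark shell is roughly the following (a minimal sketch of my session; the two imports are what I believe are needed, and 'spark' is just my test topic on a local broker):

from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

# sc is the SparkContext already created by the pyspark shell
ssc = StreamingContext(sc, 2)  # 2-second batch interval

# direct (receiver-less) stream reading the 'spark' topic from the local broker
kvs = KafkaUtils.createDirectStream(ssc, ['spark'], {"metadata.broker.list": 'localhost:9092'})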


-- 
Best Regards,
Ayan Guha
