[jira] [Closed] (SPARK-24086) Exception while executing spark streaming examples

2018-04-27 Thread Chandra Hasan (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-24086?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Chandra Hasan closed SPARK-24086.
-

After adding the necessary dependencies, it's working fine.


[jira] [Comment Edited] (SPARK-24086) Exception while executing spark streaming examples

2018-04-27 Thread Chandra Hasan (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-24086?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16456003#comment-16456003
 ] 

Chandra Hasan edited comment on SPARK-24086 at 4/27/18 6:47 AM:


[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing 
and it's working now.
If someone is facing the same issue, here is the solution:
{code:java}
spark-submit --jars 
kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar
 --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount 
target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar  
{code}
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in 
the JavaDirectKafkaWordCount example are not up to date, which throws a 
missing-configuration error, so I had to rewrite the code as below:

{code:java}
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("value.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("group.id", "");{code}
 

What do you say, is it fine or do I need to open a bug for this?
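
For context, here is a minimal, self-contained sketch of how the rewritten consumer properties above would plug into the spark-streaming-kafka-0-10 direct stream API. The broker address and topic name are taken from the run-example invocation in this report; the class name, group id, and batch interval are arbitrary placeholders, and printing the raw values stands in for the example's word-count logic.
{code:java}
import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

public class DirectKafkaParamsSketch {
  public static void main(String[] args) throws InterruptedException {
    // Placeholder values: broker and topic from the report's run-example call,
    // group id chosen arbitrarily for this sketch.
    String brokers = "192.168.0.4:9092";
    String topic = "msu";
    String groupId = "word-count-example";

    SparkConf conf = new SparkConf().setAppName("DirectKafkaParamsSketch");
    JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(2));

    // Consumer properties as rewritten above; the 0-10 integration expects
    // a Map<String, Object>.
    Map<String, Object> kafkaParams = new HashMap<>();
    kafkaParams.put("bootstrap.servers", brokers);
    kafkaParams.put("key.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");
    kafkaParams.put("value.deserializer",
        "org.apache.kafka.common.serialization.StringDeserializer");
    kafkaParams.put("group.id", groupId);

    // Direct (receiver-less) stream subscribed to the topic.
    JavaInputDStream<ConsumerRecord<String, String>> messages =
        KafkaUtils.createDirectStream(
            jssc,
            LocationStrategies.PreferConsistent(),
            ConsumerStrategies.<String, String>Subscribe(
                Arrays.asList(topic), kafkaParams));

    // Print each batch's record values as a stand-in for the word count.
    messages.map(ConsumerRecord::value).print();

    jssc.start();
    jssc.awaitTermination();
  }
}
{code}
Submitting a class like this would still need the same --jars (or otherwise packaged Kafka dependencies) as in the spark-submit command shown above.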

 


was (Author: hasan4791):
[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing 
and it's working now.
If someone is facing the same issue, here is the solution:
{code:java}
spark-submit --jars 
kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar
 --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount 
target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar  
{code}
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in 
the JavaDirectKafkaWordCount example are not up to date, which throws a 
missing-configuration error, so I had to rewrite the code as below:

{code:java}
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("value.deserializer", 
org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("group.id", "");{code}
 

What do you say, is it fine or do I need to open a bug for this?

 


[jira] [Comment Edited] (SPARK-24086) Exception while executing spark streaming examples

2018-04-27 Thread Chandra Hasan (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-24086?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16456003#comment-16456003
 ] 

Chandra Hasan edited comment on SPARK-24086 at 4/27/18 6:47 AM:


[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing 
and it's working now.
If someone is facing the same issue, here is the solution:
{code:java}
spark-submit --jars 
kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar
 --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount 
target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar  
{code}
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in 
the JavaDirectKafkaWordCount example are not up to date, which throws a 
missing-configuration error, so I had to rewrite the code as below:
{code:java}
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("value.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("group.id", "");{code}
 

What do you say, is it fine or do I need to open a bug for this?

 


was (Author: hasan4791):
[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing 
and it's working now.
If someone is facing the same issue, here is the solution:
{code:java}
spark-submit --jars 
kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar
 --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount 
target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar  
{code}
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in 
the JavaDirectKafkaWordCount example are not up to date, which throws a 
missing-configuration error, so I had to rewrite the code as below:

{code:java}
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("value.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("group.id", "");{code}
 

What do you say, is it fine or do I need to open a bug for this?

 


[jira] [Comment Edited] (SPARK-24086) Exception while executing spark streaming examples

2018-04-27 Thread Chandra Hasan (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-24086?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16456003#comment-16456003
 ] 

Chandra Hasan edited comment on SPARK-24086 at 4/27/18 6:46 AM:


[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing 
and it's working now.
If someone is facing the same issue, here is the solution:
{code:java}
spark-submit --jars 
kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar
 --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount 
target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar  
{code}
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in 
the JavaDirectKafkaWordCount example are not up to date, which throws a 
missing-configuration error, so I had to rewrite the code as below:

{code:java}
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("value.deserializer", 
org.apache.kafka.common.serialization.StringDeserializer");
kafkaParams.put("group.id", "");{code}
 

What do you say, is it fine or do I need to open a bug for this?

 


was (Author: hasan4791):
[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing 
and it's working now.
If someone is facing the same issue, here is the solution:
{code:java}
spark-submit --jars 
kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar
 --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount 
target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar  
{code}
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in 
the JavaDirectKafkaWordCount example are not up to date, which throws a 
missing-configuration error, so I had to rewrite the code as below:

{code:java}
kafkaParams.put("bootstrap.servers", brokers); 
kafkaParams.put("key.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer"); 
kafkaParams.put("value.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer"); 
kafkaParams.put("group.id", "");{code}
 

What do you say, is it fine or do I need to open a bug for this?

 


[jira] [Commented] (SPARK-24086) Exception while executing spark streaming examples

2018-04-27 Thread Chandra Hasan (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-24086?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16456003#comment-16456003
 ] 

Chandra Hasan commented on SPARK-24086:
---

[~hyukjin.kwon] Thanks mate, I included the necessary dependencies while executing 
and it's working now.
If someone is facing the same issue, here is the solution:
{code:java}
spark-submit --jars 
kafka-clients-1.1.0.jar,spark-streaming_2.11-2.3.0.jar,spark-streaming-kafka-0-10_2.11-2.3.0.jar
 --class org.apache.spark.examples.streaming.JavaDirectKafkaWordCount 
target/original-spark-examples_2.11-2.4.0-SNAPSHOT.jar  
{code}
 

Also [~hyukjin.kwon], I would like to point out that the consumer properties in 
the JavaDirectKafkaWordCount example are not up to date, which throws a 
missing-configuration error, so I had to rewrite the code as below:

{code:java}
kafkaParams.put("bootstrap.servers", brokers); 
kafkaParams.put("key.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer"); 
kafkaParams.put("value.deserializer", 
"org.apache.kafka.common.serialization.StringDeserializer"); 
kafkaParams.put("group.id", "");{code}
 

What do you say, is it fine or do I need to open a bug for this?

 


[jira] [Created] (SPARK-24086) Exception while executing spark streaming examples

2018-04-25 Thread Chandra Hasan (JIRA)
Chandra Hasan created SPARK-24086:
-

 Summary: Exception while executing spark streaming examples
 Key: SPARK-24086
 URL: https://issues.apache.org/jira/browse/SPARK-24086
 Project: Spark
  Issue Type: Bug
  Components: Examples
Affects Versions: 2.3.0
Reporter: Chandra Hasan


After running mvn clean package, I tried to execute one of the Spark example 
programs, JavaDirectKafkaWordCount.java, but it throws the following exception.
{code:java}
[cloud-user@server-2 examples]$ run-example streaming.JavaDirectKafkaWordCount 
192.168.0.4:9092 msu
2018-04-25 09:39:22 WARN NativeCodeLoader:62 - Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable
2018-04-25 09:39:22 INFO SparkContext:54 - Running Spark version 2.3.0
2018-04-25 09:39:22 INFO SparkContext:54 - Submitted application: 
JavaDirectKafkaWordCount
2018-04-25 09:39:22 INFO SecurityManager:54 - Changing view acls to: cloud-user
2018-04-25 09:39:22 INFO SecurityManager:54 - Changing modify acls to: 
cloud-user
2018-04-25 09:39:22 INFO SecurityManager:54 - Changing view acls groups to:
2018-04-25 09:39:22 INFO SecurityManager:54 - Changing modify acls groups to:
2018-04-25 09:39:22 INFO SecurityManager:54 - SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(cloud-user); 
groups with view permissions: Set(); users with modify permissions: 
Set(cloud-user); groups with modify permissions: Set()
2018-04-25 09:39:23 INFO Utils:54 - Successfully started service 'sparkDriver' 
on port 59333.
2018-04-25 09:39:23 INFO SparkEnv:54 - Registering MapOutputTracker
2018-04-25 09:39:23 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-04-25 09:39:23 INFO BlockManagerMasterEndpoint:54 - Using 
org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-04-25 09:39:23 INFO BlockManagerMasterEndpoint:54 - 
BlockManagerMasterEndpoint up
2018-04-25 09:39:23 INFO DiskBlockManager:54 - Created local directory at 
/tmp/blockmgr-6fc11fc1-f638-42ea-a9df-dc01fb81b7b6
2018-04-25 09:39:23 INFO MemoryStore:54 - MemoryStore started with capacity 
366.3 MB
2018-04-25 09:39:23 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-04-25 09:39:23 INFO log:192 - Logging initialized @1825ms
2018-04-25 09:39:23 INFO Server:346 - jetty-9.3.z-SNAPSHOT
2018-04-25 09:39:23 INFO Server:414 - Started @1900ms
2018-04-25 09:39:23 INFO AbstractConnector:278 - Started 
ServerConnector@6813a331{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-04-25 09:39:23 INFO Utils:54 - Successfully started service 'SparkUI' on 
port 4040.
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@4f7c0be3{/jobs,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@4cfbaf4{/jobs/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@58faa93b{/jobs/job,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@127d7908{/jobs/job/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@6b9c69a9{/stages,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@6622a690{/stages/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@30b9eadd{/stages/stage,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@3249a1ce{/stages/stage/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@4dd94a58{/stages/pool,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@2f4919b0{/stages/pool/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@a8a8b75{/storage,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@75b21c3b{/storage/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@72be135f{/storage/rdd,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@155d1021{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@4bd2f0dc{/environment,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@2e647e59{/environment/json,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started 
o.s.j.s.ServletContextHandler@2c42b421{/executors,null,AVAILABLE,@Spark}
2018-04-25 09:39:23 INFO ContextHandler:781 - Started