Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
@srowen
@jerryshao
I understand, thank you.
I changed the code in my project to keep the program consistent with the
latest example.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well.
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/17580
In some cases you have to call a Scala API from Java and need to create
Scala-specific classes like Tuple2.
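As an illustrative sketch of that point, this is roughly what constructing a Scala `Tuple2` from Java looks like. It assumes `scala-library` is on the classpath (as it is in any Spark project); the class and variable names are made up for the example:

```java
// Sketch only: constructing a Scala-specific class (Tuple2) from Java.
// Assumes scala-library is on the classpath, as in any Spark project.
import scala.Tuple2;

public class TupleFromJava {
  public static void main(String[] args) {
    // Tuple2 is an ordinary Scala class; Java can instantiate it directly.
    Tuple2<String, Integer> pair = new Tuple2<>("count", 42);
    // _1() and _2() are the accessors Scala generates for the two elements.
    System.out.println(pair._1() + " = " + pair._2()); // count = 42
  }
}
```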
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/17580
It is just Java8 lambda function, nothing related to Scala...
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
JavaDStream<String> lines = messages.map(new Function<Tuple2<String, String>, String>() {
  @Override
  public String call(Tuple2<String, String> tuple2) {
    return tuple2._2();
  }
});
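To illustrate the point being made in this thread: the anonymous-class form above, the equivalent lambda, and the method reference are all plain Java 8 syntax. A self-contained sketch, using the JDK's `Map.Entry` as a stand-in for `Tuple2` (which lives in `scala-library`) and `java.util.function.Function` as a stand-in for Spark's `Function` interface; class and variable names are illustrative:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;
import java.util.function.Function;

public class MethodRefDemo {
  public static void main(String[] args) {
    Map.Entry<String, String> entry = new SimpleEntry<>("topic", "hello");

    // 1. Anonymous inner class (pre-Java-8 style, as in the snippet above).
    Function<Map.Entry<String, String>, String> anon =
      new Function<Map.Entry<String, String>, String>() {
        @Override
        public String apply(Map.Entry<String, String> e) {
          return e.getValue();
        }
      };

    // 2. Lambda expression (Java 8).
    Function<Map.Entry<String, String>, String> lambda = e -> e.getValue();

    // 3. Method reference (Java 8) -- the same shape as Tuple2::_2.
    Function<Map.Entry<String, String>, String> ref = Map.Entry::getValue;

    // All three extract the value in exactly the same way.
    System.out.println(anon.apply(entry));   // hello
    System.out.println(lambda.apply(entry)); // hello
    System.out.println(ref.apply(entry));    // hello
  }
}
```

So `messages.map(Tuple2::_2)` references a method on a Scala class, but the `::` call-site syntax itself is Java, not Scala.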
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
![Uploading image.png…]()
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/17580
You mean this
[line](https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/streaming/JavaKafkaWordCount.java#L76)?
It's because our KafkaInputDStream
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
E.g.
messages.map(Tuple2::_2) is Scala-style code, but
JavaKafkaWordCount is a Java class.
I think the Java example program should be fully implemented in Java
code.
Wh
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/17580
What's the meaning of "some of the Scala"?
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
@jerryshao
Excuse me, I would like to ask: why are these Java examples written
with some Scala? Thank you!
SparkConf sparkConf = new SparkConf
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/17580
I think this is not worth bothering with for the reasons above, and can be
closed.
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
@jerryshao
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRe
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/17580
Can one of the admins verify this patch?
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/17580
I think I'd go with @jerryshao and not add it, since it's an old, auxiliary
example anyway.
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
Because of the API changes in Kafka, we do not want to delete it, but to
maintain and modify it.
Although it is absolutely Kafka producer code, it is part of Spark
Streaming, it i
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/17580
I would say this is not a Spark program; it is purely Kafka producer
code. Maintaining a Kafka producer example in Spark is not a good choice; this
is legacy code. Because of the API chang
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
When a user uses Spark to develop a streaming + Kafka application, he first
wants to find and learn from the example program in
'spark/examples/src/main/java/org/apache/spark/examples/st
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/17580
What is the purpose of adding this example? I think we already have a
`KafkaWordCountProducer` for the convenience of the Kafka streaming examples, and we
could use that to send events to Kafka. I thin
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
Sorry, the Spark Java code style is different from my project team's
style. Now I know, and it has been fixed.
Use 2-space indentation in general. For function declarations, use 4 s
Github user guoxiaolongzte commented on the issue:
https://github.com/apache/spark/pull/17580
The title, PR description, and motivation have been updated.