I haven't learned Scala yet, so as you might imagine I'm having trouble 
working with Spark from the Java API. For one thing, the Java API seems very 
limited in comparison to the Scala one. I ran into a problem almost 
immediately: I need to hydrate an RDD from JDBC/Oracle, so I wanted to use 
JdbcRDD. But that class is part of the Scala side of the Spark API, and I'm 
unable to get the Java compiler to accept the various parameters its 
constructor expects. 
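
For the record, here is roughly where I ended up. This is only a sketch, 
assuming the Spark 1.x JdbcRDD constructor; the Oracle URL, credentials, 
query, and bounds are all placeholders. The point is the amount of Scala 
plumbing (function wrappers, an explicit ClassTag) a Java caller apparently 
has to supply by hand: 

    import java.io.Serializable;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    import org.apache.spark.SparkContext;
    import org.apache.spark.rdd.JdbcRDD;

    import scala.reflect.ClassTag;
    import scala.reflect.ClassTag$;
    import scala.runtime.AbstractFunction0;
    import scala.runtime.AbstractFunction1;

    public class JdbcRddFromJava {

      // Function0 wrapper that opens the JDBC connection on each worker.
      // Must be Serializable because it ships inside the task closure.
      static class ConnectionFactory extends AbstractFunction0<Connection>
          implements Serializable {
        @Override
        public Connection apply() {
          try {
            // Placeholder Oracle URL and credentials.
            return DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password");
          } catch (SQLException e) {
            throw new RuntimeException(e);
          }
        }
      }

      // Function1 wrapper that maps each ResultSet row to a value.
      static class RowMapper extends AbstractFunction1<ResultSet, String>
          implements Serializable {
        @Override
        public String apply(ResultSet rs) {
          try {
            return rs.getString(1);
          } catch (SQLException e) {
            throw new RuntimeException(e);
          }
        }
      }

      public static JdbcRDD<String> build(SparkContext sc) {
        ClassTag<String> tag = ClassTag$.MODULE$.apply(String.class);
        // The query needs exactly two '?' placeholders, which JdbcRDD
        // binds to the per-partition lower/upper bounds.
        return new JdbcRDD<String>(
            sc,                     // the Scala SparkContext (jsc.sc() from Java)
            new ConnectionFactory(),
            "SELECT name FROM people WHERE id >= ? AND id <= ?",
            1L, 1000L,              // overall key range
            3,                      // number of partitions
            new RowMapper(),
            tag);                   // the implicit ClassTag, passed explicitly
      }
    }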
I looked at the code and noticed that JdbcRDD doesn't add much value: it 
essentially just implements compute() and getPartitions(). I figured I could 
do that myself with better-looking JDBC code, so I created a class inheriting 
from RDD, which ended up heavily decorated with Scala machinery I had never 
seen before. Then I remembered that from Java I have to use JavaRDD. Of 
course, that class doesn't have those methods to override. 
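
To make that concrete, here is the shape the hand-rolled version takes, as 
far as I can tell. Again only a sketch, assuming the Spark 1.x RDD and 
JavaRDD APIs, with a single hardcoded partition and dummy rows standing in 
for the real JDBC fetch: 

    import java.util.Arrays;

    import org.apache.spark.Dependency;
    import org.apache.spark.Partition;
    import org.apache.spark.SparkContext;
    import org.apache.spark.TaskContext;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.rdd.RDD;

    import scala.collection.Iterator;
    import scala.collection.JavaConverters;
    import scala.collection.Seq;
    import scala.reflect.ClassTag;
    import scala.reflect.ClassTag$;

    public class HandRolledRDD extends RDD<String> {

      private static final ClassTag<String> STRING_TAG =
          ClassTag$.MODULE$.apply(String.class);

      public HandRolledRDD(SparkContext sc) {
        // No parent RDDs, so the dependency Seq is empty; the ClassTag is
        // the implicit parameter Scala normally fills in for you.
        super(sc,
              (Seq<Dependency<?>>) (Seq<?>) scala.collection.immutable.Nil$.MODULE$,
              STRING_TAG);
      }

      // One fixed partition; real code would compute key ranges here.
      private static final class OnePartition implements Partition {
        @Override
        public int index() { return 0; }
      }

      @Override
      public Partition[] getPartitions() {
        return new Partition[] { new OnePartition() };
      }

      @Override
      public Iterator<String> compute(Partition split, TaskContext context) {
        // Dummy rows standing in for the real JDBC fetch; the contract
        // wants a Scala iterator, hence the converter.
        return JavaConverters.asScalaIteratorConverter(
            Arrays.asList("row1", "row2").iterator()).asScala();
      }

      // JavaRDD has no compute/getPartitions to override; it is only a
      // thin wrapper, so the custom RDD gets wrapped after the fact.
      public static JavaRDD<String> asJavaRDD(SparkContext sc) {
        return JavaRDD.fromRDD(new HandRolledRDD(sc), STRING_TAG);
      }
    }

If I understand correctly, JavaRDD is only a thin wrapper, so the overriding 
has to happen on the Scala-side RDD, which you then wrap with 
JavaRDD.fromRDD; but the amount of Scala machinery involved is exactly my 
point. 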
From where I'm standing right now, it really appears that Spark doesn't fully 
support Java, and that if you want to use it seriously you need to learn 
Scala. Is this a correct assessment? 
