Hi,

I'd like to use a JavaRDD containing parameters for an SQL query, and use
the Spark SQL JDBC data source to load data from MySQL.

Consider the following pseudo code:

JavaRDD<String> namesRdd = ... ;
...
Map<String, String> options = new HashMap<>();
options.put("url", "jdbc:mysql://mysql?user=usr");
options.put("password", "pass");
options.put("dbtable",
    "(SELECT * FROM mytable WHERE userName = ?) sp_campaigns");
DataFrame myTableDF = m_sqlContext.load("jdbc", options);


I'm looking for a way to map over namesRdd and get, for each name, the
result of the corresponding query, without losing the Spark context.

Using a mapping function doesn't seem like an option, because I don't have
the SQLContext inside it.
The only alternative I can think of is calling collect, then iterating over
the strings in the RDD and executing the query for each one, but that would
run in the driver program.
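For reference, here is a rough sketch of that driver-side fallback. The dbtableFor helper, the hard-coded sample names, and the naive quote escaping are all made up for illustration; the actual Spark load call is left as a comment since it needs a live MySQL instance:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class PerNameQuery {
    // Builds the "dbtable" subquery for one name. Table and column names
    // are taken from the pseudo code above; the quote escaping here is
    // naive and only a stand-in for real parameter handling.
    static String dbtableFor(String name) {
        String escaped = name.replace("'", "''");
        return "(SELECT * FROM mytable WHERE userName = '" + escaped
                + "') sp_campaigns";
    }

    public static void main(String[] args) {
        // In the real job these would come from namesRdd.collect(),
        // pulling the names back to the driver.
        List<String> names = List.of("alice", "o'brien");
        for (String name : names) {
            Map<String, String> options = new HashMap<>();
            options.put("url", "jdbc:mysql://mysql?user=usr");
            options.put("password", "pass");
            options.put("dbtable", dbtableFor(name));
            // DataFrame df = m_sqlContext.load("jdbc", options);
            // ^ one JDBC load per name, all driven from the driver loop
            System.out.println(options.get("dbtable"));
        }
    }
}
```

This is exactly the shape I'd like to avoid, since every query is issued sequentially from the driver instead of in parallel on the executors.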

Any suggestions?

Thanks,
Lior
