Re: JdbcRDD and ClassTag issue

2015-07-20 Thread nitinkalra2000
Thanks Sujee :)






Re: JdbcRDD and ClassTag issue

2015-01-01 Thread Sujee
Hi,
I encountered the same issue and solved it. Please check my blog post:
http://www.sparkexpert.com/2015/01/02/load-database-data-into-spark-using-jdbcrdd-in-java/
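
In short, the idea is to give JdbcRDD a ClassTag that matches the RDD's element type and to make the Scala function wrappers serializable so Spark can ship them to the executors. A rough sketch of that shape is below (the connection string, credentials, and table are placeholders; see the post for the complete, tested version):

import java.io.Serializable;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.rdd.JdbcRDD;

import scala.reflect.ClassTag;
import scala.reflect.ClassTag$;
import scala.runtime.AbstractFunction0;
import scala.runtime.AbstractFunction1;

public class JdbcRddSketch {

    // Connection factory: must be Serializable because Spark ships it to executors.
    static class ConnectionFactory extends AbstractFunction0<Connection> implements Serializable {
        @Override
        public Connection apply() {
            try {
                // Placeholder connection string and credentials.
                return DriverManager.getConnection(
                        "jdbc:sqlserver://localhost;databaseName=mydb", "user", "password");
            } catch (SQLException e) {
                throw new RuntimeException(e);
            }
        }
    }

    // Row mapper: also Serializable; maps each ResultSet row to an Integer.
    static class RowMapper extends AbstractFunction1<ResultSet, Integer> implements Serializable {
        @Override
        public Integer apply(ResultSet rs) {
            try {
                return rs.getInt(1);
            } catch (SQLException e) {
                throw new RuntimeException(e);
            }
        }
    }

    public static void main(String[] args) {
        JavaSparkContext jsc = new JavaSparkContext("local[2]", "jdbcrdd-example");

        // Build the ClassTag for the RDD's element type (Integer), not Object.
        ClassTag<Integer> tag = ClassTag$.MODULE$.apply(Integer.class);

        JdbcRDD<Integer> rdd = new JdbcRDD<Integer>(
                jsc.sc(),                 // the underlying Scala SparkContext
                new ConnectionFactory(),
                "Select * from Table_1 where values >= ? and values <= ?",
                0L, 120L, 2,              // lower bound, upper bound, number of partitions
                new RowMapper(),
                tag);

        System.out.println(rdd.count());
        jsc.stop();
    }
}

Note that the two ? placeholders are bound to the lower and upper bound values: JdbcRDD splits the [lowerBound, upperBound] range into numPartitions sub-ranges and runs the query once per partition.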

Thank you






JdbcRDD and ClassTag issue

2014-11-11 Thread nitinkalra2000
Hi All,

I am trying to access SQL Server through JdbcRDD, but I am getting an error on the
ClassTag placeholder.

Here is the code I wrote:


public void readFromDB() {
    String sql = "Select * from Table_1 where values >= ? and values <= ?";

    class GetJDBCResult extends AbstractFunction1<ResultSet, Integer> {
        public Integer apply(ResultSet rs) {
            Integer result = null;
            try {
                result = rs.getInt(1);
            } catch (SQLException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
            return result;
        }
    }

    // The error points at this last ClassTag placeholder argument.
    JdbcRDD<Integer> jdbcRdd = new JdbcRDD<Integer>(sc, jdbcInitialization(),
            sql, 0, 120, 2, new GetJDBCResult(),
            scala.reflect.ClassTag$.MODULE$.apply(Object.class));
}

Can anybody here recommend a solution to this?

Thanks
Nitin



