In my case I want to reach HBase. For every record with a userId I want to
fetch some extra information about the user and add it to the result record
for further processing
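The usual Spark answer to the setup()/cleanup() question is mapPartitions: the function you pass runs once per partition, so a connection can be opened before iterating the records and closed after. Below is a minimal, cluster-free sketch of that pattern; FakeUserDb is a hypothetical stand-in for the real HBase/JDBC client, and enrich() plays the role of the function you would hand to JavaRDD.mapPartitions:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

// Illustration of the mapPartitions pattern: open one connection per
// partition, enrich every record with it, then close it once.
public class EnrichPartition {

    // Hypothetical stand-in for an HBase/JDBC client; counts opens.
    static class FakeUserDb {
        static int opens = 0;
        FakeUserDb() { opens++; }                      // plays "setup()"
        String lookup(String userId) { return "info-" + userId; }
        void close() { }                               // plays "cleanup()"
    }

    // Body of what would be passed to JavaRDD.mapPartitions: it is
    // invoked once per partition, not once per record.
    static List<String> enrich(Iterator<String> partition) {
        FakeUserDb db = new FakeUserDb();              // opened once per partition
        List<String> out = new ArrayList<>();
        while (partition.hasNext()) {
            String userId = partition.next();
            out.add(userId + ":" + db.lookup(userId)); // per-record lookup
        }
        db.close();                                    // closed once per partition
        return out;
    }

    public static void main(String[] args) {
        List<String> result = enrich(Arrays.asList("u1", "u2").iterator());
        System.out.println(result);          // [u1:info-u1, u2:info-u2]
        System.out.println(FakeUserDb.opens); // 1 -- a single open for two records
    }
}
```

With the real API the same shape applies: create the HBase connection at the top of the mapPartitions function, look up each record inside the loop, and close the connection before returning the iterator.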


On Thu, Jul 24, 2014 at 9:11 AM, Yanbo Liang <yanboha...@gmail.com> wrote:

> If you want to connect to a DB from your program, you can use JdbcRDD (
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/rdd/JdbcRDD.scala
> )
>
>
> 2014-07-24 18:32 GMT+08:00 Yosi Botzer <yosi.bot...@gmail.com>:
>
> Hi,
>>
>> I am using the Java api of Spark.
>>
>> I wanted to know if there is a way to run code analogous to the setup()
>> and cleanup() methods of Hadoop Map/Reduce.
>>
>> The reason I need it is that I want to read something from the DB for
>> each record I process in my Function, and I would like to open the DB
>> connection only once (and close it only once).
>>
>> Thanks
>>
>
>
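For reference, JdbcRDD takes a lower bound, an upper bound and a partition count, and splits that numeric key range into one sub-range per partition so each partition runs its own bounded query. The sketch below mirrors that splitting logic; it is an illustration of the idea, not the exact Spark source:

```java
// Sketch of how JdbcRDD divides a numeric key range [lower, upper]
// into numPartitions non-overlapping sub-ranges, one query per partition.
public class RangeSplit {

    // Returns {start, end} bounds (inclusive) for partition i.
    static long[] bounds(long lower, long upper, int numPartitions, int i) {
        long length = 1 + upper - lower;
        long start = lower + (i * length) / numPartitions;
        long end = lower + ((i + 1) * length) / numPartitions - 1;
        return new long[]{start, end};
    }

    public static void main(String[] args) {
        // Splitting ids 1..100 over 4 partitions: 1-25, 26-50, 51-75, 76-100.
        for (int i = 0; i < 4; i++) {
            long[] b = bounds(1, 100, 4, i);
            System.out.println("SELECT ... WHERE id >= " + b[0]
                               + " AND id <= " + b[1]);
        }
    }
}
```

This is also why JdbcRDD's SQL string must contain two `?` placeholders: Spark binds each partition's start and end bounds into them.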
