Hi Amal,

If you have a lot of data to import in batch, you might want to check this
out: http://predictionio.incubator.apache.org/datacollection/batchimport/
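
The import file is newline-delimited JSON, one event per line (wrapped here
for readability), which "pio import" loads directly into the event store.
Roughly like this, where the app ID and file name are placeholders:

  {"event": "rate", "entityType": "user", "entityId": "u1",
   "targetEntityType": "item", "targetEntityId": "i1",
   "properties": {"rating": 5}, "eventTime": "2016-10-20T04:56:00.000Z"}

  $ pio import --appid <your_app_id> --input events.json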

The Python script goes through the Event Server, which uses the Storage
API. The HBase driver code is here:
https://github.com/apache/incubator-predictionio/blob/develop/data/src/main/scala/org/apache/predictionio/data/storage/hbase/HBPEvents.scala
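
Under the hood that driver writes each event with ordinary HBase Put
operations through the standard client API. A simplified sketch of that
pattern, not the actual HBPEvents code (the table name, column family, and
row key below are placeholders; PIO derives its own from the app/channel and
event fields):

  import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
  import org.apache.hadoop.hbase.client.{ConnectionFactory, Put}
  import org.apache.hadoop.hbase.util.Bytes

  val conf = HBaseConfiguration.create()
  val connection = ConnectionFactory.createConnection(conf)
  // Placeholder table name; PIO derives the real one from the app/channel.
  val table = connection.getTable(TableName.valueOf("pio_event:events_1"))

  // Placeholder row key; PIO builds its own key from the event fields.
  val put = new Put(Bytes.toBytes(java.util.UUID.randomUUID().toString))
  val family = Bytes.toBytes("e")
  put.addColumn(family, Bytes.toBytes("event"), Bytes.toBytes("rate"))
  put.addColumn(family, Bytes.toBytes("entityType"), Bytes.toBytes("user"))
  put.addColumn(family, Bytes.toBytes("entityId"), Bytes.toBytes("u1"))
  put.addColumn(family, Bytes.toBytes("properties"),
    Bytes.toBytes("""{"rating": 5}"""))
  table.put(put)

  table.close()
  connection.close()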

Regards,
Donald

On Sun, Oct 23, 2016 at 11:28 PM, amal kumar <amal.kmr.si...@gmail.com>
wrote:

>
> Hi,
>
> I am using HBase as the data store for PIO and am using the command below
> to import the sample data into the HBase event table.
>
> python data/import_eventserver.py --access_key $ACCESS_KEY
>
> I need to set up a Spark job to load data into the HBase event table, and
> hence I want to understand how PIO inserts the data into the HBase table
> (e.g. via add()/put()).
>
> Please suggest where I can see the code PIO uses for this.
>
>
> Thanks,
> Amal
>
>
