Re: reading Hbase table in Spark

2016-12-08 Thread Asher
Hi Mich, can you describe in detail how you used Phoenix to read and write HBase tables
in Spark for RDD processing?
Thanks



--
View this message in context: 
http://apache-hbase.679495.n3.nabble.com/reading-Hbase-table-in-Spark-tp4083260p4084996.html
Sent from the HBase User mailing list archive at Nabble.com.


Re: Cannot connect to Hbase via Java API

2014-12-18 Thread Asher Devuyst
Are you able to put data to the table, scan data from the table, and count
the rows, all using the shell?
You mentioned you could start the shell, but not that you had done any
operations with it.  I would start there before using the API.
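A quick smoke test along those lines might look like the following HBase shell session; the table name 'mytable', the column family 'cf', and the values are placeholders to adapt to your own schema (this obviously needs a running cluster):

```shell
# Start the shell, then try one write, one read, a scan, and a count
hbase shell
put 'mytable', 'row1', 'cf:col1', 'value1'
get 'mytable', 'row1'
scan 'mytable'
count 'mytable'
```

If any of these fail, the problem is in the cluster or its configuration rather than in your Java client code.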
On Dec 18, 2014 2:08 AM, "Marco"  wrote:

> Ok, btw I've used "HDP 2.2 on Sandbox"
>
> 2014-12-18 4:19 GMT+01:00 Wilm Schumacher :
> > On 17.12.2014 at 15:27, Marco wrote:
> >>> Tonight I will take a closer look at the sandbox (never
> >>> used it before). Perhaps I'll find something. But there are some GB to
> >>> download ;).
> >> That would be cool :) Thx.
> > *mumble curses*
> >
> > I cannot import the Hortonworks VM. VM import says the image is corrupt.
> > But it could be that the download was interrupted; the file is too
> > small. I'll re-download it and try again tomorrow. FYI.
> >
> > Wilm
>
>
>
> --
> Best regards,
> Marco
>


Re: is there any way to copy data from one table to another while updating rowKey??

2013-11-15 Thread Asher
T Vinod Gupta writes:

> 
> I am badly stuck and can't find a way out. I want to change my row key
> schema while copying data from one table to another, but a MapReduce job
> to do this won't work because of large row sizes (responseTooLarge errors).
> So I am left with a two-step process: exporting to HDFS files and
> importing from them into the second table. So I wrote a custom exporter
> that changes the row key to newRowKey when doing context.write(newRowKey,
> result). But when I import these new files into the new table, it doesn't
> work due to this exception in Put: "The row in the recently added ...
> doesn't match the original one".
> 
> Is there no way out for me? Please help.
> 
> Thanks
> 

I know this is old, but here is a solution:

You need to pass the new key to the Put constructor and also rewrite the
key values with the new key.  Here is a helper method I use to do this:

public static Put resultToPut(byte[] newKey, Result result) throws IOException {
    Put put = new Put(newKey);
    for (KeyValue kv : result.raw()) {
        // Rebuild each cell under the new row key, keeping the original
        // family, qualifier, and value. Note that this KeyValue constructor
        // does not carry over the original timestamp.
        KeyValue kv2 = new KeyValue(newKey, kv.getFamily(),
                kv.getQualifier(), kv.getValue());
        put.add(kv2);
    }
    return put;
}
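For context, a sketch of how such a helper might be driven outside of MapReduce, for smaller tables. The class name, the deriveNewKey method, and the table handles here are hypothetical placeholders, and this assumes the same pre-1.0 HBase client API (HTable, KeyValue) as the helper above; it needs a live cluster and the HBase client library, so it is not runnable standalone:

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.ResultScanner;
import org.apache.hadoop.hbase.client.Scan;

public class RekeyCopy {
    // Scan every row of the source table, derive the new row key, and write
    // the rebuilt Put into the destination table. deriveNewKey is where your
    // row-key schema change would go; resultToPut is the helper above.
    static void copyWithNewKeys(HTable source, HTable dest) throws IOException {
        ResultScanner scanner = source.getScanner(new Scan());
        try {
            for (Result result : scanner) {
                byte[] newKey = deriveNewKey(result.getRow());
                dest.put(resultToPut(newKey, result));
            }
        } finally {
            scanner.close();
        }
    }
}
```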

--Asher