Bulk loading through HFiles

2015-06-05 Thread Dawid
tions. The classes can be accessed at: https://github.com/dawidwys/gate/blob/master/src/main/scala/pl/edu/pw/elka/phoenix/BulkPhoenixLoader.scala and https://github.com/dawidwys/gate/blob/master/src/main/scala/pl/edu/pw/elka/phoenix/ExtendedProductRDDFunctions.scala Thanks in advance, Dawid Wysakowicz
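The linked classes themselves are not reproduced in this archive, so below is only a rough, hedged sketch of the usual Spark-to-HBase HFile bulk-load flow they appear to wrap, assuming HBase 1.x client APIs. The object name, table name, staging path and input data are all made up for illustration, and Phoenix's own value encoding (and its empty key value column) is not handled here:

    import org.apache.hadoop.fs.Path
    import org.apache.hadoop.hbase.{HBaseConfiguration, KeyValue, TableName}
    import org.apache.hadoop.hbase.client.ConnectionFactory
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable
    import org.apache.hadoop.hbase.mapreduce.{HFileOutputFormat2, LoadIncrementalHFiles}
    import org.apache.hadoop.hbase.util.Bytes
    import org.apache.spark.{SparkConf, SparkContext}

    object HFileBulkLoadSketch {
      def main(args: Array[String]): Unit = {
        val sc    = new SparkContext(new SparkConf().setAppName("hfile-bulk-load-sketch"))
        val conf  = HBaseConfiguration.create()
        val table = TableName.valueOf("MY_TABLE")   // hypothetical target table
        val out   = "/tmp/hfiles-out"               // hypothetical HFile staging dir

        // Hypothetical input: (rowKey, qualifier, value) triples.
        val cells = sc.parallelize(Seq(("row1", "C1", "v1"), ("row2", "C1", "v2")))

        // HFiles must be written in strictly increasing key order, so sort before
        // writing; emitting unsorted cells produces "Added a key not lexically larger...".
        val kvs = cells
          .sortBy { case (row, qual, _) => (row, qual) }
          .map { case (row, qual, value) =>
            // "0" is Phoenix's default column family; adjust for a plain HBase table.
            val kv = new KeyValue(Bytes.toBytes(row), Bytes.toBytes("0"),
              Bytes.toBytes(qual), Bytes.toBytes(value))
            (new ImmutableBytesWritable(Bytes.toBytes(row)), kv)
          }

        // Write the HFiles...
        kvs.saveAsNewAPIHadoopFile(out, classOf[ImmutableBytesWritable], classOf[KeyValue],
          classOf[HFileOutputFormat2], conf)

        // ...then hand them over to the region servers.
        val connection = ConnectionFactory.createConnection(conf)
        new LoadIncrementalHFiles(conf).doBulkLoad(new Path(out), connection.getAdmin,
          connection.getTable(table), connection.getRegionLocator(table))
        connection.close()
      }
    }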

Re: Bulk loading through HFiles

2015-06-05 Thread Dawid
Yes, I can see it in hbase-shell. Sorry for the bad links, I haven't used private repositories on GitHub. So I moved the files to a gist: https://gist.github.com/dawidwys/3aba8ba618140756da7c Hope this time it will work. On 05.06.2015 23:09, Ravi Kiran wrote: Hi Dawid, Do you see the

Re: Bulk loading through HFiles

2015-06-08 Thread Dawid
Any suggestions? Any clues on what to check? On 05.06.2015 23:21, Dawid wrote: Yes, I can see it in hbase-shell. Sorry for the bad links, I haven't used private repositories on GitHub. So I moved the files to a gist: https://gist.github.com/dawidwys/3aba8ba618140756da7c Hope this time it

Re: Bulk loading through HFiles

2015-06-08 Thread Dawid
Yes, I did. I also tried to execute some upserts using sqlline after importing the HFiles; the rows from the upserts are visible both in sqlline and hbase shell, but the rows imported from HFiles are visible only in hbase shell. On 08.06.2015 19:06, James Taylor wrote: Dawid, Perhaps a dumb question, but

Re: Bulk loading through HFiles

2015-06-10 Thread Dawid
TABLE' only the upserted one disappears. The rows loaded from HFiles still persist in HBase. Yiannis, how do you generate the HFiles? You can see my code here: https://gist.github.com/dawidwys/3aba8ba618140756da7c On 10.06.2015 17:57, Yiannis Gkoufas wrote: Hi Dawid, I am trying to do the same thing

Re: Bulk loading through HFiles

2015-06-10 Thread Dawid
his is the case, you can try specifying the CURRENT_SCN property at connection time with a timestamp later than the timestamp of the rows/cells to verify. Thanks, James On Wed, Jun 10, 2015 at 10:14 AM, Dawid wrote: Yes, that's right, I have generated HFiles that I managed to load so to
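For reference, the CURRENT_SCN suggestion above corresponds to Phoenix's "CurrentSCN" JDBC connection property. A minimal sketch of trying it, assuming the standard Phoenix JDBC driver, a hypothetical MY_TABLE, and a ZooKeeper quorum on localhost:

    import java.sql.DriverManager
    import java.util.Properties

    object CurrentScnCheck {
      def main(args: Array[String]): Unit = {
        val props = new Properties()
        // Phoenix reads as of this timestamp; pick one later than the
        // timestamps of the cells that were bulk loaded from the HFiles.
        props.setProperty("CurrentSCN", String.valueOf(System.currentTimeMillis()))

        val conn = DriverManager.getConnection("jdbc:phoenix:localhost:2181", props)
        val rs = conn.createStatement().executeQuery("SELECT COUNT(*) FROM MY_TABLE")
        while (rs.next()) println(s"rows visible at this SCN: ${rs.getLong(1)}")
        conn.close()
      }
    }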

Re: Bulk loading through HFiles

2015-06-11 Thread Dawid
I need the full stack trace. On 11.06.2015 00:53, Yiannis Gkoufas wrote: Hi Dawid, yes I have been using your code. Probably I am invoking the classes in a wrong way. val data = readings.map(e => e.split(",")).map(e => (e(0), e(1).toLong, e(2).toDouble, e(3).toDouble)) val tabl

Problem with arrays in phoenix-spark

2015-11-30 Thread Dawid Wysakowicz
ion. The tricky part is with Array[Byte], as this would be the same for both VARBINARY and TINYINT[]. Let me know if I should create an issue for this, and if my solution satisfies you. Regards, Dawid Wysakowicz From 5d24874cd0b2d15618843ada221634fa2a371d35 Mon Sep 17 00:00:00 2001 From:
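For readers landing on this thread, a hedged sketch of what saving an array column through phoenix-spark's saveToPhoenix looks like; the ARRAY_TEST table, the sample data and the ZooKeeper URL are hypothetical, and Array[Byte] remains ambiguous between VARBINARY and TINYINT[] as noted above:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.phoenix.spark._   // adds saveToPhoenix to RDDs of Products

    object ArraySaveSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("phoenix-array-save"))

        // Hypothetical table:
        //   CREATE TABLE ARRAY_TEST (ID BIGINT NOT NULL PRIMARY KEY, VALS DOUBLE ARRAY)
        val data = sc.parallelize(Seq(
          (1L, Array(1.0, 2.0, 3.0)),
          (2L, Array(4.0, 5.0))
        ))

        // Each tuple element maps positionally to a column in the list below.
        data.saveToPhoenix("ARRAY_TEST", Seq("ID", "VALS"), zkUrl = Some("localhost:2181"))
      }
    }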

Re: Problem with arrays in phoenix-spark

2015-12-01 Thread Dawid Wysakowicz
Sure, I have done that. https://issues.apache.org/jira/browse/PHOENIX-2469 2015-11-30 22:22 GMT+01:00 Josh Mahonin : > Hi David, > > Thanks for the bug report and the proposed patch. Please file a JIRA and > we'll take the discussion there. > > Josh > > On Mon, N

Re: Bulk loading through HFiles

2015-06-16 Thread Dawid Wysakowicz
didn't realize that I only sent to Dawid. > Resending to the entire list in case someone else has encountered this > error before: > > 15/06/10 23:45:16 WARN TaskSetManager: Lost task 34.48 in stage 0.0 (TID > 816, iriclusnd20): java.io.IOException: Added a key not lexically
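For context on the quoted exception (independent of whatever was wrong in this particular job): "lexically larger" means HBase's unsigned byte-wise ordering of the full cell key, and every cell handed to an HFile writer must compare strictly greater than the previous one. A tiny illustration using HBase's Bytes utility:

    import org.apache.hadoop.hbase.util.Bytes

    object LexicalOrderDemo extends App {
      // Unsigned byte-wise comparison, not numeric: "row10" sorts BEFORE "row2".
      val cmp = Bytes.compareTo(Bytes.toBytes("row10"), Bytes.toBytes("row2"))
      println(s"compareTo(row10, row2) = $cmp") // negative => row10 < row2

      // An HFile writer fed cells out of (or in duplicate) key order throws
      // java.io.IOException: Added a key not lexically larger than previous.
    }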