Re: Odd cell result

2018-06-11 Thread Kang Minwoo
Thank you for suggesting a good approach.
However, HBaseContext does not seem to exist in version 1.2.6.
Is it available in version 1.2.6?

Another problem is that I cannot simply switch to HBaseContext, because I am
using a CustomTableInputFormat which extends TableInputFormat.
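For context, a TableInputFormat subclass can be handed to Spark directly through
newAPIHadoopRDD, without HBaseContext; a minimal sketch, assuming a
CustomTableInputFormat class and table name like the ones mentioned in this
thread (both hypothetical here):

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf())

// Standard TableInputFormat configuration; "my_table" is a placeholder.
val conf = HBaseConfiguration.create()
conf.set(TableInputFormat.INPUT_TABLE, "my_table")

// CustomTableInputFormat extends TableInputFormat, so the key/value
// classes are the same as for the stock format.
val rdd = sc.newAPIHadoopRDD(conf,
  classOf[CustomTableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])
rdd.count()
```

This uses only the plain Spark/Hadoop integration, so it should work on HBase
1.2.6 even though the hbase-spark module is unavailable there.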

Best regards,
Minwoo Kang


From: Juan Jose Escobar 
Sent: Saturday, June 9, 2018 18:36
To: user@hbase.apache.org
Subject: Re: Odd cell result

Hello,

Are you trying to read exported files or something similar? Otherwise, I think
you need to indicate the format of the data you are reading. I think what you
want to do is easier like this:

import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.Scan
import org.apache.hadoop.hbase.spark.HBaseContext
import org.apache.spark.{SparkConf, SparkContext}

val sparkConf = new SparkConf()
val sc = new SparkContext(sparkConf)

val conf = HBaseConfiguration.create()
val hbaseContext = new HBaseContext(sc, conf)
val scan = new Scan()
// ... scan config
val rdd = hbaseContext.hbaseRDD(TableName.valueOf(tableName), scan)
rdd.count()

or use a Spark-HBase connector, which encapsulates the details.

Regards


On Sat, Jun 9, 2018 at 8:48 AM, Kang Minwoo  wrote:

> 1) I am using just an InputFormat. (I am not sure this is the right answer
> to the question.)
>
> 2) code snippet
>
> ```
> val rdd = sc.newAPIHadoopFile(...)
> rdd.count()
> ```
>
> 3) hbase version 1.2.6
>
> Best regards,
> Minwoo Kang
>
> 
> From: Ted Yu 
> Sent: Friday, June 8, 2018 20:01
> To: hbase-user
> Subject: Re: Odd cell result
>
> Which connector do you use with Spark 2.1.2?
>
> Is there a code snippet which may reproduce what you experienced?
>
> Which HBase release are you using?
>
> Thanks
>
> On Fri, Jun 8, 2018 at 1:50 AM, Kang Minwoo 
> wrote:
>
> > Hello, Users
> >
> > I recently ran into an unusual situation: a cell result that does not
> > contain the column family.
> >
> > I thought a cell was the smallest unit in which data could be transferred
> > in HBase, but a cell without a column family would mean the cell is not
> > the smallest unit. Am I wrong?
> >
> > It occurred in Spark 2.1.2 and did not occur in MapReduce.
> > And it has not reappeared since.
> >
> > Best regards,
> > Minwoo Kang
> >
>
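On the question above of whether a cell carries its column family: in the
hbase-client API the family bytes are part of every Cell and can be read back
with the CellUtil helpers. A minimal sketch against the 1.2.x client API
(dumpFamilies is a hypothetical helper name):

```scala
import org.apache.hadoop.hbase.CellUtil
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.util.Bytes

// For each cell in a scan Result, the family and qualifier are embedded
// in the cell itself and copied out via CellUtil.
def dumpFamilies(result: Result): Unit =
  result.rawCells().foreach { cell =>
    val family    = Bytes.toString(CellUtil.cloneFamily(cell))
    val qualifier = Bytes.toString(CellUtil.cloneQualifier(cell))
    println(s"family=$family qualifier=$qualifier")
  }
```

If a cell really comes back with empty family bytes, that would point to a
problem in how the rows were read (e.g. the InputFormat), not to the cell
model itself.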


Re: Odd cell result

2018-06-11 Thread Ted Yu
bq. is it available in version 1.2.6?

If you were talking about
hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala,
it is not in 1.2.6, since the hbase-spark module was not part of the 1.2.6
release.

FYI

On Mon, Jun 11, 2018 at 2:08 AM, Kang Minwoo 
wrote:

> Thank you for suggesting a good approach.
> However, HBaseContext does not seem to exist in version 1.2.6.
> Is it available in version 1.2.6?
>
> Another problem is that I cannot simply switch to HBaseContext, because I am
> using a CustomTableInputFormat which extends TableInputFormat.
>
> Best regards,
> Minwoo Kang


[ANNOUNCE] Apache Phoenix 4.14 released

2018-06-11 Thread James Taylor
The Apache Phoenix team is pleased to announce the immediate availability
of the 4.14.0 release. Apache Phoenix enables SQL-based OLTP and
operational analytics for Apache Hadoop, using Apache HBase as its backing
store and providing integration with other projects in the Apache ecosystem
such as Spark, Hive, Pig, Flume, and MapReduce.

Highlights of the release include:

* Support for HBase 1.4
* Support for CDH 5.11.2, 5.12.2, 5.13.2, and 5.14.2
* Support for GRANT and REVOKE commands
* Secondary index improvements

For more details, visit our blog here [1] and download source and binaries
here [2].

Thanks,
James (on behalf of the Apache Phoenix team)

[1] https://blogs.apache.org/phoenix/entry/announcing-phoenix-4-14-released
[2] http://phoenix.apache.org/download.html


HBaseConAsia 2018 CFP still open

2018-06-11 Thread Yu Li
Hi All,

HBaseConAsia2018[1] will be held on Aug. 17th in Beijing, China, and the
call for proposals is still open[2]. So far we have received proposals from
many big companies such as Alibaba, Xiaomi, Huawei, DiDi, Intel, and
Pinterest, and we are expecting more!


This year we not only have tracks on HBase internals (dev & ops), but also
on the HBase ecosystem, solutions, and applications. We encourage all users
building solutions or applications on top of HBase to contribute a talk[3]
and share their experience. We are also preparing live broadcasting, so
presentations could reach a far larger audience.


Currently only abstracts are required for the program committee to review,
and if accepted there should be enough time to prepare the presentation. We
hope abstracts/slides will be written in English, but this is not required,
so please don't let language block you.


For more details, please refer to our official website[1]. Thanks, and
please start submitting your talks!


- Yu (on behalf of the HBase PMC)

[1] https://hbase.apache.org/hbaseconasia-2018/
[2] https://easychair.org/cfp/hbaseconasia-2018
[3] https://easychair.org/conferences/?conf=hbaseconasia2018