Please note:
The name of the HBase table is specified in the catalog JSON:

  def writeCatalog = s"""{
                    |"table":{"namespace":"default", "name":"table1"},

not by:

  HBaseTableCatalog.newTable -> "5"

which only sets the number of regions used when a new table has to be created.

FYI
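
For reference, a minimal end-to-end sketch. This is not code from the thread: the package names, the format string "org.apache.hadoop.hbase.spark", and the rowkey/column mappings are my assumptions modeled on the master-branch DefaultSourceSuite mentioned below, so treat it as a sketch rather than a tested recipe.

```scala
// Sketch only: package names, the format string, and the rowkey/column
// mappings below are assumptions modeled on DefaultSourceSuite.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

val sc = new SparkContext(
  new SparkConf().setAppName("hbase-write-sketch").setMaster("local[2]"))
val sqlContext = new SQLContext(sc)

// The catalog JSON is where the HBase table name lives ("table1" below).
val writeCatalog = s"""{
                   |"table":{"namespace":"default", "name":"table1"},
                   |"rowkey":"key",
                   |"columns":{
                   |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
                   |"col1":{"cf":"cf1", "col":"col1", "type":"string"}
                   |}
                   |}""".stripMargin

// Any DataFrame whose column names match the catalog's mappings.
val df = sqlContext.createDataFrame(Seq(("row1", "v1"), ("row2", "v2")))
  .toDF("col0", "col1")

df.write
  .options(Map(
    HBaseTableCatalog.tableCatalog -> writeCatalog,
    // Number of regions to use if the table has to be created -- NOT its name.
    HBaseTableCatalog.newTable -> "5"))
  .format("org.apache.hadoop.hbase.spark")
  .save()
```

Reading back with the same catalog via sqlContext.read.options(...).format(...).load() gives a DataFrame that can be registered with registerTempTable and then queried with plain SQL, which is what the question further down the thread is after.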

On Tue, May 10, 2016 at 3:11 PM, Ted Yu <yuzhih...@gmail.com> wrote:

> I think so.
>
> Please refer to the table population tests in (master branch):
>
> hbase-spark/src/test/scala/org/apache/hadoop/hbase/spark/DefaultSourceSuite.scala
>
> Cheers
>
> On Tue, May 10, 2016 at 2:53 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>
>> Ted,
>>
>> Will the hbase-spark module allow creating tables in Spark SQL that
>> reference the underlying HBase tables? That way, users could query using
>> just SQL.
>>
>> Thanks,
>> Ben
>>
>> On Apr 28, 2016, at 3:09 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> The HBase 2.0 release will likely come after the Spark 2.0 release.
>>
>> There are other features still being developed for HBase 2.0, so I am not
>> sure when it will be released.
>>
>> The refguide is incomplete.
>> Zhan has assigned the doc JIRA to himself; the documentation will be
>> completed after the bugs in the hbase-spark module are fixed.
>>
>> Cheers
>>
>> On Apr 27, 2016, at 10:31 PM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>
>> Hi Ted,
>>
>> Do you know when the release will be? I also see some documentation for
>> the hbase-spark module on the HBase website, but I don’t see an example
>> of how to save data, only one for reading/querying. Will a save example
>> be added when the final version is released?
>>
>> Thanks,
>> Ben
>>
>> On Apr 21, 2016, at 6:56 AM, Ted Yu <yuzhih...@gmail.com> wrote:
>>
>> The hbase-spark module in Apache HBase (coming with hbase 2.0 release)
>> can do this.
>>
>> On Thu, Apr 21, 2016 at 6:52 AM, Benjamin Kim <bbuil...@gmail.com> wrote:
>>
>>> Has anyone found an easy way to save a DataFrame into HBase?
>>>
>>> Thanks,
>>> Ben
>>>
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>>
>>>
>>
>>
>>
>
