[Moving the conversation to Apache mailing list:
[email protected]]
Ken,
Using CDH3U1 version of Hive and the latest sources from Sqoop trunk,
I was able to successfully import a table from MySQL containing a bit
field into Hive. I did this using defaults and only specifying the
--hive-import option.
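For reference, a minimal sketch of the kind of invocation described above; the JDBC URL, username, and table name (testdb, bit_table) are placeholders, not taken from this thread:

```shell
# Import a MySQL table into Hive using Sqoop defaults plus --hive-import.
# Connection details and table name below are hypothetical placeholders.
sqoop import \
  --connect jdbc:mysql://localhost/testdb \
  --username sqoop_user -P \
  --table bit_table \
  --hive-import
```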
One thing you can look at to troubleshoot this further is the hive.log
file that gets generated under the /tmp/${user}/ directory. This file
should show whether any exceptions were raised during the load. You can
also look at the directory named after the table under
/user/hive/warehouse on HDFS to see the contents of the imported data.
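The two checks above can be run from a shell roughly as follows (the table name bit_table is a placeholder, and part-m-00000 is just the typical name of the first map-task output file):

```shell
# Check the Hive client log for exceptions raised during the load:
tail -n 100 /tmp/$USER/hive.log

# Inspect the imported data under the table's warehouse directory on HDFS:
hadoop fs -ls /user/hive/warehouse/bit_table
hadoop fs -cat /user/hive/warehouse/bit_table/part-m-00000 | head
```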
Thanks,
Arvind
On Tue, Aug 16, 2011 at 1:12 PM, Ken <[email protected]> wrote:
> I have been trying to get a MySQL table to import into Hive using
> Sqoop. I have tried many variations of --fields-terminated-by,
> --lines-terminated-by, --mysql-delimiters, etc., but cannot get the bit
> fields to show up in Hive. The bit fields always end up as NULL. I
> have tried both with and without --direct. I have tried sqoop'ing it
> into HDFS and then moving it. I have tried --hive-import
> --hive-overwrite, but nothing seems to work. What am I missing?
>
> The bit fields get created in hive as boolean.
>
> Any suggestions/pointers would be most helpful.
>
> Much thanks.
>
>
> --
> NOTE: The mailing list [email protected] is deprecated in favor of
> Apache Sqoop mailing list [email protected]. Please subscribe
> to it by sending an email to [email protected].
>