What is the format of the data you are trying to export? The Hive table -
is it in text/parquet or something else?
On Fri, Jul 17, 2015 at 12:25 AM, Kumar Jayapal
wrote:
> Hi James,
>
> I tried with uppercase and it worked. Now I am getting a different error. I
> don't see the .metadata dir in the
Looks like you may get more help from Hive mailing list:
http://hive.apache.org/mailing_lists.html
FYI
On Thu, Jul 16, 2015 at 4:15 PM, Kumar Jayapal wrote:
> Hi,
>
> How can we convert files stored in snappy compressed parquet format in
> Hive to avro format.
>
> Is it possible to do it? Can
Hi,
A simple way is to create a table in Avro format and insert all the data from
the old table into the new one.
Please check the "Create Table As Select" section in the following hive
documentation.
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-
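A minimal sketch of that CTAS approach (the table and column names here are placeholders, not ones from your cluster; note that the STORED AS AVRO shorthand needs Hive 0.14 or later, older versions must spell out the Avro SerDe and input/output formats):

```sql
-- Hypothetical example: copy an existing snappy/parquet table into a new
-- Avro-backed table in one statement. "old_parquet_table" is a placeholder
-- for your existing Hive table.
CREATE TABLE my_avro_table
  STORED AS AVRO
AS SELECT * FROM old_parquet_table;
```

Hive reads the Parquet data through the old table's SerDe and writes it back out through the Avro SerDe, so no manual file conversion is needed.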
I think it's not valid. Multiplying it by ULF seems not reasonable; I think it
should be:
max(1, maxApplications * max(userlimit/100, 1/#activeUsers))
Assuming the admin sets up a very large ULF (e.g. 100), maxApplicationsPerUser
can be much more than the maxApplications of a queue.
Also, multiplying ULF to co
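As a quick sanity check, here is a sketch of the proposed formula (the class, method, and parameter names are mine for illustration, not the actual CapacityScheduler code):

```java
public class UserAppLimit {
    // Sketch of the proposed per-user cap:
    //   maxApplicationsPerUser = max(1, maxApplications * max(userLimit/100, 1/#activeUsers))
    // With a low queue-level maxApplications (e.g. 2), this stays bounded by
    // the queue's limit instead of being inflated by a large ULF.
    static int maxApplicationsPerUser(int maxApplications,
                                      float userLimitPercent,
                                      int activeUsers) {
        float userShare = Math.max(userLimitPercent / 100f, 1f / activeUsers);
        return Math.max(1, (int) (maxApplications * userShare));
    }
}
```

For example, with maxApplications = 2, a 25% user limit, and 10 active users, the cap comes out to max(1, 2 * 0.25) = 1, never exceeding the queue's own limit.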
Hi,
How can we convert files stored in snappy compressed parquet format in Hive
to avro format.
Is it possible to do it? Can you please guide me or give me the link which
describes the procedure?
Thanks
Jay
Hello,
I have built a jar file with Maven as the build tool which reads properties
from a file. I am doing this like InputStream is =
ClassLoader.getSystemResourceAsStream("hadoop.properties") on startup.
Now once the jar is built I could see that the properties get loaded
when I do java -jar
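For reference, a minimal self-contained sketch of that loading pattern (the resource name "hadoop.properties" is the one from the question; the stream parameter is factored out here only so the sketch can be exercised without a classpath resource):

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class PropsLoader {
    // Load properties from an already-opened stream; callers would normally
    // pass ClassLoader.getSystemResourceAsStream("hadoop.properties").
    static Properties load(InputStream in) throws IOException {
        if (in == null) {
            // getSystemResourceAsStream returns null (not an exception) when
            // the resource is missing, e.g. not packed into the jar.
            throw new IOException("hadoop.properties not found on classpath");
        }
        Properties props = new Properties();
        try (InputStream is = in) {
            props.load(is);
        }
        return props;
    }
}
```

The null check matters in practice: a jar built without the properties file fails silently at lookup time otherwise, since getSystemResourceAsStream reports a missing resource by returning null.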
Hi James,
I tried with uppercase and it worked. Now I am getting a different error. I
don't see the .metadata dir in the path.
15/07/16 18:49:12 ERROR sqoop.Sqoop: Got exception running Sqoop:
org.kitesdk.data.DatasetNotFoundException: Descriptor location does not
exist:
hdfs://name/user/hive/wareho
Hi Folks,
Came across one scenario wherein maxApplications at the cluster level (2 nodes)
was set to a low value like 10, and based on the capacity configuration for a
particular queue it was coming to 2 as the value, but further while calculating
maxApplicationsPerUser the formula used is:
maxApplicationsPerUs