Hi All,

I have a 30 MB SQL file containing a single CREATE TABLE statement with
about 800,000 columns, hence the size.
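
In case it helps, the file is essentially of this shape (the column names
and types here are placeholders; the real file just lists ~800k of them):

    create table my_table (
      col1 string,
      col2 string,
      ...
      col800000 string
    )
    stored as textfile
    location '<myfile>';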

I am trying to execute it using hive -f <file>. Initially, Hive ran the
command with a 256 MB heap and gave me an OOM error. I raised the heap via
export HADOOP_HEAPSIZE to 1 GB and eventually 2 GB, which made the OOM
error go away. But the hive command then ran for 5 hours without actually
creating the table, even though the JVM was still alive.
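
For reference, the invocation was essentially this (HADOOP_HEAPSIZE is
in MB):

    export HADOOP_HEAPSIZE=2048    # tried 1024 first, then 2048
    hive -f <file>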
However (exact commands shown below):
1. Running strace on the process showed it was stuck on a futex call.
2. I am using MySQL for the metastore, and no rows were added to either
the TBLS or COLUMNS table.
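
Concretely, those two checks looked roughly like this (<hive_pid> is a
placeholder for the Hive CLI process id):

    strace -p <hive_pid>               # keeps showing futex(...) and nothing else

    mysql> select count(*) from TBLS;       -- no new rows
    mysql> select count(*) from COLUMNS;    -- no new rows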

Questions:
1. Can Hive execute a CREATE TABLE with 800k columns from a 30 MB SQL file
at all?
2. If it's theoretically possible, what could be taking over 5 hours and
still not succeeding?
Running it with debug logging, it emits the following:
---snip ----

stored as textfile location '<myfile>'
12/01/16 11:28:54 INFO parse.ParseDriver: Parse Completed
12/01/16 11:28:54 INFO parse.SemanticAnalyzer: Starting Semantic Analysis
12/01/16 11:28:54 INFO parse.SemanticAnalyzer: Creating table my_table position=22
----

and it stays stuck there at the SemanticAnalyzer step...

Thanks for any insight.

Sincerely,

ameet
