Hi Amal,

 

Coming from a relational database (Oracle, Sybase) background :-) I always
expect that a DDL statement like DROP TABLE has to run in its own
transaction and cannot be combined with a DML statement.

 

Now I suspect that when you run the command DROP TABLE IF EXISTS
<TABLE_NAME>; in beeline, like below, it works

 

0: jdbc:hive2://rhes564:10010/default> drop table if exists mytest;

No rows affected (0.216 seconds)

 

That runs in its own transaction, so it works. However, I suspect that in
JAVA this is not the case. Could you possibly provide your JAVA code so we
can see exactly what it is doing?
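For illustration, a minimal sketch of issuing that same DROP through the Hive
JDBC driver from a standalone Java class would look something like the code
below. The host, port, database and credentials are the placeholders from my
own beeline example, not your environment, and this is only the pattern I
would expect, not your code:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class DropTableSketch {
    public static void main(String[] args) throws Exception {
        // Load the HiveServer2 JDBC driver.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://rhes564:10010/default", "hduser", "xxxx");
        Statement stmt = con.createStatement();
        try {
            // Each execute() call is submitted to HiveServer2 on its own,
            // so the DDL is not batched together with any DML.
            stmt.execute("DROP TABLE IF EXISTS mytest");
        } finally {
            stmt.close();
            con.close();
        }
    }
}

If your Java code builds the statement differently, for example by
concatenating several statements into one string, that would be the first
thing I would check.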

 

Thanks,

 

Mich

 

http://talebzadehmich.wordpress.com

 

Publications due shortly:

Creating in-memory Data Grid for Trading Systems with Oracle TimesTen and
Coherence Cache

 

NOTE: The information in this email is proprietary and confidential. This
message is for the designated recipient only, if you are not the intended
recipient, you should destroy it immediately. Any information in this
message shall not be understood as given or endorsed by Peridale Ltd, its
subsidiaries or their employees, unless expressly so stated. It is the
responsibility of the recipient to ensure that this email is virus free,
therefore neither Peridale Ltd, its subsidiaries nor their employees accept
any responsibility.

 

From: Amal Gupta [mailto:amal.gup...@aexp.com] 
Sent: 21 March 2015 18:16
To: user@hive.apache.org
Subject: RE: Executing HQL files from JAVA application.

 

Hi Mich, 

 

Thank you for your response. I was not aware of beeline. I have now included
it in my app and it looks like a much better solution going forward. In the
last couple of hours I have tried to work with beeline but have been facing
some issues.

 

1.       I was able to run the beeline command given below on the remote
server's command line, and it was successful.

beeline -u jdbc:hive2://<server-name>:10000/test_db org.apache.hive.jdbc.HiveDriver -n ***** -p ****** -f /hive/scripts/demoHiveTable.hql

 

2.       Running the same from the Java app results in issues. My script
contains a DROP TABLE for the demo table (DROP TABLE IF EXISTS
demoHiveTable;), but the table is not dropped when running from Java. I see
the following logs:

 

SLF4J: This version of SLF4J requires log4j version 1.2.12 or later. See
also http://www.slf4j.org/codes.html#log4j_version

Exception in thread "Thread-1" java.lang.NoSuchMethodError:
org.apache.hive.jdbc.HiveStatement.hasMoreLogs()Z

       at org.apache.hive.beeline.Commands$1.run(Commands.java:839)

       at java.lang.Thread.run(Thread.java:662)

Connecting to jdbc:hive2: ://<server-name>:10000/test_db 

Connected to: Hive (version 0.12-mapr-1401-140130)

Driver: Hive (version 0.12-mapr-1401-140130)

Transaction isolation: TRANSACTION_REPEATABLE_READ

0: jdbc:hive2://<server-name>:10000/test_db > DROP TABLE IF EXISTS
demoHiveTable;

No rows affected (0.797 seconds)

org.apache.hive.jdbc.HiveStatement.getQueryLog()Ljava/util/List;

 

Closing: 0: jdbc:hive2://<server-name>:10000/test_db

 

I tried running BeeLine with the following argument arrays (a fuller sketch
of how I am invoking it appears after point 4 below):

String[] args = {"-d", BeeLine.BEELINE_DEFAULT_JDBC_DRIVER,
                 "-u", "jdbc:hive2://<server-name>:10000/test_db",
                 "-n", "******", "-p", "******",
                 "-f", "C://Work//test_hive.hql"};

And

 

String[] args = {"beeline","-u","jdbc:hive2://
://<server-name:10000/test_db”,

 
"org.apache.hive.jdbc.HiveDriver","-n","******","-p","******","-f","C://Work
//test_hive.hql"};

 

3.       The link I referred to was from googlesource
(https://apache.googlesource.com/hive/+/a681b57609ae306c7623491549cc0176b6cdf38d/beeline/src/test/org/apache/hive/beeline/src/test/TestBeeLineWithArgs.java).

4.       I also see that there are errors being propagated back to the
application.  Is there some configuration to get detailed logs too? 
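
For reference, the in-process invocation I am attempting follows the pattern
in the TestBeeLineWithArgs test linked in point 3, roughly as sketched below.
The credentials, host and script path are placeholders, and I am assuming the
BeeLine class on my classpath exposes begin(String[], InputStream),
setOutputStream(...) and setErrorStream(...) the way that test uses them:

import org.apache.hive.beeline.BeeLine;

public class BeelineInProcessSketch {
    public static void main(String[] ignored) throws Exception {
        String[] args = {
            "-d", "org.apache.hive.jdbc.HiveDriver",
            "-u", "jdbc:hive2://<server-name>:10000/test_db",
            "-n", "******", "-p", "******",
            "-f", "C:/Work/test_hive.hql"
        };
        BeeLine beeLine = new BeeLine();
        // Redirect BeeLine's console output so the application can see it.
        beeLine.setOutputStream(System.out);
        beeLine.setErrorStream(System.err);
        // begin() returns a status code instead of calling System.exit()
        // the way BeeLine.main() does.
        int status = beeLine.begin(args, null);
        System.out.println("BeeLine returned status " + status);
    }
}

Could the NoSuchMethodError on HiveStatement.hasMoreLogs() simply mean that
the beeline jar on my classpath is newer than the hive-jdbc jar it is picking
up?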

 

Not sure if I am missing something here.  

 

Any pointers or assistance will be of great help.  

 

Regards,

Amal

 

From: Mich Talebzadeh [mailto:m...@peridale.co.uk] 
Sent: Saturday, March 21, 2015 6:59 PM
To: user@hive.apache.org
Subject: RE: Executing HQL files from JAVA application.

 

Hi Amal,

 

Do you have hiveserver2 running?

 

You can use beeline to execute the query outside of JAVA.

 

beeline -u jdbc:hive2://rhes564:10010/default org.apache.hive.jdbc.HiveDriver -n hduser -p xxxx -f ./create_index_on_t.sql

 

And the output shows there as well.

 

scan complete in 10ms

Connecting to jdbc:hive2://rhes564:10010/default

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/home/hduser/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/usr/lib/hive/lib/hive-jdbc-0.14.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

Connected to: Apache Hive (version 0.14.0)

Driver: Hive JDBC (version 0.14.0)

Transaction isolation: TRANSACTION_REPEATABLE_READ

0: jdbc:hive2://rhes564:10010/default> use asehadoop;

No rows affected (0.084 seconds)

0: jdbc:hive2://rhes564:10010/default> --drop index t_ui on t;

0: jdbc:hive2://rhes564:10010/default> create index t_ui on table t
(object_id) as 'COMPACT' WITH DEFERRED REBUILD;
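
If the same thing has to be driven from your Java application rather than a
shell, one simple option is to launch that beeline command as an external
process and read its output and exit code back in Java. A rough sketch, using
the same placeholder host, credentials and script as above:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class BeelineProcessSketch {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                "beeline",
                "-u", "jdbc:hive2://rhes564:10010/default",
                "-d", "org.apache.hive.jdbc.HiveDriver",
                "-n", "hduser", "-p", "xxxx",
                "-f", "./create_index_on_t.sql");
        pb.redirectErrorStream(true);   // merge stderr into stdout
        Process p = pb.start();
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);   // or pass each line to your logger
        }
        int exitCode = p.waitFor();     // non-zero means beeline reported a failure
        System.out.println("beeline exited with " + exitCode);
    }
}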

 

HTH

 

Mich Talebzadeh

 

http://talebzadehmich.wordpress.com

 


 

From: Amal Gupta [mailto:amal.gup...@aexp.com] 
Sent: 21 March 2015 09:05
To: user@hive.apache.org
Subject: Executing HQL files from JAVA application.

 

Hi Everyone,

 

I am trying to execute a Hive *.hql file from a Java application. I tried a
couple of ways of doing it, through the JDBC driver for Hive and through the
Spring JDBC template, but the only way that worked for me was to create a
runtime process and execute it.

 

The Java code to do that is:

 

Process p = Runtime.getRuntime().exec(new String[]{"hive", "-f", "/hive/scripts/demoHiveTable.hql"});

 

Although this works, I wanted to know if there is a better way to do it using
some driver functionality or another API for Hive. The method I use does the
execution, but failures while executing the HQL script are not visible to the
Java application.
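
For context, the driver-based attempt I had in mind was along these lines:
read the .hql file and push each statement through the Hive JDBC driver, so
that a failure comes back to the application as a SQLException rather than
being lost in the child process. A rough sketch of the idea, where the
connection details are placeholders and the split on ';' is a naive
simplification:

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HqlFileRunnerSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Read the whole script into one string.
        StringBuilder script = new StringBuilder();
        BufferedReader in = new BufferedReader(
                new FileReader("/hive/scripts/demoHiveTable.hql"));
        try {
            String line;
            while ((line = in.readLine()) != null) {
                script.append(line).append('\n');
            }
        } finally {
            in.close();
        }

        Connection con = DriverManager.getConnection(
                "jdbc:hive2://<server-name>:10000/test_db", "user", "password");
        Statement stmt = con.createStatement();
        try {
            // Naive split on ';' -- assumes no semicolons inside literals or comments.
            for (String sql : script.toString().split(";")) {
                if (sql.trim().length() > 0) {
                    stmt.execute(sql.trim());  // a failure surfaces here as a SQLException
                }
            }
        } finally {
            stmt.close();
            con.close();
        }
    }
}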

 

Any pointers or suggestions would be greatly appreciated. Thanks in advance.

 

Please let me know in case you need any other details from me. 

 

Regards,

Amal

  _____  

American Express made the following annotations 

  _____  


"This message and any attachments are solely for the intended recipient and
may contain confidential or privileged information. If you are not the
intended recipient, any disclosure, copying, use, or distribution of the
information included in this message and any attachments is prohibited. If
you have received this communication in error, please notify us by reply
e-mail and immediately and permanently delete this message and any
attachments. Thank you." 

