Hi,
Is there any example of reading/querying an ORC file, using the ORC file
input format, from a MapReduce or Spark job?
BR,
Patcharee
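A minimal sketch of reading an ORC file from a Spark job, assuming a running Spark environment with ORC support; the path below is hypothetical. This uses the Spark 2.x SparkSession entry point (in Spark 1.4+, the analogous call is sqlContext.read.orc). For MapReduce, OrcInputFormat from the hive-exec libraries plays the corresponding role.

```python
# Hedged sketch: reading and querying an ORC file from a Spark job.
# Assumes a Spark installation; the HDFS path is hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-orc").getOrCreate()

# Spark's DataFrame reader understands ORC natively.
df = spark.read.orc("hdfs:///apps/hive/warehouse/orc_merge5a")
df.printSchema()
df.show(5)

# Or register the data and query it with SQL.
df.createOrReplaceTempView("orc_data")
spark.sql("SELECT COUNT(*) FROM orc_data").show()
```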
:
'hdfs://service-test-1-0.testlocal:8020/apps/hive/warehouse/orc_merge5a/st=0.8/00_0'
to trash at:
hdfs://service-test-1-0.testlocal:8020/user/patcharee/.Trash/Current
Moved:
'hdfs://service-test-1-0.testlocal:8020/apps/hive/warehouse/orc_merge5a/st=0.8/02_0'
to trash at:
hdfs
which could be the cause of the
problem. Please let me know how to fix it.
BR,
Patcharee
On 21. april 2015 13:10, Gopal Vijayaraghavan wrote:
alter table table concatenate does not work? I have a dynamically
partitioned table (stored as ORC). I tried alter concatenate, but it
did not work. See
29075 2015-04-20 15:23
/apps/hive/warehouse/coordinate/zone=2/part-r-2
Any ideas?
BR,
Patcharee
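For a partitioned table, CONCATENATE is (as far as I know) issued per partition rather than once for the whole table. A hedged sketch, reusing the table and partition names that appear earlier in the thread:

```sql
-- Hedged sketch: merge small ORC files, one partition at a time.
ALTER TABLE orc_merge5a PARTITION (st=0.8) CONCATENATE;

-- For a non-partitioned ORC table, no PARTITION clause is needed:
ALTER TABLE some_orc_table CONCATENATE;
```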
did not.
I would appreciate any suggestions.
BR,
Patcharee
(Unknown Source)
at
org.apache.hadoop.hive.ql.metadata.Hive.alterPartition(Hive.java:469)
... 26 more
BR,
Patcharee
records
matched the condition. What can be wrong? I am using Hive 0.14
BR,
Patcharee
Actually it works on MR, so the problem is from Tez. Thanks!
BR,
Patcharee
On 30. juni 2015 10:23, Nitin Pawar wrote:
Can you try doing the same by changing the query engine from Tez to MR1?
I'm not sure if it's a Hive bug or a Tez bug.
On Tue, Jun 30, 2015 at 1:46 PM, patcharee patcharee.thong...@uni.no
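Switching engines, as Nitin suggests, is a per-session setting. A hedged sketch:

```sql
-- Hedged sketch: switch the execution engine for the current session,
-- rerun the failing statement, then switch back.
SET hive.execution.engine=mr;
-- ... rerun the failing ALTER TABLE / query here ...
SET hive.execution.engine=tez;
```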
)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
]
DAG failed due to vertex failure. failedVertices:1 killedVertices:0
FAILED: Execution Error, return code 2 from
org.apache.hadoop.hive.ql.exec.DDLTask
BR,
Patcharee
character '' not supported here
line 1:141 character '' not supported here
line 1:142 character '' not supported here
line 1:143 character '' not supported here
line 1:144 character '' not supported here
line 1:145 character '' not supported here
line 1:146 character '' not supported here
BR,
Patcharee
This query works: select * from table limit 5; but other queries do not. So?
Patcharee
On 18. juli 2015 12:08, Nitin Pawar wrote:
can you do select * from table limit 5;
On Sat, Jul 18, 2015 at 3:35 PM, patcharee patcharee.thong...@uni.no
mailto:patcharee.thong...@uni.no wrote:
Hi,
I am using
data, like select count(*) from Table, any more; I just get the error "line 1:1
character '' not supported here", no matter whether the engine is Tez or MR.
How can you solve the problem in your case?
BR,
Patcharee
On 18. juli 2015 21:26, Nitin Pawar wrote:
Can you tell exactly what steps you did?
Also, did you
the whole table, not one partition at a time?
Thanks,
Patcharee
while it is supposed to be
- Type: struct<date:int,hh:int,x:int,y:int>
Any ideas how this happened and how I can fix it? Please advise.
BR,
Patcharee
Hi,
It works with non-partitioned ORC, but does not work with (2-column)
partitioned ORC.
Thanks,
Patcharee
On 09. nov. 2015 10:55, Elliot West wrote:
Hi,
You can create a table and point the location property to the folder
containing your ORC file:
CREATE EXTERNAL TABLE orc_table
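The DDL above is truncated; a hedged sketch of the shape Elliot is describing follows, with a hypothetical location and a column list borrowed from the struct type mentioned elsewhere in the thread:

```sql
-- Hedged sketch: expose an existing ORC directory to Hive.
-- Column names/types and the location are illustrative, not authoritative.
CREATE EXTERNAL TABLE orc_table (
  date INT,
  hh   INT,
  x    INT,
  y    INT
)
STORED AS ORC
LOCATION 'hdfs:///path/to/orc/folder';
```

Because the table is EXTERNAL, dropping it later removes only the metadata, not the ORC files themselves.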
Hi,
How can I query an ORC file (*.orc) with Hive? This ORC file was created by
another app, like Spark or MR.
Thanks,
Patcharee
Hi,
For the ORC format, in which scenarios is a bloom filter better than the
min/max index?
Best,
Patcharee
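Roughly speaking, the built-in min/max index prunes well for range predicates on columns whose values cluster within stripes, while a bloom filter helps point lookups (col = value) on high-cardinality columns where the min/max ranges are too wide to skip anything. A hedged sketch of enabling one, with hypothetical table and column names:

```sql
-- Hedged sketch: add a bloom filter on a high-cardinality lookup column.
-- Table/column names and the fpp value are illustrative.
CREATE TABLE events (
  user_id STRING,
  ts      BIGINT
)
STORED AS ORC
TBLPROPERTIES (
  'orc.bloom.filter.columns' = 'user_id',
  'orc.bloom.filter.fpp'     = '0.05'
);
```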
Hi,
It works after I ran ALTER TABLE ... ADD PARTITION. Thanks!
My partitioned ORC files (directories) are created by Spark, so Hive
is not aware of the partitions automatically.
Best,
Patcharee
On 13. nov. 2015 13:08, Elliot West wrote:
Have you added the partitions to the meta store?
ALTER TABLE
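Elliot's statement is truncated above; a hedged sketch of registering Spark-written partitions with the metastore, with hypothetical names reusing the partition path seen earlier in the thread:

```sql
-- Hedged sketch: tell the metastore about a partition that already
-- exists on disk. Table name and partition spec are illustrative.
ALTER TABLE orc_table ADD IF NOT EXISTS PARTITION (st=0.8)
  LOCATION 'hdfs:///apps/hive/warehouse/orc_merge5a/st=0.8';

-- Or discover every partition directory under the table path in one pass:
MSCK REPAIR TABLE orc_table;
```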
rom org.apache.hadoop.hive.ql.exec.DDLTask. GC overhead limit
exceeded (state=08S01,code=1)
How to solve this? How to identify if this error is from the client
(beeline) or from hiveserver2?
Thanks,
Patcharee
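"GC overhead limit exceeded" is a JVM heap problem; if it is raised by the DDLTask it is usually thrown server-side, and the hiveserver2 log will contain the matching stack trace, which distinguishes it from a beeline-side failure. A hedged sketch of raising the heap, with sizes as placeholders:

```shell
# Hedged sketch: raise the JVM heap. The -Xmx values are placeholders.
# For the Hive client JVM:
export HADOOP_CLIENT_OPTS="-Xmx4g"

# For HiveServer2 itself, the heap is typically set in hive-env.sh, e.g.:
# export HADOOP_OPTS="$HADOOP_OPTS -Xmx8g"
echo "$HADOOP_CLIENT_OPTS"
```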
Hi,
How can I import a .sql file into Hive?
Best, Patcharee
I exported a SQL table into a .sql file and would like to import it into Hive.
Best, Patcharee
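If the .sql file contains SQL statements, it can be fed to the Hive CLI or beeline directly; note that a dump from another RDBMS is usually not valid HiveQL as-is, so the DDL/INSERT statements may need editing first. A hedged sketch, with the file name and connection URL as placeholders:

```shell
# Hedged sketch: run a file of SQL statements against Hive.
# File name and JDBC URL are hypothetical.
hive -f export.sql

# Or through HiveServer2:
beeline -u jdbc:hive2://localhost:10000 -f export.sql
```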
On 23. nov. 2016 10:40, Markovitz, Dudu wrote:
Hi Patcharee
The question is not clear.
Dudu
-Original Message-
From: patcharee [mailto:patcharee.thong...@uni.no]
Sent: Wednesday, November 23
After I changed org.apache.hcatalog.pig.HCatStorer() to
org.apache.hive.hcatalog.pig.HCatStorer(), it worked.
Patcharee
On 01/14/2015 02:57 PM, Patcharee Thongtra wrote:
Hi,
I am having a weird problem. I created a table in orc format:
Create table
Hi,
I have a Hive table with a column whose name was changed. Pig is not
able to load data from this column; it is all empty.
Any ideas how to fix it?
BR,
Patcharee
taken: 14.262 seconds
hive> select long from test_float;
select long from test_float
Status: Finished successfully
OK
-41.338276
Time taken: 6.843 seconds, Fetched: 1 row(s)
Any ideas? I am using hive version 0.13.
BR,
Patcharee
It works. Thanks!
Patcharee
On 01/13/2015 10:15 AM, Devopam Mittra wrote:
please try the following and report observation:
WHERE long = CAST(-41.338276 AS FLOAT)
regards
Devopam
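The underlying cause is that a decimal literal like -41.338276 is parsed as a DOUBLE, which is not bit-for-bit equal to the stored FLOAT value, so the comparison silently matches nothing; casting the literal to FLOAT, as Devopam suggests, makes the representations agree. A sketch using the table from this thread:

```sql
-- The literal is a DOUBLE by default, so this can return no rows
-- even though the value is visibly present in the table:
SELECT * FROM test_float WHERE long = -41.338276;

-- Casting the literal to FLOAT makes the comparison match:
SELECT * FROM test_float WHERE long = CAST(-41.338276 AS FLOAT);
```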
On Tue, Jan 13, 2015 at 2:25 PM, Patcharee Thongtra
patcharee.thong...@uni.no mailto:patcharee.thong...@uni.no
?
BR,
Patcharee
ddl page, it seems
only bucketed tables can be sorted.
Any suggestions, please?
BR,
Patcharee
It works on the Hive CLI.
Patcharee
On 10/24/2016 11:51 AM, Mich Talebzadeh wrote:
Does this work OK through the Hive CLI?
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com
nder why I got this error because I query just ONE line. Any ideas?
Thanks,
Patcharee