I suspect this is happening because the underlying table is transactional
(ACID) and has accumulated delta files from updates and deletes. Spark
cannot read those delta files directly; the table needs to be compacted
first.
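If that is the cause, a major compaction in Hive should merge the delta files into the base files, after which Spark can read the table. A minimal sketch, run from Hive (beeline or the hive CLI), not from Spark; the database and table names here are assumptions:

```sql
-- Trigger a major compaction on the transactional table
-- (mydb.my_trx_table is a hypothetical name).
ALTER TABLE mydb.my_trx_table COMPACT 'major';

-- Monitor progress; the request should move from 'initiated'
-- through 'working' to 'succeeded'.
SHOW COMPACTIONS;
```

Compaction runs asynchronously via the Hive metastore's compactor threads, so it may take a while before the delta directories disappear from HDFS.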

Can you run the following and check whether the table location contains
delta_* subdirectories?

hdfs dfs -ls <LOCATION of table in HDFS>


Also, can you query a normal (i.e. non-transactional) Hive table from
Spark SQL?
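If a non-transactional table reads fine, one workaround until compaction has run is to materialise a non-transactional copy in Hive and point Spark SQL at that instead. A sketch, with hypothetical names:

```sql
-- In Hive: copy the ACID table into a plain (non-transactional) ORC table.
-- mydb.my_trx_table and mydb.my_trx_table_copy are assumed names.
CREATE TABLE mydb.my_trx_table_copy STORED AS ORC
AS SELECT * FROM mydb.my_trx_table;
```

Hive resolves the deltas when it reads the source table, so the copy contains the current data and Spark can read it as an ordinary ORC table.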

HTH

Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 8 August 2016 at 19:51, Deepak Sharma <deepakmc...@gmail.com> wrote:

> Can you please post the code snippet and the error you are getting ?
>
> -Deepak
>
> On 9 Aug 2016 12:18 am, "manish jaiswal" <manishsr...@gmail.com> wrote:
>
>> Hi,
>>
>> I am not able to read data from a Hive transactional table using Spark
>> SQL. (I don't want to read it via the Hive JDBC connection.)
>>
>>
>>
>> Please help.
>>
>
