Hi,

Have you raised a Jira for this?

Thanks

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 10 August 2016 at 08:54, Chanh Le <giaosu...@gmail.com> wrote:

> Hi Gene,
> It's a Spark 2.0 issue.
> I switched back to Spark 1.6.1 and it's OK now.
>
> Thanks.
>
> On Thursday, July 28, 2016 at 4:25:48 PM UTC+7, Chanh Le wrote:
>>
>> Hi everyone,
>>
>> I have a problem when I create an external table in the Spark Thrift
>> Server (STS) and query the data.
>>
>> Scenario:
>> Spark 2.0
>> Alluxio 1.2.0
>> Zeppelin 0.7.0
>> STS start script:
>> /home/spark/spark-2.0.0-bin-hadoop2.6/sbin/start-thriftserver.sh \
>>   --master mesos://zk://master1:2181,master2:2181,master3:2181/mesos \
>>   --conf spark.driver.memory=5G \
>>   --conf spark.scheduler.mode=FAIR \
>>   --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 \
>>   --jars /home/spark/spark-2.0.0-bin-hadoop2.6/jars/alluxio-core-client-spark-1.2.0-jar-with-dependencies.jar \
>>   --total-executor-cores 35 spark-internal \
>>   --hiveconf hive.server2.thrift.port=10000 \
>>   --hiveconf hive.metastore.warehouse.dir=/user/hive/warehouse \
>>   --hiveconf hive.metastore.metadb.dir=/user/hive/metadb \
>>   --conf spark.sql.shuffle.partitions=20
>>
>> I have a file stored in Alluxio at alluxio://master2:19998/etl_info/TOPIC
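>>
>> As a quick sanity check, the directory can be listed with the Alluxio
>> shell (the install path below is only an assumption):
>> /opt/alluxio-1.2.0/bin/alluxio fs ls /etl_info/TOPIC
>> which should show the Parquet part files of the table.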
>>
>> Then I create a table in STS with:
>> CREATE EXTERNAL TABLE topic (topic_id int, topic_name_vn String,
>> topic_name_en String, parent_id int, full_parent String, level_id int)
>> STORED AS PARQUET LOCATION 'alluxio://master2:19998/etl_info/TOPIC';
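>>
>> The table can then be queried through STS with any JDBC client, e.g.
>> beeline (the host here is an assumption; the port follows the start
>> script above):
>> beeline -u jdbc:hive2://master2:10000 -e "SELECT COUNT(*) FROM topic"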
>>
>> To compare STS with Spark, I create a temp table named topics:
>> spark.sqlContext.read.parquet("alluxio://master2:19998/etl_info/TOPIC").registerTempTable("topics")
>>
>> Then I run the same query against both and compare.
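>>
>> For example, a minimal comparison in Scala (the COUNT(*) query is just an
>> illustration; any query can be used):
>> // via the Spark session, against the temp view
>> spark.sql("SELECT COUNT(*) FROM topics").show()
>> // the same SELECT COUNT(*) FROM topic is then run through STS
>> // (e.g. from beeline or the Zeppelin JDBC interpreter) and compared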
>>
>>
>> As you can see, the results are different.
>> Is that a bug, or did I do something wrong?
>>
>> Regards,
>> Chanh
>>
>
>
