Hey QiangCai,
Thank you for your reply. I have Spark 1.6.2 and also tried building with
-Dspark.version=1.6.2, but the result is the same: I still get the same
exception.
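
For reference, I am building roughly like this (the -Pspark-1.6 profile
name follows the CarbonData build docs; yours may differ):

    # build against Spark 1.6.x (profile name per the CarbonData build docs)
    mvn clean package -DskipTests -Pspark-1.6 -Dspark.version=1.6.2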

Could this exception happen if my Scala version differs from the one Spark
was built with?
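
To check that on my side, here is a minimal sketch for printing which Scala
version is actually on the classpath (scala.util.Properties is standard
library; Spark 1.6.x defaults to Scala 2.10):

    object ScalaVersionCheck {
      def main(args: Array[String]): Unit = {
        // Prints the Scala library version on the classpath, e.g. "version 2.10.5".
        // If this differs from the Scala version the CarbonData jar was compiled
        // against, errors in catalyst/reflect classes would not be surprising.
        println(scala.util.Properties.versionString)
      }
    }

Typing util.Properties.versionString at the spark-shell prompt shows the
same thing for the version Spark itself is running on.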


