Below is the discussion between Imran and me.

2015-01-18 4:12 GMT+08:00 Chunnan Yao <yaochun...@gmail.com>:

> Thank you for your patience! I'm not yet familiar with the mailing list.
> I just clicked "reply" in Gmail, thinking it would automatically be
> sent to the list. I will post the missing information to the
> spark-user list later :) Your suggestions really help!
>
> 2015-01-18 4:05 GMT+08:00 Imran Rashid <iras...@cloudera.com>:
>
>> ah, that is a very different question.  The point of using Intellij
>> really is not to build the fully packaged binaries -- it's really just for
>> the features of the IDE while developing, e.g. code navigation, the debugger,
>> etc.  Use either sbt or maven for building the packaged binaries.  (There
>> probably is some way to get Intellij to build the packages, by calling
>> maven, but I can't see much advantage in doing that.)
>>
>> The instructions for building are here:
>>
>> https://spark.apache.org/docs/latest/building-spark.html#building-with-sbt
>>
>> e.g., for building with maven, you can do:
>>
>> export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
>>
>> mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package
>>
>>
>> or for sbt, you can do:
>>
>> sbt/sbt -Pyarn -Phadoop-2.3 assembly
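>>
>> and if what you actually want is a deployable tarball like the pre-built
>> downloads, the make-distribution.sh script at the root of the source tree
>> wraps the maven build for that.  roughly something like this (the exact
>> flags are from memory, so double-check the script's usage message):
>>
>> ./make-distribution.sh --tgz -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0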
>>
>>
>>
>> btw, these replies are only going to me, not the spark-user list -- not
>> sure if that was intentional?
>>
>> hope this helps,
>> Imran
>>
>> On Sat, Jan 17, 2015 at 11:00 AM, Chunnan Yao <yaochun...@gmail.com>
>> wrote:
>>
>>> Thank you very much for your reply! But how can I generate a deployable
>>> Spark binary package like the pre-built packages? I am new to Maven.
>>>
>>> 2015-01-18 1:44 GMT+08:00 Imran Rashid <iras...@cloudera.com>:
>>>
>>>> The build output location is set by the maven build, which is
>>>>
>>>> <sub-project>/target/scala-<version>/[test-]classes/
>>>>
>>>> e.g.
>>>>
>>>> core/target/scala-2.10/classes/
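>>>>
>>>> so, to check whether a rebuild actually produced anything, you can just
>>>> look there -- e.g. something like
>>>>
>>>> ls core/target/scala-2.10/classes/org/apache/spark/
>>>>
>>>> should list the compiled .class files (and subpackages) once core has built.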
>>>>
>>>>
>>>> On Sat, Jan 17, 2015 at 8:51 AM, Chunnan Yao <yaochun...@gmail.com>
>>>> wrote:
>>>>
>>>>> I don't know if it's a naive question.  The fix (moving the
>>>>> paradise jar from "Additional compiler options" to "compiler plugins")
>>>>> works fine: it has removed all the errors I was facing, though it still
>>>>> leaves 83 warnings. But I cannot find my compile results, which should be
>>>>> in the /spark-1.2.0/out directory. What's the problem? The compiler
>>>>> told me the compilation completed successfully.
>>>>>
>>>>> 2015-01-17 23:28 GMT+08:00 Imran Rashid <iras...@cloudera.com>:
>>>>>
>>>>>> I experienced these errors in Intellij even without the hive mode
>>>>>> enabled.  I think it's also a question of which project you are trying to
>>>>>> compile.  e.g., core built fine, but I got these errors when I tried to
>>>>>> build sql.  If the fix works for you (moving the paradise jar from
>>>>>> "Additional compiler options" to "compiler plugins") then we should
>>>>>> definitely put it on the wiki.
>>>>>>
>>>>>> writing macros is really painful without quasiquotes, so it is probably
>>>>>> worth it ...
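>>>>>>
>>>>>> (for anyone not familiar with them: q"..." builds a scala syntax tree
>>>>>> rather than a string, which is what catalyst's code generation relies on.
>>>>>> a minimal sketch -- assuming scala 2.10 with scala-reflect on the
>>>>>> classpath and the paradise plugin enabled; without the plugin you get
>>>>>> exactly the "value q is not a member of StringContext" error below:
>>>>>>
>>>>>> import scala.reflect.runtime.universe._
>>>>>>
>>>>>> object QuasiquoteSketch {
>>>>>>   def main(args: Array[String]): Unit = {
>>>>>>     val tupleLength = 3
>>>>>>     // q"..." produces a Tree; $tupleLength is spliced in as a literal
>>>>>>     val lengthDef: Tree = q"final val length = $tupleLength"
>>>>>>     println(show(lengthDef))  // prints roughly: final val length = 3
>>>>>>   }
>>>>>> }
>>>>>>
>>>>>> on 2.11 the q interpolator ships with scala-reflect, which is why the
>>>>>> extra compiler plugin is only needed on 2.10.)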
>>>>>>
>>>>>> On Sat, Jan 17, 2015 at 2:34 AM, Sean Owen <so...@cloudera.com>
>>>>>> wrote:
>>>>>>
>>>>>>> Yes, I've seen that error in the past too, and was just talking to
>>>>>>> Imran the other day about it. I thought it only occurred when the
>>>>>>> hive module was enabled, which I don't enable.
>>>>>>>
>>>>>>> The problem is that the scalac plugin that causes the error in IntelliJ
>>>>>>> is exactly what parses these values.* I think he got it to work with
>>>>>>> this change:
>>>>>>> http://stackoverflow.com/questions/26788367/quasiquotes-in-intellij-14/26908554#26908554
>>>>>>>
>>>>>>> If that works for you let's put it on the wiki.
>>>>>>>
>>>>>>> * probably an ignorant question, but is this feature important enough
>>>>>>> to warrant the extra scala compiler plugin? the quasiquotes syntax, I
>>>>>>> mean.
>>>>>>>
>>>>>>> On Sat, Jan 17, 2015 at 10:29 AM, Chunnan Yao <yaochun...@gmail.com>
>>>>>>> wrote:
>>>>>>> > I followed the procedure described at
>>>>>>> > https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-IntelliJ
>>>>>>> > but problems still occur, which has made me a bit annoyed.
>>>>>>> >
>>>>>>> > My environment is: Java 1.7.0, Scala 2.10.4, Spark 1.2.0,
>>>>>>> > IntelliJ IDEA 14.0.2, Ubuntu 14.04.
>>>>>>> >
>>>>>>> > First, I got the Scala plugin installed correctly.
>>>>>>> >
>>>>>>> > I chose maven-3, hadoop-2.4, and scala-2.10 as my profiles when
>>>>>>> > importing the project.
>>>>>>> >
>>>>>>> > After importing, I first opened "View - Tool Windows - Maven Projects".
>>>>>>> > I saw that "hbase-hadoop1" was selected, although I had not chosen it
>>>>>>> > during the import. So I deselected it, leaving hadoop-2.4, maven-3, and
>>>>>>> > scala-2.10 as the only three selected items under "Maven Projects - Profiles".
>>>>>>> >
>>>>>>> > According to the wiki, the next step is "Generate Sources and Update
>>>>>>> > Folders For All Projects". I did so and waited a few minutes for the
>>>>>>> > sub-projects to be prepared.
>>>>>>> >
>>>>>>> > Then I cleared the "Additional compiler options" in
>>>>>>> > "File - Settings - Build, Execution, Deployment - Compiler - Scala Compiler".
>>>>>>> >
>>>>>>> > Finally, I chose Build - Rebuild Project.
>>>>>>> >
>>>>>>> > However, the compilation failed with "value q is not a member of
>>>>>>> > StringContext" errors.
>>>>>>> >
>>>>>>> > (partial compiler output)
>>>>>>> > -------------------------------------------------------
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/codegen/GenerateProjection.scala
>>>>>>> > Error:(42, 21) value q is not a member of StringContext
>>>>>>> >     val lengthDef = q"final val length = $tupleLength"
>>>>>>> >                     ^
>>>>>>> > Error:(54, 7) value q is not a member of StringContext
>>>>>>> >       q"""
>>>>>>> >       ^
>>>>>>> > Error:(66, 9) value q is not a member of StringContext
>>>>>>> >         q"""
>>>>>>> >         ^
>>>>>>> > Error:(83, 9) value q is not a member of StringContext
>>>>>>> >         q"if(isNullAt($iLit)) { null } else {
>>>>>>> ${newTermName(s"c$i")} }"
>>>>>>> >         ^
>>>>>>> > Error:(85, 7) value q is not a member of StringContext
>>>>>>> >       q"override def iterator = Iterator[Any](..$allColumns)"
>>>>>>> >       ^
>>>>>>> > Error:(88, 27) value q is not a member of StringContext
>>>>>>> >     val accessorFailure = q"""scala.sys.error("Invalid ordinal:" +
>>>>>>> i)"""
>>>>>>> >                           ^
>>>>>>> > Error:(95, 9) value q is not a member of StringContext
>>>>>>> >         q"if(i == $ordinal) { if(isNullAt($i)) return null else
>>>>>>> return
>>>>>>> > $elementName }"
>>>>>>> >         ^
>>>>>>> > Error:(97, 7) value q is not a member of StringContext
>>>>>>> >       q"override def apply(i: Int): Any = { ..$cases;
>>>>>>> $accessorFailure }"
>>>>>>> >       ^
>>>>>>> > Error:(106, 9) value q is not a member of StringContext
>>>>>>> >         q"""
>>>>>>> >         ^
>>>>>>> > Error:(117, 7) value q is not a member of StringContext
>>>>>>> >       q"override def update(i: Int, value: Any): Unit = { ..$cases;
>>>>>>> > $accessorFailure }"
>>>>>>> >       ^
>>>>>>> > Error:(126, 11) value q is not a member of StringContext
>>>>>>> >           q"if(i == $i) return $elementName" :: Nil
>>>>>>> >           ^
>>>>>>> > Error:(130, 7) value q is not a member of StringContext
>>>>>>> >       q"""
>>>>>>> >       ^
>>>>>>> > Error:(143, 11) value q is not a member of StringContext
>>>>>>> >           q"if(i == $i) { nullBits($i) = false; $elementName =
>>>>>>> value; return
>>>>>>> > }" :: Nil
>>>>>>> >           ^
>>>>>>> > Error:(147, 7) value q is not a member of StringContext
>>>>>>> >       q"""
>>>>>>> >       ^
>>>>>>> > Error:(157, 29) value q is not a member of StringContext
>>>>>>> >         case BooleanType => q"if ($elementName) 0 else 1"
>>>>>>> >                             ^
>>>>>>> > Error:(158, 52) value q is not a member of StringContext
>>>>>>> >         case ByteType | ShortType | IntegerType =>
>>>>>>> q"$elementName.toInt"
>>>>>>> >                                                    ^
>>>>>>> > Error:(159, 26) value q is not a member of StringContext
>>>>>>> >         case LongType => q"($elementName ^ ($elementName >>>
>>>>>>> 32)).toInt"
>>>>>>> >                          ^
>>>>>>> > Error:(160, 27) value q is not a member of StringContext
>>>>>>> >         case FloatType =>
>>>>>>> q"java.lang.Float.floatToIntBits($elementName)"
>>>>>>> >                           ^
>>>>>>> > Error:(162, 11) value q is not a member of StringContext
>>>>>>> >           q"{ val b =
>>>>>>> java.lang.Double.doubleToLongBits($elementName); (b ^
>>>>>>> > (b >>>32)).toInt }"
>>>>>>> >           ^
>>>>>>> > Error:(163, 19) value q is not a member of StringContext
>>>>>>> >         case _ => q"$elementName.hashCode"
>>>>>>> >                   ^
>>>>>>> > Error:(165, 7) value q is not a member of StringContext
>>>>>>> >       q"if (isNullAt($i)) 0 else $nonNull"
>>>>>>> >       ^
>>>>>>> > Error:(168, 54) value q is not a member of StringContext
>>>>>>> >     val hashUpdates: Seq[Tree] = hashValues.map(v => q"""result =
>>>>>>> 37 *
>>>>>>> > result + $v""": Tree)
>>>>>>> >                                                      ^
>>>>>>> > Error:(171, 7) value q is not a member of StringContext
>>>>>>> >       q"""
>>>>>>> >       ^
>>>>>>> > Error:(181, 7) value q is not a member of StringContext
>>>>>>> >       q"if (this.$elementName != specificType.$elementName) return
>>>>>>> false"
>>>>>>> >       ^
>>>>>>> > Error:(185, 7) value q is not a member of StringContext
>>>>>>> >       q"""
>>>>>>> >       ^
>>>>>>> > Error:(195, 7) value q is not a member of StringContext
>>>>>>> >       q"""
>>>>>>> >       ^
>>>>>>> > Error:(210, 16) value q is not a member of StringContext
>>>>>>> >     val code = q"""
>>>>>>> >                ^
>>>>>>> >
>>>>>>> ---------------------------------------------------------------------
>>>>>>> >
>>>>>>> > Then I tried
>>>>>>> > http://stackoverflow.com/questions/26995023/errorscalac-bad-option-p-intellij-idea
>>>>>>> > which does not clear the "Additional compiler options" but changes the
>>>>>>> > -P option to -Xplugin. So now my "Additional compiler options" field is:
>>>>>>> >
>>>>>>> > "-Xplugin:/home/yaochunnan/.m2/repository/org/scalamacros/paradise_2.10.4/2.0.1/paradise_2.10.4-2.0.1.jar"
>>>>>>> >
>>>>>>> > Then I rebuilt again, with the following errors:
>>>>>>> > (partial compiler output)
>>>>>>> >
>>>>>>> ----------------------------------------------------------------------
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala
>>>>>>> > Error:(169, 38) not found: value HiveShim
>>>>>>> >
>>>>>>>  Option(tableParameters.get(HiveShim.getStatsSetupConstTotalSize))
>>>>>>> >                                      ^
>>>>>>> > Error:(177, 31) not found: value HiveShim
>>>>>>> >           tableParameters.put(HiveShim.getStatsSetupConstTotalSize,
>>>>>>> > newTotalSize.toString)
>>>>>>> >                               ^
>>>>>>> > Error:(292, 36) not found: value HiveShim
>>>>>>> >       val proc: CommandProcessor =
>>>>>>> > HiveShim.getCommandProcessor(Array(tokens(0)), hiveconf)
>>>>>>> >                                    ^
>>>>>>> > Error:(304, 25) not found: value HiveShim
>>>>>>> >           val results = HiveShim.createDriverResultsArray
>>>>>>> >                         ^
>>>>>>> > Error:(314, 11) not found: value HiveShim
>>>>>>> >           HiveShim.processResults(results)
>>>>>>> >           ^
>>>>>>> > Error:(418, 7) not found: value HiveShim
>>>>>>> >
>>>>>>>  HiveShim.createDecimal(decimal.toBigDecimal.underlying()).toString
>>>>>>> >       ^
>>>>>>> > Error:(420, 7) not found: value HiveShim
>>>>>>> >       HiveShim.createDecimal(decimal.underlying()).toString
>>>>>>> >       ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveInspectors.scala
>>>>>>> > Error:(97, 7) not found: value HiveShim
>>>>>>> >       HiveShim.toCatalystDecimal(
>>>>>>> >       ^
>>>>>>> > Error:(123, 46) not found: value HiveShim
>>>>>>> >     case hdoi: HiveDecimalObjectInspector =>
>>>>>>> > HiveShim.toCatalystDecimal(hdoi, data)
>>>>>>> >                                              ^
>>>>>>> > Error:(156, 19) not found: value HiveShim
>>>>>>> >       (o: Any) =>
>>>>>>> >
>>>>>>> HiveShim.createDecimal(o.asInstanceOf[Decimal].toBigDecimal.underlying())
>>>>>>> >                   ^
>>>>>>> > Error:(210, 31) not found: value HiveShim
>>>>>>> >         case b: BigDecimal =>
>>>>>>> HiveShim.createDecimal(b.underlying())
>>>>>>> >                               ^
>>>>>>> > Error:(211, 28) not found: value HiveShim
>>>>>>> >         case d: Decimal =>
>>>>>>> > HiveShim.createDecimal(d.toBigDecimal.underlying())
>>>>>>> >                            ^
>>>>>>> > Error:(283, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getStringWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(285, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getIntWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(287, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getDoubleWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(289, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getBooleanWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(291, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getLongWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(293, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getFloatWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(295, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getShortWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(297, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getByteWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(299, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getBinaryWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(301, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getDateWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(303, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getTimestampWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(305, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getDecimalWritableConstantObjectInspector(value)
>>>>>>> >       ^
>>>>>>> > Error:(307, 7) not found: value HiveShim
>>>>>>> >       HiveShim.getPrimitiveNullWritableConstantObjectInspector
>>>>>>> >       ^
>>>>>>> > Error:(363, 51) not found: value HiveShim
>>>>>>> >     case w: WritableHiveDecimalObjectInspector =>
>>>>>>> > HiveShim.decimalTypeInfoToCatalyst(w)
>>>>>>> >                                                   ^
>>>>>>> > Error:(364, 47) not found: value HiveShim
>>>>>>> >     case j: JavaHiveDecimalObjectInspector =>
>>>>>>> > HiveShim.decimalTypeInfoToCatalyst(j)
>>>>>>> >                                               ^
>>>>>>> > Error:(393, 30) not found: value HiveShim
>>>>>>> >       case d: DecimalType => HiveShim.decimalTypeInfo(d)
>>>>>>> >                              ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala
>>>>>>> > Error:(78, 11) not found: value HiveShim
>>>>>>> >           HiveShim.getAllPartitionsOf(client, table).toSeq
>>>>>>> >           ^
>>>>>>> > Error:(205, 7) not found: value HiveShim
>>>>>>> >       HiveShim.setLocation(tbl, crtTbl)
>>>>>>> >       ^
>>>>>>> > Error:(443, 28) not found: value HiveShim
>>>>>>> >     case d: DecimalType => HiveShim.decimalMetastoreString(d)
>>>>>>> >                            ^
>>>>>>> > Error:(472, 53) not found: value HiveShim
>>>>>>> >       val totalSize =
>>>>>>> > hiveQlTable.getParameters.get(HiveShim.getStatsSetupConstTotalSize)
>>>>>>> >                                                     ^
>>>>>>> > Error:(490, 19) not found: value HiveShim
>>>>>>> >   val tableDesc = HiveShim.getTableDesc(
>>>>>>> >                   ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUdfs.scala
>>>>>>> > Error:(255, 18) not found: type HiveFunctionWrapper
>>>>>>> >     funcWrapper: HiveFunctionWrapper,
>>>>>>> >                  ^
>>>>>>> > Error:(73, 53) not found: type HiveFunctionWrapper
>>>>>>> > private[hive] case class HiveSimpleUdf(funcWrapper:
>>>>>>> HiveFunctionWrapper,
>>>>>>> > children: Seq[Expression])
>>>>>>> >                                                     ^
>>>>>>> > Error:(57, 25) not found: type HiveFunctionWrapper
>>>>>>> >       HiveSimpleUdf(new HiveFunctionWrapper(functionClassName),
>>>>>>> children)
>>>>>>> >                         ^
>>>>>>> > Error:(134, 54) not found: type HiveFunctionWrapper
>>>>>>> > private[hive] case class HiveGenericUdf(funcWrapper:
>>>>>>> HiveFunctionWrapper,
>>>>>>> > children: Seq[Expression])
>>>>>>> >                                                      ^
>>>>>>> > Error:(59, 26) not found: type HiveFunctionWrapper
>>>>>>> >       HiveGenericUdf(new HiveFunctionWrapper(functionClassName),
>>>>>>> children)
>>>>>>> >                          ^
>>>>>>> > Error:(185, 18) not found: type HiveFunctionWrapper
>>>>>>> >     funcWrapper: HiveFunctionWrapper,
>>>>>>> >                  ^
>>>>>>> > Error:(62, 27) not found: type HiveFunctionWrapper
>>>>>>> >       HiveGenericUdaf(new HiveFunctionWrapper(functionClassName),
>>>>>>> children)
>>>>>>> >                           ^
>>>>>>> > Error:(214, 18) not found: type HiveFunctionWrapper
>>>>>>> >     funcWrapper: HiveFunctionWrapper,
>>>>>>> >                  ^
>>>>>>> > Error:(64, 20) not found: type HiveFunctionWrapper
>>>>>>> >       HiveUdaf(new HiveFunctionWrapper(functionClassName),
>>>>>>> children)
>>>>>>> >                    ^
>>>>>>> > Error:(66, 27) not found: type HiveFunctionWrapper
>>>>>>> >       HiveGenericUdtf(new HiveFunctionWrapper(functionClassName),
>>>>>>> Nil,
>>>>>>> > children)
>>>>>>> >                           ^
>>>>>>> > Error:(322, 18) not found: type HiveFunctionWrapper
>>>>>>> >     funcWrapper: HiveFunctionWrapper,
>>>>>>> >                  ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveQl.scala
>>>>>>> > Error:(1132, 15) not found: type HiveFunctionWrapper
>>>>>>> >           new HiveFunctionWrapper(functionName),
>>>>>>> >               ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala
>>>>>>> > Error:(44, 34) object HiveShim is not a member of package
>>>>>>> > org.apache.spark.sql.hive
>>>>>>> > import org.apache.spark.sql.hive.HiveShim._
>>>>>>> >                                  ^
>>>>>>> > Error:(43, 8) object ShimFileSinkDesc is not a member of package
>>>>>>> > org.apache.spark.sql.hive
>>>>>>> > import org.apache.spark.sql.hive.{ ShimFileSinkDesc =>
>>>>>>> FileSinkDesc}
>>>>>>> >        ^
>>>>>>> > Error:(76, 21) not found: type FileSinkDesc
>>>>>>> >       fileSinkConf: FileSinkDesc,
>>>>>>> >                     ^
>>>>>>> > Error:(142, 23) not found: value HiveShim
>>>>>>> >     val tmpLocation = HiveShim.getExternalTmpPath(hiveContext,
>>>>>>> > tableLocation)
>>>>>>> >                       ^
>>>>>>> > Error:(143, 28) not found: type FileSinkDesc
>>>>>>> >     val fileSinkConf = new FileSinkDesc(tmpLocation.toString,
>>>>>>> tableDesc,
>>>>>>> > false)
>>>>>>> >                            ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala
>>>>>>> > Error:(141, 22) not found: value HiveShim
>>>>>>> >       val partPath = HiveShim.getDataLocationPath(partition)
>>>>>>> >                      ^
>>>>>>> > Error:(298, 33) not found: value HiveShim
>>>>>>> >             row.update(ordinal, HiveShim.toCatalystDecimal(oi,
>>>>>>> value))
>>>>>>> >                                 ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/TestHive.scala
>>>>>>> > Error:(384, 3) not found: value HiveShim
>>>>>>> >   HiveShim.createDefaultDBIfNeeded(this)
>>>>>>> >   ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/DescribeHiveTableCommand.scala
>>>>>>> > Error:(29, 8) object HiveShim is not a member of package
>>>>>>> > org.apache.spark.sql.hive
>>>>>>> > import org.apache.spark.sql.hive.HiveShim
>>>>>>> >        ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/HiveTableScan.scala
>>>>>>> > Error:(89, 5) not found: value HiveShim
>>>>>>> >     HiveShim.appendReadColumns(hiveConf, neededColumnIDs,
>>>>>>> > attributes.map(_.name))
>>>>>>> >     ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveWriterContainers.scala
>>>>>>> > Error:(38, 34) object HiveShim is not a member of package
>>>>>>> > org.apache.spark.sql.hive
>>>>>>> > import org.apache.spark.sql.hive.HiveShim._
>>>>>>> >                                  ^
>>>>>>> > Error:(37, 8) object ShimFileSinkDesc is not a member of package
>>>>>>> > org.apache.spark.sql.hive
>>>>>>> > import org.apache.spark.sql.hive.{ShimFileSinkDesc => FileSinkDesc}
>>>>>>> >        ^
>>>>>>> > Error:(174, 19) not found: type FileSinkDesc
>>>>>>> >     fileSinkConf: FileSinkDesc,
>>>>>>> >                   ^
>>>>>>> > Error:(46, 19) not found: type FileSinkDesc
>>>>>>> >     fileSinkConf: FileSinkDesc)
>>>>>>> >                   ^
>>>>>>> > Error:(220, 33) not found: type FileSinkDesc
>>>>>>> >       val newFileSinkDesc = new FileSinkDesc(
>>>>>>> >                                 ^
>>>>>>> >
>>>>>>> /home/yaochunnan/workspace/spark_source/spark-1.2.0/sql/hive/src/main/scala/org/apache/spark/sql/hive/parquet/FakeParquetSerDe.scala
>>>>>>> > Warning:(34, 2) @deprecated now takes two arguments; see the
>>>>>>> scaladoc.
>>>>>>> > @deprecated("No code should depend on FakeParquetHiveSerDe as it
>>>>>>> is only
>>>>>>> > intended as a " +
>>>>>>> >  ^
>>>>>>> >
>>>>>>> ------------------------------------------------------------------------
>>>>>>> >
>>>>>>> > I thought it was a problem with the Maven profiles, so I tried
>>>>>>> > reselecting hbase-hadoop1, hive, or hbase-hadoop2. The errors still
>>>>>>> > occur. Please help me; this has annoyed me for a whole afternoon!
>>>>>>> >
>>>>>>> >
>>>>>>> >
>>>>>>> >
>>>>>>> > -----
>>>>>>> > Feel the sparking Spark!
>>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>
>
