http://apache-spark-user-list.1001560.n3.nabble.com/Spark-1-4-RDD-to-DF-fails-with-toDF-tp23499p25874.html
Good point. It is a pre-compiled Spark version. Based on the text on the
downloads page, the answer to your question is no, so I will download the
sources and recompile.
Thanks!
On Tue, Sep 8, 2015 at 5:17 AM, Koert Kuipers wrote:
> is /opt/spark-1.4.1-bin-hadoop2.6 a spark
Compiling from source with Scala 2.11 support fixed this issue. Thanks
again for the help!
On Tue, Sep 8, 2015 at 7:33 AM, Gheorghe Postelnicu <
gheorghe.posteln...@gmail.com> wrote:
> Good point. It is a pre-compiled Spark version. Based on the text on the
> downloads page, the answer to your
Hi,
The following code fails when compiled from SBT:
package main.scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
object TestMain {
  def main(args: Array[String]): Unit = {
    implicit val sparkContext = new SparkContext()
    val sqlContext = new
Try adding the following to your build.sbt:
libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"
I believe Spark shades the Scala library, and it looks like this is a
library you need in an unshaded form.
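For reference, a minimal build.sbt along these lines — a sketch using the versions mentioned in this thread, not a verified configuration; adjust the Spark version and scope to your setup:

```scala
name := "testMain"

scalaVersion := "2.11.6"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix (_2.11); all Spark artifacts
  // must target the same Scala binary version as scalaVersion above
  "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.4.1" % "provided",
  // scala-reflect, unshaded, matching the compiler version
  "org.scala-lang" % "scala-reflect" % "2.11.6"
)
```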
2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <
How are you building and running it?
On Monday, September 7, 2015, Gheorghe Postelnicu <
gheorghe.posteln...@gmail.com> wrote:
> Interesting idea. Tried that, didn't work. Here is my new SBT file:
>
> name := """testMain"""
>
> scalaVersion := "2.11.6"
>
> libraryDependencies ++= Seq(
>
sbt assembly; $SPARK_HOME/bin/spark-submit --class main.scala.TestMain
--master "local[4]" target/scala-2.11/bof-assembly-0.1-SNAPSHOT.jar
using Spark:
/opt/spark-1.4.1-bin-hadoop2.6
On Mon, Sep 7, 2015 at 10:20 PM, Jonathan Coveney
wrote:
> How are you building and
My error was related to Scala version. Upon further reading, I realized
that it takes some effort to get Spark working with Scala 2.11.
I've reverted to using 2.10 and moved past that error. Now I hit the issue
you mentioned. Waiting for 1.4.1.
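As an aside on diagnosing these conflicts: Scala artifacts encode the Scala binary version they were built against in the `_2.xx` suffix of the artifact name (which `%%` in sbt appends automatically). A small sketch of reading that suffix — the jar name here is an example, not one from this thread:

```scala
// The "_2.10" infix in an artifact name is the Scala binary version it was
// compiled against; mixing suffixes on one classpath causes runtime conflicts.
val jar = "spark-core_2.10-1.4.1.jar"  // example artifact name
val scalaBinaryVersion = jar
  .stripPrefix("spark-core_")          // drop the module prefix
  .takeWhile(_ != '-')                 // keep everything up to the version dash
println(scalaBinaryVersion)            // prints 2.10
```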
Srikanth
On Fri, Jun 26, 2015 at 9:10 AM, Roberto
It's a Scala version conflict. Can you paste your build.sbt file?
Thanks
Best Regards
On Fri, Jun 26, 2015 at 7:05 AM, stati srikanth...@gmail.com wrote:
Hello,
When I run a Spark job with spark-submit, it fails with the exception below
for this line of code:
/*val webLogDF =
Are those provided Spark libraries compatible with Scala 2.11?
Thanks
Best Regards
On Fri, Jun 26, 2015 at 4:48 PM, Srikanth srikanth...@gmail.com wrote:
Thanks Akhil for checking this out. Here is my build.sbt.

name := "Weblog Analysis"

version := "1.0"

scalaVersion := "2.11.5"

javacOptions ++= Seq("-source", "1.7", "-target", "1.7")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.4.0",
I got a similar issue. Might yours also be related to
https://issues.apache.org/jira/browse/SPARK-8368 ?
On Fri, Jun 26, 2015 at 2:00 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:
Are those provided Spark libraries compatible with Scala 2.11?
Thanks
Best Regards
On Fri, Jun 26,
Hello,
When I run a Spark job with spark-submit, it fails with the exception below
for this line of code:
/*val webLogDF = webLogRec.toDF().select("ip", "date", "name")*/
I had a similar issue running from spark-shell, then realized that I needed
to import sqlContext.implicits._
Now my code has the following imports
/*
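The rest of that message is cut off in the archive. For context, here is a minimal Spark 1.4-style sketch of the toDF() pattern being discussed — the names (WebLogRecord, weblog.txt, the field layout) are illustrative assumptions, not the poster's actual code:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// Illustrative record type; the case class must be defined at top level,
// outside the method that calls toDF().
case class WebLogRecord(ip: String, date: String, name: String)

object WebLogExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext()
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._  // brings toDF() into scope on RDDs

    val webLogRec = sc.textFile("weblog.txt").map { line =>
      val fields = line.split("\t")
      WebLogRecord(fields(0), fields(1), fields(2))
    }
    val webLogDF = webLogRec.toDF().select("ip", "date", "name")
    webLogDF.show()
  }
}
```

Without the implicits import, toDF() is simply not a member of the RDD, which is the compile error this sub-thread was about.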