output the data (txt)

2016-02-27 Thread Bonsen
I get results from RDDs, like: Array(Array(1,2,3), Array(2,3,4), Array(3,4,6)). How can I output them to 1.txt like:
1 2 3
2 3 4
3 4 6
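A minimal sketch of one way to do this, assuming the nested arrays are already available on the driver (the object name SaveRows is made up for illustration); note that saveAsTextFile writes a directory of part files rather than a single 1.txt:

import org.apache.spark.{SparkConf, SparkContext}

object SaveRows {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("save-rows"))
    val rows = Array(Array(1, 2, 3), Array(2, 3, 4), Array(3, 4, 6))
    sc.parallelize(rows)
      .map(_.mkString(" "))      // "1 2 3", "2 3 4", "3 4 6"
      .coalesce(1)               // one partition, so a single part file
      .saveAsTextFile("1.txt")   // creates a directory named 1.txt
    sc.stop()
  }
}

Alternatively, if the nested arrays already sit in driver memory (for example after a collect), writing a real 1.txt with plain java.io PrintWriter-style output works too.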

deal with the data's structure

2016-02-27 Thread Bonsen
Now I have a map: val ji = scala.collection.mutable.Map[String, scala.collection.mutable.ArrayBuffer[String]](). There is a lot of data, like: ji = Map("a" -> ArrayBuffer("1","2","3"), "b" -> ArrayBuffer("1","2","3"), "c" -> ArrayBuffer("2","3")). If "a" chooses "1", then "b" and "c" can't choose "1", for
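The message is cut off, but reading it as "assign each key one value from its buffer without reusing a value across keys", here is a rough greedy sketch (the variable names and the greedy strategy are my own assumptions, and a greedy pass may miss an assignment that a full search would find):

import scala.collection.mutable

val ji = mutable.Map(
  "a" -> mutable.ArrayBuffer("1", "2", "3"),
  "b" -> mutable.ArrayBuffer("1", "2", "3"),
  "c" -> mutable.ArrayBuffer("2", "3")
)

val used = mutable.Set[String]()
val assignment = mutable.Map[String, String]()
for ((key, candidates) <- ji) {
  // pick the first candidate not already claimed by an earlier key
  candidates.find(v => !used(v)).foreach { v =>
    assignment(key) = v
    used += v
  }
}
// e.g. Map(a -> 1, b -> 2, c -> 3); the result depends on the map's iteration order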

When I merge some data, I can't go on...

2016-02-25 Thread Bonsen
I have a file, 1.txt, like: 1 2 1 3 1 4 1 5 1 6 1 7 2 4 2 5 2 7 2 9. I want to merge them, with a result like Map(1 -> List(2,3,4,5,6,7), 2 -> List(4,5,7,9)). What should I do? val file1 = sc.textFile("1.txt"); val q1 = file1.flatMap(_.split(' ')) ??? Maybe I should change RDD[Int] to RDD[(Int, Int)]?
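Assuming one pair per line in 1.txt and whitespace between the two numbers, a sketch that keys each line on its first number and groups the second (the variable names are illustrative):

val file1 = sc.textFile("1.txt")
val merged = file1
  .map(_.trim.split("\\s+"))
  .map(a => (a(0).toInt, a(1).toInt))   // RDD[(Int, Int)] instead of a flat RDD of tokens
  .groupByKey()
  .mapValues(_.toList)
merged.collectAsMap()   // roughly Map(1 -> List(2, 3, 4, 5, 6, 7), 2 -> List(4, 5, 7, 9))

The key point is using map rather than flatMap, so each line becomes a (key, value) pair that groupByKey can merge.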

sql:Exception in thread "main" scala.MatchError: StringType

2016-01-03 Thread Bonsen
(sbt) Scala:
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql
object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf()
    conf.setAppName("mytest").setMaster("spark://Master:7077")
    val sc = new SparkContext(conf)
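The snippet is cut off before the SQL part, but one common source of scala.MatchError: StringType in the 1.x line is a mismatch between a hand-built schema and the actual Row contents. A hedged sketch of the explicit-schema route, where every Row field really is a String to match the declared StringType (the file name, column names, and split character are assumptions):

import org.apache.spark.sql.{Row, SQLContext}
import org.apache.spark.sql.types.{StructType, StructField, StringType}

val sqlContext = new SQLContext(sc)
val rowRDD = sc.textFile("people.txt")   // hypothetical input file
  .map(_.split(","))
  .map(a => Row(a(0), a(1)))             // keep both fields as String
val schema = StructType(Seq(
  StructField("name", StringType, nullable = true),
  StructField("age",  StringType, nullable = true)
))
val df = sqlContext.createDataFrame(rowRDD, schema)
df.registerTempTable("people")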

get parameters of spark-submit

2015-12-21 Thread Bonsen
1. I wrote my Scala class and packaged it (the HDFS file paths are not hard-coded; they should come from spark-submit's parameters). 2. Then, if I submit it like this: ${SPARK_HOME/bin}/spark-submit \ --master \ \ hdfs:// \ hdfs:// \ what should I do to get the two HDFS files' paths in my Scala class's code (before
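The arguments listed after the application jar on the spark-submit command line arrive unchanged as args in main, so the two HDFS paths can be read positionally. A minimal sketch (the object name and the use of the paths are illustrative):

import org.apache.spark.{SparkConf, SparkContext}

object MyJob {
  def main(args: Array[String]): Unit = {
    val inputPath  = args(0)   // first hdfs:// path passed to spark-submit
    val outputPath = args(1)   // second hdfs:// path passed to spark-submit
    val sc = new SparkContext(new SparkConf().setAppName("myjob"))
    sc.textFile(inputPath).saveAsTextFile(outputPath)
    sc.stop()
  }
}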

worker:java.lang.ClassNotFoundException: ttt.test$$anonfun$1

2015-12-14 Thread Bonsen
package ttt
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
object test {
  def main(args: Array[String]) {
    val conf = new SparkConf()
    conf.setAppName("mytest")
      .setMaster("spark://Master:7077")
      .setSparkHome("/usr/local/spark")
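The snippet stops before the RDD work, but a worker-side ClassNotFoundException for ttt.test$$anonfun$1 usually means the application jar was never shipped to the executors, which is typical when the program is launched straight from an IDE against spark://Master:7077. A hedged sketch of the usual fix: either launch through spark-submit, or point setJars at the packaged jar (the jar path below is hypothetical):

import org.apache.spark.{SparkConf, SparkContext}

object test {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("mytest")
      .setMaster("spark://Master:7077")
      .setJars(Seq("/path/to/ttt.jar"))   // hypothetical path to the packaged app jar
    val sc = new SparkContext(conf)
    // ... RDD operations; their closures can now be loaded on the workers
    sc.stop()
  }
}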

Re: Re: HELP! I get "java.lang.String cannot be cast to java.lang.Integer" for a long time.

2015-12-11 Thread Bonsen
Thank you. I found the problem: my package is test, but I wrote package org.apache.spark.examples, and IDEA had imported spark-examples-1.5.2-hadoop2.6.0.jar, so I could run it, and that caused lots of problems. Now, I

Re: HELP! I get "java.lang.String cannot be cast to java.lang.Integer" for a long time.

2015-12-10 Thread Bonsen
I do it like this: val secondData = rawData.flatMap(_.split("\t").take(3)) and I find: 15/12/10 22:36:55 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, 219.216.65.129): java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Integer at
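split produces Strings, so any downstream step that expects Int (a numeric sort, arithmetic, a schema declared with an integer type, and so on) will throw exactly this ClassCastException. A hedged sketch of the usual remedy, converting explicitly and keeping the three fields of each line together (assuming all three fields are numeric):

val secondData = rawData.map { line =>
  line.split("\t").take(3).map(_.toInt)   // Array[Int] per line, not flattened Strings
}

Note also that flatMap in the original flattens the three fields of every line into one long RDD of tokens, which is often not what is wanted either.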