There is a map function in Clojure, so you can map one collection to
another.
The closest Trident operation is *each*; however, when f is applied to an
input tuple, we get a tuple with the output field appended: *f([field-a]) =
[field-a field-b]*.
How can I achieve the same map operation on a Trident stream?
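To make the asked-about difference concrete, here is a plain-Java sketch (not the real Trident API; tuples are modeled as ordered maps, and the field names are the ones from the question) of how *each* appends its output fields to the input tuple, while a true *map* would replace the tuple entirely:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

public class EachVsMap {
    // A "tuple" is modeled as an ordered field->value map -- a hypothetical
    // stand-in for Trident's TridentTuple, not the real class.
    static Map<String, Object> each(Map<String, Object> tuple,
                                    Function<Object, Object> f) {
        // Trident's each APPENDS the function's output fields to the input tuple.
        Map<String, Object> out = new LinkedHashMap<>(tuple);
        out.put("field-b", f.apply(tuple.get("field-a")));
        return out;
    }

    static Map<String, Object> map(Map<String, Object> tuple,
                                   Function<Object, Object> f) {
        // A true map REPLACES the tuple with the function's output.
        Map<String, Object> out = new LinkedHashMap<>();
        out.put("field-b", f.apply(tuple.get("field-a")));
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> in = new LinkedHashMap<>();
        in.put("field-a", 21);
        Function<Object, Object> dbl = v -> (Integer) v * 2;
        System.out.println(each(in, dbl)); // {field-a=21, field-b=42}
        System.out.println(map(in, dbl));  // {field-b=42}
    }
}
```

Storm 1.0+ added a map operation directly on Trident streams; on earlier versions, a common workaround is each(...) followed by project(...) to drop the input fields and keep only the function's output.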
to manually grapple with ClassTag from Java, for example.
There is no implicit conversion, since it is used from Java, which
doesn't have implicits.
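The same constraint is visible in plain Java: with type erasure, a generic method cannot create a T[] at runtime unless the class is handed in explicitly, which is the job a ClassTag does in Scala (supplied there automatically by implicits, which Java lacks). A minimal illustration, not Spark code:

```java
import java.lang.reflect.Array;
import java.util.Arrays;

public class ErasureDemo {
    // "new T[n]" is illegal in Java because T is erased at runtime; the
    // runtime class must be passed explicitly -- the role ClassTag plays
    // in Scala APIs like JavaPairRDD.fromRDD.
    static <T> T[] newArray(Class<T> cls, int n) {
        @SuppressWarnings("unchecked")
        T[] arr = (T[]) Array.newInstance(cls, n);
        return arr;
    }

    public static void main(String[] args) {
        String[] a = newArray(String.class, 3);
        Arrays.fill(a, "x");
        System.out.println(Arrays.toString(a)); // [x, x, x]
    }
}
```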
On Fri, Feb 13, 2015 at 5:57 AM, Vladimir Protsenko
protsenk...@gmail.com wrote:
Thanks for the reply. I solved the problem with importing
-4397)
On Thu, Feb 12, 2015 at 9:36 AM, Vladimir Protsenko protsenk...@gmail.com
wrote:
Hi. I am stuck with how to save a file to HDFS from Spark.
I have written MyOutputFormat extends FileOutputFormat[String, MyObject],
then in Spark I call:
rddres.saveAsHadoopFile[MyOutputFormat](hdfs
GMT+04:00 Ted Yu yuzhih...@gmail.com:
You can use JavaPairRDD which has:
override def wrapRDD(rdd: RDD[(K, V)]): JavaPairRDD[K, V] =
JavaPairRDD.fromRDD(rdd)
Cheers
On Thu, Feb 12, 2015 at 7:36 AM, Vladimir Protsenko protsenk...@gmail.com
wrote:
Hi. I am stuck with how to save a file to HDFS from Spark.
I have written MyOutputFormat extends FileOutputFormat[String, MyObject],
then in Spark I call:
rddres.saveAsHadoopFile[MyOutputFormat]("hdfs://localhost/output") or
rddres.saveAsHadoopFile("hdfs://localhost/output", classOf[String],
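To make the roles concrete, here is a self-contained Java sketch of the OutputFormat/RecordWriter contract that a FileOutputFormat subclass fills in. The interfaces below are hypothetical stand-ins, not the real org.apache.hadoop.mapred classes; the point is only that the format supplies a writer that decides how each (key, value) pair is serialized to the output file:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MiniOutputFormat {
    // Hypothetical stand-ins mirroring Hadoop's OutputFormat/RecordWriter shape.
    interface RecordWriter<K, V> {
        void write(K key, V value) throws IOException;
        void close() throws IOException;
    }
    interface OutputFormat<K, V> {
        RecordWriter<K, V> getRecordWriter(Path dir) throws IOException;
    }

    // A custom format decides how each (key, value) pair is serialized --
    // the part "MyOutputFormat extends FileOutputFormat" is responsible for.
    static class TextFormat implements OutputFormat<String, String> {
        public RecordWriter<String, String> getRecordWriter(Path dir) throws IOException {
            Files.createDirectories(dir);
            BufferedWriter w = Files.newBufferedWriter(dir.resolve("part-00000"));
            return new RecordWriter<String, String>() {
                public void write(String k, String v) throws IOException {
                    w.write(k + "\t" + v + "\n"); // tab-separated pair per line
                }
                public void close() throws IOException { w.close(); }
            };
        }
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("out");
        RecordWriter<String, String> w = new TextFormat().getRecordWriter(dir);
        w.write("k1", "v1");
        w.close();
        System.out.print(Files.readString(dir.resolve("part-00000"))); // prints the tab-separated pair
    }
}
```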
it, as it seems spark-shell just calls spark-shell2 anyway...
On Thu, Jan 22, 2015 at 3:16 AM, Vladimir Protsenko
protsenk...@gmail.com wrote:
I have a problem running the Spark shell on Windows 7. I took the following
steps:
1. downloaded and installed Scala 2.11.5
2. downloaded spark 1.2.0 by git clone git://github.com/apache/spark.git
3. ran dev/change-version-to-2.11.sh and mvn -Dscala-2.11 -DskipTests clean
package (in Git Bash)
, that I wrongly sent only to Sean. I have tried export
MAVEN_OPTS=`-Xmx=3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g` and it
doesn't work either.
Best Regards,
Vladimir Protsenko
2014-12-23 19:45 GMT+04:00 Guru Medasani gdm...@outlook.com:
Thanks for the clarification Sean.
Best Regards,
Guru
Thanks. Bad mistake.
2014-12-24 14:02 GMT+04:00 Sean Owen so...@cloudera.com:
That command is still wrong. It is -Xmx3g with no =.
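As Sean points out, the heap flag takes its size directly after the flag with no equals sign (unlike -XX: options, which do use =). A sketch of a working export follows; the sizes are illustrative, and note that on Java 8+ the -XX:MaxPermSize flag is ignored since PermGen was removed, which fits the later report that Java 8 fixed the build:

```shell
# -Xmx takes its size immediately after the flag: -Xmx3g, not -Xmx=3g.
export MAVEN_OPTS="-Xmx3g -XX:ReservedCodeCacheSize=512m"
echo "$MAVEN_OPTS"
```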
On Dec 24, 2014 9:50 AM, Vladimir Protsenko protsenk...@gmail.com
wrote:
The 64-bit Java 8 rpm downloaded from the official Oracle site solved my problem.
And I need
I am installing Spark 1.2.0 on CentOS 6.6. I just downloaded the code from
GitHub, installed Maven, and am trying to compile:
git clone https://github.com/apache/spark.git
git checkout v1.2.0
mvn -DskipTests clean package
leads to an OutOfMemoryException. How much memory does it require?