>> I'll try to cook up some examples of this today, threaded and not. We
>> were hoping that someone had seen this before and it rang a bell. Maybe
>> there's a setting to clean up info from old jobs that we can adjust.
>>
>> Cheers,
>>
>> Keith.
>>
>
> The job processes each file in a thread, and we have 10 threads running
> concurrently. The process will OOM after about 4 hours, at which point
> Spark has processed over 20,000 jobs.
>
> It seems like the driver is running out of memory, but each individual job
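The pattern described above (one file per job, ten concurrent submitter threads) can be sketched with the standard library alone. This is a hypothetical stand-in: `process_file` is a placeholder for the real per-file Spark job, since submitting jobs from multiple threads against one shared SparkContext is supported.

```python
# Sketch of the threaded-submission pattern from the thread above.
# process_file is hypothetical; in the real application each call
# would run a Spark action (e.g. sc.textFile(path)...count()) on a
# shared, thread-safe SparkContext.
from concurrent.futures import ThreadPoolExecutor

def process_file(path):
    # Placeholder for the per-file Spark job.
    return len(path)

paths = [f"/data/input-{i}.txt" for i in range(100)]

# 10 worker threads submit jobs concurrently, mirroring the setup
# described in the thread (each completed call = one Spark job).
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(process_file, paths))

print(len(results))
```

On the "setting to clean up info from old jobs" question: the driver does retain per-job and per-stage metadata for the UI, and `spark.ui.retainedJobs` / `spark.ui.retainedStages` cap how many completed jobs and stages it keeps, so lowering them is the usual first thing to try for a slowly growing driver heap after tens of thousands of jobs.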
Hi,
I am running a Spark standalone cluster with 2 masters: one active, the
other in standby. An application is running on this cluster.
When the active master dies, the standby master becomes active and the
running application reconnects to the newly active master.
The only problem I see is
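For reference, active/standby masters in standalone mode are normally set up with ZooKeeper-based recovery. A minimal sketch of that configuration, assuming it goes in spark-env.sh on both master machines; the ZooKeeper ensemble address is a placeholder:

```shell
# spark-env.sh on BOTH masters -- ZooKeeper-based standby recovery.
# zk1:2181,zk2:2181,zk3:2181 is a placeholder for your ensemble.
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"
```

With this in place, applications and workers registered with the failed master re-register with the newly elected one, which matches the failover behaviour described above.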
> can use Spark SQL; transferring DataFrames back and forth between
> Python and Scala can be much easier.
>
>
> On Monday, September 12, 2016, Alexis Seigneurin <aseigneu...@ippon.fr>
> wrote:
Hi,
*TL;DR - I have what looks like a DStream of Strings in a PySpark
application. I want to send it as a DStream[String] to a Scala library.
Strings are not converted by Py4j, though.*
I'm working on a PySpark application that pulls data from Kafka using Spark
Streaming. My messages are
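A sketch of what crossing the Py4j boundary looks like. This is untested and needs a live SparkContext/StreamingContext; `com.example.MyScalaLib` and its `process` method are hypothetical names standing in for the Scala library:

```python
# Untested sketch -- requires a running Spark installation.
# com.example.MyScalaLib / process() are hypothetical names.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="py-to-scala-dstream")
ssc = StreamingContext(sc, batchDuration=5)

lines = ssc.socketTextStream("localhost", 9999)

# _jdstream exposes the underlying JavaDStream to Py4j. Note that the
# elements it carries are the *pickled* Python objects (byte arrays),
# not java.lang.String -- Py4j only converts strings passed as plain
# method arguments, which is likely why the DStream's strings "are not
# converted". The Scala side would have to unpickle each element.
scala_lib = sc._jvm.com.example.MyScalaLib
scala_lib.process(lines._jdstream)

ssc.start()
ssc.awaitTermination()
```

This also explains why the reply above suggests going through Spark SQL instead: a DataFrame registered from Python is backed by JVM rows directly, so no unpickling is needed on the Scala side.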
/702aa9d7fb16c98a50e046edfd76b8a7861d0391/sql/core/src/main/scala/org/apache/spark/sql/execution/sort.scala#L125
On Wed, Aug 5, 2015 at 9:25 AM, Alexis Seigneurin aseigneu...@ippon.fr
wrote:
Hi,
I'm receiving a memory allocation error with a recent build of Spark 1.5:
java.io.IOException: Unable to acquire 67108864 bytes of memory
at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.acquireNewPageIfNecessary(UnsafeExternalSorter.java:348)
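For context, 67108864 bytes is 64 MB, the default page size the unsafe external sorter tries to allocate. A hedged sketch of workarounds commonly suggested for this error on the 1.5 line (the exact conf names and values below are what I believe applied to 1.5, not verified against this build):

```shell
# 67108864 bytes = 64 MB, the default unsafe-sort page size.
# Lowering the page size and raising the shuffle memory fraction
# were common workarounds on Spark 1.5; myapp.jar is a placeholder.
spark-submit \
  --conf spark.buffer.pageSize=16m \
  --conf spark.shuffle.memoryFraction=0.5 \
  --class com.example.MyApp myapp.jar
```

Increasing `spark.sql.shuffle.partitions` so each sort task handles less data is another way to keep individual allocations under the available execution memory.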
Hi,
I know there are not so many conferences on Spark in Paris, so I just
wanted to let you know you that Ippon will be holding one on Thursday next
week (11th of December):
http://blog.ippon.fr/2014/12/03/ippevent-spark-ou-comment-traiter-des-donnees-a-la-vitesse-de-leclair/
There will be 3