help!!
Thanks,
Harshvardhan Chauhan | Software Engineer
GumGum | Ads that stick
310-260-9666 | ha...@gumgum.com
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/S3-SubFolder-Write-Issues-tp21997.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>>
>
> -
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
>
> Thanks
> Best Regards
>
> On Tue, Feb 17, 2015 at 3:39 AM, Harshvardhan Chauhan wrote:
>
>> Hi All,
>>
>>
>> I need some help with Out Of Memory errors in my application. I am using
>> Spark 1.1.0 and my application is using Java
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.j
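The thread is truncated here, so the actual fix isn't shown. For executor OOM errors like the one in the stack trace above, a common first step in Spark 1.x (a sketch only — the class name, jar name, and values below are illustrative, not taken from this thread) is to raise executor heap and shrink the RDD cache fraction when submitting:

```shell
# Illustrative values, not the thread's actual configuration.
# --executor-memory raises each executor JVM's heap;
# spark.storage.memoryFraction (a Spark 1.x setting, removed in 2.x)
# reduces the share of heap reserved for cached RDDs, leaving more
# working memory for task execution.
spark-submit \
  --class com.example.MyApp \
  --master yarn-cluster \
  --executor-memory 4g \
  --driver-memory 2g \
  --conf spark.storage.memoryFraction=0.3 \
  my-app.jar
```

If the OOM persists, increasing the number of partitions (so each task processes less data at once) is usually the next knob to try.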