Hi,

That got resolved. It was more of a spark-mongodb issue with Spark 1.3 than a
Zeppelin one. Thank you.

On Thu, May 21, 2015 at 4:52 AM, moon soo Lee <m...@apache.org> wrote:

> Hi
>
> I don't have much experience with how spark-mongodb works, but it looks
> like the Spark worker is trying to access
> http://<mysystemurl>/jars/Query-1.0.jar and <mysystemurl> is returning 504.
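>
> (If I understand correctly, jars added to the SparkContext, which is what
> %dep effectively does, are served to the executors from the driver's HTTP
> file server in Spark 1.x; that is where a URL of the form
> http://<driver>/jars/Query-1.0.jar comes from, and the 504 means the
> executor cannot fetch the jar from it. A minimal sketch of adding such a
> jar directly, with a placeholder path, looks like this:)
>
> // placeholder path, not the real location of the jar; once added, the
> // executors fetch it back from the driver at .../jars/Query-1.0.jar
> sc.addJar("/path/to/Query-1.0.jar")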
>
> Could you explain a little bit more about what Query-1.0.jar is and how it
> is added in your code?
>
> Thanks,
> moon
>
> On Mon, May 11, 2015 at 3:40 PM Roopa Prabhu <rooparu...@gmail.com> wrote:
>
>> Here's my code (the required jar was loaded using %dep; a sketch of that
>> step is shown right after the snippet):
>>
>> import sys.process._
>> import test.{testData1,QueryData}
>> import com.stratio.deep.mongodb._
>>
>> QueryData.statusCheck()                      // goes through fine
>> val readConfig = QueryData.runMongoJob()     // goes through fine
>> val rdd = sqlContext.fromMongoDB(readConfig) // fails
>> rdd.registerTempTable("metadata")
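>>
>> (The %dep paragraph isn't shown above; it was along these lines, with the
>> path below being just a placeholder rather than the actual location of the
>> jar:)
>>
>> %dep
>> // clear previously loaded dependencies, then load the jar into the
>> // interpreter / Spark classpath
>> z.reset()
>> z.load("/path/to/Query-1.0.jar")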
>>
>> "val rdd = sqlContext.fromMongoDB(readConfig)"
>> OR
>> QueryData.OtherMethod(sc, sqlContext):
>> Any of the above two attempts with sc/sqlContext throws the following
>> exception:
>>
>> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
>> in stage 9.0 failed 1 times, most recent failure: Lost task 0.0 in stage
>> 9.0 (TID 4, localhost): java.io.IOException: Server returned HTTP response
>> code: 504 for URL: http://<mysystemurl>/jars/Query-1.0.jar
>>
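>> For what it's worth, since sys.process._ is already imported, the jar URL
>> from the trace can be checked from the notebook (with <mysystemurl>
>> replaced by the real host):
>>
>> // prints the HTTP status returned for the jar URL the executor tries to fetch
>> "curl -sI http://<mysystemurl>/jars/Query-1.0.jar".!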
>>
>>
