From: Felix Cheung
To: Ankur Srivastava
Cc: user@spark.apache.org
Subject: Re: Spark GraphFrame ConnectedComponents

Could you by chance run just the delete to see if it fails?

FileSystem.get(sc.hadoopConfiguration)
  .delete(new Path(somepath), true)
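A note on the snippet above: FileSystem.get(conf) with no URI resolves the cluster's default filesystem, so a variation that resolves the filesystem from the path itself may also be worth trying. This is an untested sketch, assuming `sc` and `somepath` as in the snippet above:

```scala
// Untested sketch: resolve the FileSystem from the path's own URI scheme
// (e.g. s3n://...) instead of fs.defaultFS. Assumes `sc` and `somepath`
// exist as in the snippet above.
import org.apache.hadoop.fs.Path

val p = new Path(somepath)
val fs = p.getFileSystem(sc.hadoopConfiguration) // picks the FS matching p's scheme
fs.delete(p, true)                               // true = recursive delete
```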
From: Ankur Srivastava
Sent: Thursday, January 5, 2017 3:45:59 PM
To: Felix Cheung; d...@spark.apache.org
Cc: user@spark.apache.org
Subject: Re: Spark GraphFrame ConnectedComponents

Adding the DEV mailing list to see if this is a defect with ConnectedComponents
or if they can recommend any solution.

Thanks
From: Felix Cheung
To: Ankur Srivastava
Cc: user@spark.apache.org
Subject: Re: Spark GraphFrame ConnectedComponents

Is the URL scheme for s3n registered?
Does it work when you try to read from s3 from Spark?
From: Ankur Srivastava
Sent: Thursday, January 5, 2017 10:05:03 AM
To: Felix Cheung
Cc: user@spark.apache.org
Subject: Re: Spark GraphFrame ConnectedComponents

Yes, it works to read the vertices and edges data from the S3 location, and it
is also able to write the checkpoint files to S3. It only fails when deleting
the data, and that is because it tries to use the default file system.
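The resolution rule described here can be sketched in plain Scala with java.net.URI. This is an illustration only; the helper name and the "hdfs" default are ours, standing in for Hadoop's scheme-based FileSystem lookup:

```scala
import java.net.URI

// Illustration only: Hadoop chooses a FileSystem implementation by the
// path's URI scheme, falling back to the configured default filesystem
// (fs.defaultFS) when the path has no scheme. Helper name is ours.
def schemeOf(path: String, default: String = "hdfs"): String =
  Option(new URI(path).getScheme).getOrElse(default)

println(schemeOf("s3n://bucket/checkpoints")) // s3n
println(schemeOf("/tmp/checkpoints"))         // hdfs (the default)
```

A path with no scheme silently goes to the default filesystem, which is why a delete against an s3n:// checkpoint can end up hitting the wrong FileSystem implementation.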
From: Ankur Srivastava
Sent: Wednesday, January 4, 2017 9:23 PM
Subject: Re: Spark GraphFrame ConnectedComponents
To: Felix Cheung
Cc: user@spark.apache.org

This is the exact trace from the driver logs:

Exception in thread "main" java.lang.IllegalArgumentException
      pointInterval}"), true)
  }

  System.gc() // hint Spark to clean shuffle directories
}

Thanks
Ankur

On Wed, Jan 4, 2017 at 5:19 PM, Felix Cheung wrote:
> Do you have more of the exception stack?
From: Felix Cheung
To: Ankur Srivastava

Do you have more of the exception stack?

From: Ankur Srivastava
Sent: Wednesday, January 4, 2017 4:40:02 PM
To: user@spark.apache.org
Subject: Spark GraphFrame ConnectedComponents

Hi,

I am trying to use the ConnectedComponent algorithm of GraphFrames, but by
default it needs a checkpoint directory. As I am running my Spark cluster with
S3 as the DFS and do not have access to an HDFS file system, I tried using an
s3 directory as the checkpoint directory, but I run into the below exception:
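For background, the checkpoint directory that the connected-components run requires is set on the SparkContext before invoking the algorithm. A minimal sketch follows; the bucket and path are hypothetical, and it assumes a live SparkSession `spark` and a GraphFrame `g`:

```scala
// Hypothetical sketch: requires a running Spark cluster with GraphFrames.
// Bucket/path names are made up; `spark` and `g` are assumed to exist.
spark.sparkContext.setCheckpointDir("s3n://my-bucket/spark-checkpoints")

val components = g.connectedComponents.run() // vertices with a `component` id column
```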