Any help regarding this issue, please?

Regards,
Arpit

On Sat, Oct 25, 2014 at 8:56 AM, Arpit Kumar <arp8...@gmail.com> wrote:

> Hi all,
> I am using the GraphLoader class to load graphs from edge list files, but
> then I need to change the storage level of the graph to something other
> than MEMORY_ONLY.
>
> val graph = GraphLoader.edgeListFile(sc, fname, minEdgePartitions = numEPart)
>   .persist(StorageLevel.MEMORY_AND_DISK_SER)
>
> The error I am getting while executing this is:
> Exception in thread "main" java.lang.UnsupportedOperationException: Cannot
> change storage level of an RDD after it was already assigned a level
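>
> For reference, the restriction comes from RDD.persist itself, and
> GraphLoader.edgeListFile calls .cache() on the RDDs it builds, so any later
> persist() with a different level fails. A minimal sketch reproducing the same
> exception with a plain RDD (assuming an existing SparkContext sc):
>
> import org.apache.spark.storage.StorageLevel
>
> // cache() assigns MEMORY_ONLY; a second persist() with a different level
> // then throws "Cannot change storage level of an RDD ..."
> val rdd = sc.parallelize(1 to 10)
> rdd.cache()
> rdd.persist(StorageLevel.MEMORY_AND_DISK_SER)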
>
>
> Then I looked into the GraphLoader class. I know that in the latest
> version of Spark, support for setting the persistence level is provided in
> this class. Please suggest a workaround for Spark 1.0.0, as I do not have
> the option to move to the latest release.
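>
> One workaround I am considering (just a sketch, and I am not sure whether the
> internal vertex/edge RDDs in 1.0.0 still get cached with MEMORY_ONLY) is to skip
> GraphLoader entirely and build the graph from the parsed edge list with the
> public Graph.fromEdgeTuples API, so that at least the parsed edge RDD uses the
> level I want:
>
> import org.apache.spark.SparkContext
> import org.apache.spark.graphx._
> import org.apache.spark.storage.StorageLevel
>
> // fname and numEPart are the same values passed to GraphLoader.edgeListFile above.
> def loadEdgeListGraph(sc: SparkContext, fname: String, numEPart: Int): Graph[Int, Int] = {
>   val edgeTuples = sc.textFile(fname, numEPart)
>     .filter(line => line.nonEmpty && line(0) != '#')   // skip comment lines, like GraphLoader
>     .map { line =>
>       val fields = line.split("\\s+")
>       (fields(0).toLong, fields(1).toLong)             // (srcId, dstId)
>     }
>     .persist(StorageLevel.MEMORY_AND_DISK_SER)         // the storage level I control
>   Graph.fromEdgeTuples(edgeTuples, defaultValue = 1)
> }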
>
> Note: I tried copying the GraphLoader class into my own package as
> GraphLoader1, importing
>
> package com.cloudera.xyz
>
> import org.apache.spark.storage.StorageLevel
> import org.apache.spark.graphx._
> import org.apache.spark.{Logging, SparkContext}
> import org.apache.spark.graphx.impl._
>
> and then changing the persistence call to .persist(gStorageLevel) instead
> of .cache().
>
> But while compiling I get the following error:
>
> GraphLoader1.scala:49: error: class EdgePartitionBuilder in package impl
> cannot be accessed in package org.apache.spark.graphx.impl
> [INFO]       val builder = new EdgePartitionBuilder[Int, Int]
>
> I am also attaching the file with this mail. Maybe this way of doing
> things is not possible.
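>
> I believe EdgePartitionBuilder is marked private[graphx] in 1.0.0, so one
> possible fix (sketch only) would be to declare the copied loader inside the
> org.apache.spark.graphx package instead of com.cloudera.xyz, so that the
> package-private classes in graphx.impl become visible to it:
>
> package org.apache.spark.graphx
>
> import org.apache.spark.storage.StorageLevel
> import org.apache.spark.{Logging, SparkContext}
> import org.apache.spark.graphx.impl._
>
> // Body copied from GraphLoader, with .persist(gStorageLevel) in place of .cache();
> // being inside the graphx package is what should make EdgePartitionBuilder accessible.
> object GraphLoader1 extends Logging {
>   // ... copied edgeListFile implementation goes here ...
> }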
>
>
> Please suggest some workaround so that I can set the persistence level to
> MEMORY_AND_DISK_SER for the graph I read from an edge list file.
>



-- 
Arpit Kumar
Fourth Year Undergraduate
Department of Computer Science and Engineering
Indian Institute of Technology, Kharagpur
