What's the relationship between the TensorFlowOnSpark core modules?
Hi,

After reading the mnist example and the API of TensorFlowOnSpark, I am somewhat confused. Here are my questions:

1. What is the relationship between the TFCluster, TFManager, TFNode, and TFSparkNode modules?
2. The conversion guide says we should replace the main function with a main_fun, but the examples actually define a map function; are they the same thing?
3. A map_function parameter question: when the map_function is called, argv is sys.argv, but what about the ctx parameter? What is the type of ctx, and how is its value assigned? I can see the ctx object has various properties and even functions.
4. The conversion guide says there should be a step 3 and a step 4, but the mnist example does not do these things.
5. By the way, where can I post questions about TFoS? I cannot join the official Google group. I am sorry if I posted this in the wrong place.
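Regarding question 3, the calling convention can be sketched in plain Python. This is only an illustration: the real ctx is a node-context object that TensorFlowOnSpark builds on each executor and passes to your function, carrying attributes such as job_name and task_index; the SimpleNamespace below is a hypothetical stand-in, not the framework's actual class.

```python
from types import SimpleNamespace

def map_fun(argv, ctx):
    """A worker function in the style TFoS expects.

    argv: the command-line arguments forwarded to every executor.
    ctx:  a per-node context object injected by the framework, with
          fields such as job_name ("ps" or "worker") and task_index.
    """
    role = "{}:{}".format(ctx.job_name, ctx.task_index)
    if ctx.job_name == "ps":
        return role + " would join the parameter-server loop"
    return role + " would run the training loop over argv=" + repr(argv)

# Stand-in context: the real object is constructed per executor by
# the framework; SimpleNamespace only mimics its attribute access.
fake_ctx = SimpleNamespace(job_name="worker", task_index=0)
print(map_fun(["--epochs", "1"], fake_ctx))
```

The point is that you never construct ctx yourself; the framework calls your function once per executor with a context describing that node's role in the cluster.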
Re: Does Pyspark Support Graphx?
When using the --jars option, we have to include it every time we submit a job; it seems that adding the jars to the classpath on every slave node is the only way to "install" Spark packages.

-- Original --
From: Nicholas Hakobian
Date: Tue, Feb 20, 2018 3:37 AM
To: xiaobo
Cc: Denny Lee, user@spark.apache.org
Subject: Re: Does Pyspark Support Graphx?

If you copy the jar file and all of its dependencies to the machines, you can manually add them to the classpath. If you are using YARN and HDFS, you can alternatively use --jars and point it to the HDFS locations of the jar files, and it will (in most cases) distribute them to the worker nodes at job submission time.

Nicholas Szandor Hakobian, Ph.D.
Staff Data Scientist
Rally Health
nicholas.hakob...@rallyhealth.com

On Sun, Feb 18, 2018 at 7:24 PM, xiaobo wrote:
Another question is how to install graphframes permanently when the Spark nodes cannot connect to the internet.

-- Original --
From: Denny Lee
Date: Mon, Feb 19, 2018 10:23 AM
To: xiaobo
Cc: user@spark.apache.org
Subject: Re: Does Pyspark Support Graphx?

Note the --packages option works for both PySpark and Spark (Scala). For the SparkLauncher class, you should be able to include packages like so:

spark.addSparkArg("--packages", "graphframes:0.5.0-spark2.0-s_2.11")

On Sun, Feb 18, 2018 at 3:30 PM xiaobo wrote:
Hi Denny,
The pyspark script uses the --packages option to load the graphframes library; what about the SparkLauncher class?

-- Original --
From: Denny Lee
Date: Sun, Feb 18, 2018 11:07 AM
To: 94035420
Cc: user@spark.apache.org
Subject: Re: Does Pyspark Support Graphx?

That's correct - you can use GraphFrames though, as it does support PySpark.

On Sat, Feb 17, 2018 at 17:36 94035420 wrote:
I can not find anything for the graphx module in the Python API document; does it mean it is not supported yet?
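The two submit-time options discussed in this thread (--jars with local or hdfs:// paths, --packages with Maven coordinates) can be sketched as a small command builder. This is only an illustration of how the flags combine; the application name and jar path below are hypothetical.

```python
def build_submit_cmd(app, jars=None, packages=None):
    """Assemble a spark-submit argv list.

    jars:     local or hdfs:// jar paths, shipped to workers at
              submission time (the --jars route discussed above).
    packages: Maven coordinates, resolved from a repository, so
              they need network access unless already cached.
    """
    cmd = ["spark-submit"]
    if jars:
        cmd += ["--jars", ",".join(jars)]
    if packages:
        cmd += ["--packages", ",".join(packages)]
    cmd.append(app)
    return cmd

# Hypothetical paths, for illustration only.
cmd = build_submit_cmd("my_graph_job.py",
                       jars=["hdfs:///libs/graphframes.jar"])
print(" ".join(cmd))
```

For a truly "permanent" install with no network access, the first message's approach applies: copy the jars onto every node's classpath once, instead of passing them at each submission.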
Re: [graphframes]how Graphframes Deal With Bidirectional Relationships
So the question becomes: does graphframes support bidirectional relationships natively, with only one edge?

-- Original --
From: Felix Cheung
Date: Tue, Feb 20, 2018 10:01 AM
To: xiaobo, user@spark.apache.org
Subject: Re: [graphframes]how Graphframes Deal With Bidirectional Relationships

Generally that would be the approach. But since you have effectively doubled the number of edges, this will likely affect the scale at which your job will run.

From: xiaobo
Sent: Monday, February 19, 2018 3:22:02 AM
To: user@spark.apache.org
Subject: [graphframes]how Graphframes Deal With Bidirectional Relationships

Hi,
To represent a bidirectional relationship, one solution is to insert two edges for the vertex pair. My question is: do the GraphFrames algorithms still work when we do this?
Thanks
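The edge-doubling approach discussed above can be sketched with plain Python tuples standing in for an edges DataFrame (with GraphFrames you would analogously union the edges with a src/dst-swapped copy of themselves):

```python
def undirect(edges):
    """Given directed edges as (src, dst) tuples, return an edge set
    containing both directions of every pair, deduplicated -- the
    'two edges per relationship' encoding discussed in this thread."""
    both = set()
    for src, dst in edges:
        both.add((src, dst))
        both.add((dst, src))
    return sorted(both)

edges = [("a", "b"), ("b", "c")]
print(undirect(edges))
# -> [('a', 'b'), ('b', 'a'), ('b', 'c'), ('c', 'b')]
```

As Felix notes, this doubles the edge count, so memory and runtime grow accordingly on large graphs.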
Re: Does Pyspark Support Graphx?
Another question is how to install graphframes permanently when the Spark nodes cannot connect to the internet.

-- Original --
From: Denny Lee
Date: Mon, Feb 19, 2018 10:23 AM
To: xiaobo
Cc: user@spark.apache.org
Subject: Re: Does Pyspark Support Graphx?

Note the --packages option works for both PySpark and Spark (Scala). For the SparkLauncher class, you should be able to include packages like so:

spark.addSparkArg("--packages", "graphframes:0.5.0-spark2.0-s_2.11")

On Sun, Feb 18, 2018 at 3:30 PM xiaobo wrote:
Hi Denny,
The pyspark script uses the --packages option to load the graphframes library; what about the SparkLauncher class?

-- Original --
From: Denny Lee
Date: Sun, Feb 18, 2018 11:07 AM
To: 94035420
Cc: user@spark.apache.org
Subject: Re: Does Pyspark Support Graphx?

That's correct - you can use GraphFrames though, as it does support PySpark.

On Sat, Feb 17, 2018 at 17:36 94035420 wrote:
I can not find anything for the graphx module in the Python API document; does it mean it is not supported yet?
[graphframes]how Graphframes Deal With Bidirectional Relationships
Hi,
To represent a bidirectional relationship, one solution is to insert two edges for the vertex pair. My question is: do the GraphFrames algorithms still work when we do this?
Thanks
Re: Does Pyspark Support Graphx?
Hi Denny,
The pyspark script uses the --packages option to load the graphframes library; what about the SparkLauncher class?

-- Original --
From: Denny Lee
Date: Sun, Feb 18, 2018 11:07 AM
To: 94035420
Cc: user@spark.apache.org
Subject: Re: Does Pyspark Support Graphx?

That's correct - you can use GraphFrames though, as it does support PySpark.

On Sat, Feb 17, 2018 at 17:36 94035420 wrote:
I can not find anything for the graphx module in the Python API document; does it mean it is not supported yet?
Can A Precompiled Stand-Alone Python Application Be Submitted To A Spark Cluster?
Hi,
To protect the IP of the software we distribute to customers, one solution is to use precompiled Python scripts, but we are wondering whether this is a feature supported by PySpark.
Thanks.
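For reference, CPython can precompile a script to bytecode with the stdlib py_compile module, as sketched below. Two caveats: .pyc bytecode is easily decompiled, so it is only weak IP protection, and whether spark-submit accepts a .pyc directly (rather than .py/.zip/.egg) should be verified against your Spark version; this snippet only demonstrates the compilation step itself.

```python
import os
import py_compile
import tempfile

# Write a small throwaway script, then compile it to CPython
# bytecode (.pyc) placed next to the source.
src_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "job.py")
with open(src, "w") as f:
    f.write("print('hello from bytecode')\n")

pyc = py_compile.compile(src, cfile=src + "c")  # -> .../job.pyc
print(pyc, os.path.exists(pyc))
```

The .pyc can then be shipped instead of the source, subject to the caveats above.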
Re: Does Pyspark Support Graphx?
Thanks Denny, will it be supported in the near future?

-- Original --
From: Denny Lee
Date: Sun, Feb 18, 2018 11:05 AM
To: 94035420
Cc: user@spark.apache.org
Subject: Re: Does Pyspark Support Graphx?

That's correct - you can use GraphFrames though, as it does support PySpark.

On Sat, Feb 17, 2018 at 17:36 94035420 wrote:
I can not find anything for the graphx module in the Python API document; does it mean it is not supported yet?