Ok, thanks, just wanted to make sure I wasn't missing something
obvious. I've worked with Neo4j Cypher as well, where it was rather
more obvious.
e.g. http://neo4j.com/docs/milestone/query-match.html#_shortest_path
http://neo4j.com/docs/stable/cypher-refcard/
Dino.
On 6 October 2015 at 06:43, Ro
Ah thanks, got it working with that.
e.g.
// ShortestPaths gives each vertex a map of distances to the landmarks;
// dest is reachable from src iff src's map contains it.
val (_, smap) = shortest.vertices.filter(_._1 == src).first
smap.contains(dest)
Is there anything a little less eager?
i.e. something that doesn't compute all the distances from all source nodes, where I
can supply the source vertex id and the dest vertex id, and just get a yes/no answer?
e.g.
http://gremlindocs.spmallette.documentup.com/#finding-edges-between-vertices
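For what it's worth, a less eager option I can think of is a single-source flood with graph.pregel, which only propagates a "reached" flag out of the chosen source instead of computing distances from every vertex. Rough sketch only, assuming an already-built Graph[VD, ED]; isReachable is just an illustrative helper name, not an existing GraphX API:

import scala.reflect.ClassTag
import org.apache.spark.graphx._

// Flood a boolean "reached" flag outward from src; Pregel stops once no new vertex is reached.
def isReachable[VD: ClassTag, ED: ClassTag](graph: Graph[VD, ED],
                                            src: VertexId, dest: VertexId): Boolean = {
  val reached = graph
    .mapVertices((id, _) => id == src)          // true only at the source vertex
    .pregel(initialMsg = false)(
      (_, r, msg) => r || msg,                  // vertex program: absorb an incoming flag
      t => if (t.srcAttr && !t.dstAttr) Iterator((t.dstId, true)) else Iterator.empty,
      (a, b) => a || b)                         // merge messages
  reached.vertices.filter { case (id, r) => id == dest && r }.count() > 0
}

It still touches the whole graph in the worst case, but it only does the work of the flood itself and answers just the one src/dest question.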
Is there an existing API to see if 2 nodes in a graph are connected?
e.g. a->b, b->c, c->d
can I get to d, starting from a? (yes I hope!)
I'm not asking for the route, just want to know if there is a route.
Thanks.
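In case it helps anyone reading this in the archive, here is a minimal sketch of the ShortestPaths approach from the reply above, applied to this exact example in spark-shell (sc available); the numeric ids 1..4 standing in for a..d are purely illustrative:

import org.apache.spark.graphx._
import org.apache.spark.graphx.lib.ShortestPaths

// Toy graph a->b, b->c, c->d with illustrative ids 1=a, 2=b, 3=c, 4=d.
val edges = sc.parallelize(Seq(Edge(1L, 2L, 1), Edge(2L, 3L, 1), Edge(3L, 4L, 1)))
val graph = Graph.fromEdges(edges, defaultValue = 0)
// Shortest paths to the single landmark 4 ("d"); vertex 1 ("a") can reach d
// iff its distance map contains the landmark.
val shortest = ShortestPaths.run(graph, Seq(4L))
val aReachesD = shortest.vertices.filter(_._1 == 1L).first._2.contains(4L)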
I'm using Windows.
Are you saying it works with Windows?
Dino.
On 29 August 2015 at 09:04, Akhil Das wrote:
> You can also mount HDFS through the NFS gateway and access it, I think.
>
> Thanks
> Best Regards
>
> On Tue, Aug 25, 2015 at 3:43 AM, Dino Fancellu wrote:
>&
ocuments/HDP1/HDP-1.2.1/bk_reference/content/reference_chap2_1.html
> to see the complete list of ports that also need to be tunnelled.
>
>
>
> 2015-08-24 13:10 GMT-07:00 Dino Fancellu :
>>
>> Changing the ip to the guest IP address just never connects.
>>
>> The VM
http://hortonworks.com/blog/windows-explorer-experience-hdfs/
Seemed to exist, now no sign of it.
Anything similar to tie HDFS into Windows Explorer?
Thanks,
> I am not sure how the default HDP VM is set up, that is, if it only
> binds HDFS to 127.0.0.1 or to all addresses. You can check that with netstat
> -a.
>
> R.
>
> 2015-08-24 11:46 GMT-07:00 Dino Fancellu :
>>
>> I have a file in HDFS inside my HortonWorks HDP 2.3_1 Vi
I have a file in HDFS inside my HortonWorks HDP 2.3_1 VirtualBox VM.
If I go into the guest spark-shell and refer to the file thus, it works fine
val words = sc.textFile("hdfs:///tmp/people.txt")
words.count
However if I try to access it from a local Spark app on my Windows host, it
doesn't work.
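In case anyone hits the same wall: from the Windows host the file has to be addressed with a fully-qualified namenode URI rather than hdfs:///, and that namenode port has to be reachable from the host (bound to the VM's external address or tunnelled, as in the replies above). Sketch only; the IP 192.168.56.101 and port 8020 below are placeholders for whatever the VM's address and fs.defaultFS setting really are:

import org.apache.spark.{SparkConf, SparkContext}

// Local driver on the Windows host; the ip/port are assumptions, check the VM's actual settings.
val conf = new SparkConf().setAppName("hdfs-from-windows").setMaster("local[*]")
val sc = new SparkContext(conf)
// Fully-qualified URI so the driver doesn't look for a namenode on the local machine.
val words = sc.textFile("hdfs://192.168.56.101:8020/tmp/people.txt")
println(words.count())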