Hi Gabor,

Here it is.

#!/usr/bin/python

import igraph

# 2**25 (~33.5M) vertices; the second argument of Barabasi() is the
# number of edges attached per new vertex, so the final graph ends up
# with roughly 2**25 * 100 (~3.35 billion) edges.
vertices = 2**25
edges = 100
g = igraph.Graph.Barabasi(vertices, edges)
g.write("%s-graph.txt" % vertices, format="edgelist")


On Sat, Jul 27, 2013 at 3:21 PM, Gábor Csárdi <[email protected]> wrote:

> The memory you need for a graph is (2*n + 4*m) * 8 bytes, where 'n' is the
> number of vertices and 'm' is the number of edges. So if the density of the
> graph is not too large, this should be possible. Can you send some code
> that reproduces this?
>
> Gabor
>
> On Sat, Jul 27, 2013 at 8:37 AM, Claudio Martella <
> [email protected]> wrote:
>
>> Hello guys,
>>
>> I've been trying to build a graph with over 32M vertices using Barabasi()
>> via the Python wrapper. The machine I'm currently using has around 160G of
>> main memory, but the process is being OOM-killed by the kernel. I was able
>> to generate a 16M-vertex graph with a usage of around 50% of memory. Do
>> you guys have some suggestions about how I can generate such large graphs
>> (potentially in a distributed fashion), as I'm actually also aiming at
>> larger ones?
>>
>> Thanks,
>> Claudio
>>
>> --
>>    Claudio Martella
>>    [email protected]
>>


-- 
   Claudio Martella
   [email protected]
_______________________________________________
igraph-help mailing list
[email protected]
https://lists.nongnu.org/mailman/listinfo/igraph-help
