Basically, the error is as follows:

When we use a list of length greater than roughly 600, Sage hangs. We know this is not just impatience on our part, and that Sage really is working, because when the list length is under roughly 600 the graph generates almost instantly. We want to be able to go to list lengths of over 2000, so this proves to be a deal breaker.

We were originally using hash strings of the content as the node names, but we optimized this so that each piece of content we are interested in maps to a smaller corresponding number. Still, this has yielded no significant change. We wondered whether Sage limits the allowed length of lists, but Python allows lists of around 2 billion elements, so that seems unlikely. We also wondered whether our data hits some memory cap, but if so, the cap seems unreasonably small.

Either way, any suggestions or workarounds would be greatly appreciated. If there is another library better suited to these needs, even just naming it would help us out a ton. Thanks much, everyone!

Devin, Ian, and Max
[EMAIL PROTECTED]
[EMAIL PROTECTED]
[EMAIL PROTECTED]

Sample data of what we have been putting into Sage can be downloaded here:
http://www.devingaffney.com/files/data.txt

We are trying to build a program as stipulated here:
http://www.devingaffney.com/wikipedia-network-maps

The basic input we give Sage when it hangs is this:

    g = [data.txt file mentioned above]   # this is where the hang occurs;
                                          # even declaring g with "too much"
                                          # data makes Sage hang up
    G = Graph(g)
    G.show()

--
To post to this group, send email to sage-support@googlegroups.com
To unsubscribe from this group, send email to [EMAIL PROTECTED]
For more options, visit this group at http://groups.google.com/group/sage-support
URLs: http://www.sagemath.org
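P.S. As a sanity check on the "Python list length" theory above, here is a minimal plain-Python sketch, with no Sage involved. The node IDs and neighbor structure are synthetic stand-ins for our numeric content IDs, not our real data; it just shows that merely building and holding an adjacency dict of this size is essentially instant in ordinary Python:

```python
# Minimal timing sketch in plain Python (no Sage). Builds a synthetic
# adjacency dict of ~3000 nodes, each with 5 neighbors, and times it.
import time

start = time.time()
# Synthetic stand-in for our data: node IDs 0..2999, 5 neighbors each.
g = {i: [(i + k) % 3000 for k in range(1, 6)] for i in range(3000)}
elapsed = time.time() - start

print(len(g), "nodes built in", round(elapsed, 3), "seconds")
```

If this runs instantly for you as well, the slowdown presumably lives somewhere in Sage's graph construction or display rather than in Python's list/dict machinery itself.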