Apr 5, 2009 at 10:11 AM
Edited Apr 5, 2009 at 10:22 AM

Dear NodeXL users,
First of all i really enjoy using NodeXL because for it's easy use and menu's :). I'm analyzing a social network for 18500 vertices and their contacts, with 825000 aggregated edges and their weight in total. Excel can import untill a million cases, so that
is not the point :).
My question is the following:
My computer is struggling to read the workbook and to calculate centrality measures such as betweenness and closeness centrality.
Is there something I can do about that?
Thanks in advance,
Riko



Hello, Riko:
I'm afraid NodeXL isn't designed to handle 825,000 edges. Our target graph size is in the thousands of vertices and thousands of edges. Although some users have used it to visualize and analyze graphs with tens of thousands of objects (with a good deal of
patience), hundreds of thousands is way too many for NodeXL to handle. The performance bottlenecks are in the time and memory required to read that many rows from an Excel worksheet, and in the nature of the algorithms used to calculate the centrality measures.
There are a few filtering techniques you can use in Excel to pare large graphs down to a manageable size, but that won't help you if you need to analyze a large graph in its entirety.
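One such filtering technique is to keep only the strong ties before importing the edge list. A minimal sketch in Python, where the `(source, target, weight)` tuples and the threshold value are assumptions standing in for rows exported from the workbook:

```python
# Sketch: pare a weighted edge list down to a size NodeXL can handle,
# keeping only edges whose aggregated weight meets a threshold.

def filter_edges(edges, threshold):
    """Return only the (source, target, weight) edges with weight >= threshold."""
    return [(s, t, w) for (s, t, w) in edges if w >= threshold]

edges = [
    ("A", "B", 12),  # strong tie, kept
    ("A", "C", 1),   # weak tie, dropped
    ("B", "C", 7),   # kept
    ("C", "D", 2),   # dropped
]

strong = filter_edges(edges, threshold=5)
```

The same effect can be had inside Excel with an AutoFilter on the weight column, then copying the filtered rows into a fresh NodeXL workbook.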
Sorry about that!
 Tony



Dear Tony,
Thanks for your quick answer.
My main objective is to calculate centrality measures such as betweenness, closeness and eigenvector centrality. The visualisation is not that important to me :), I can make a graph to illustrate the social network with only 10,000 edges as well :).
This weekend I tried 125,000 edges and the computer completed the betweenness analysis. Now my computer is working on 400,000 edges; I hope it will not crash tonight :).
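If only the numbers are needed and not the visualisation, the same measures can be computed outside Excel entirely, for example with Python's NetworkX library. A minimal sketch, where the tiny hand-built graph is an assumption standing in for an edge list exported from the workbook:

```python
import networkx as nx

# Sketch: compute the centrality measures directly, skipping the
# worksheet round-trip that is the bottleneck for large graphs.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 3), ("B", "C", 1), ("C", "D", 2), ("B", "D", 4),
])

betweenness = nx.betweenness_centrality(G)  # Brandes' algorithm
closeness = nx.closeness_centrality(G)
eigenvector = nx.eigenvector_centrality(G)
```

Betweenness is still inherently expensive (roughly proportional to vertices times edges), so even outside Excel a graph with hundreds of thousands of edges will take a long time.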
Riko

