Maximum number of nodes

Jun 8 at 8:53 PM
I am sure this has been asked before, but I may just be entering the wrong search terms. If it has been asked (and how could it not be), I apologize.

What is the maximum number of nodes (or edges, if that is the more relevant number) in a network before NodeXL starts really slowing down or even ceasing to perform? If I had data identifying, say, the 10 most important connections for each person (node) in an organization, how big an organization could I work with - 500 nodes? 1,000? 10K? More? I could see us having data sets with 100K or more.

I realize this may in part be influenced by the power of the computer; the computers we would use would be comparable to a Dell Ultrabook with a 2.3 GHz Intel processor, 16 GB of RAM, and a 64-bit OS.
Jun 10 at 5:18 PM

Thank you for the interest in NodeXL!

NodeXL's network scale and capacity are very much a function of CPU and, in particular, RAM.

A fast CPU is useful, but RAM is the major limiting factor.

NodeXL will operate with small networks within 4GB.

We recommend that most users have at least 8GB of RAM.

NodeXL can routinely handle 10K edges in 8GB.

For larger networks, we recommend 16GB or more of RAM.

We have successfully analyzed networks with 250K or more edges on systems with 32GB of RAM.
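To relate those edge counts to the scenario in your question, here is a rough back-of-the-envelope sketch in plain Python (this is just illustrative arithmetic, not a NodeXL API; the function name is made up): with up to 10 reported connections per person, an organization of N people produces at most 10 * N directed edges before any de-duplication.

# Hypothetical estimate of edge counts for the "10 connections per person" scenario.
def estimate_edges(num_people, connections_per_person=10):
    """Upper bound on the number of directed edges before de-duplication."""
    return num_people * connections_per_person

for n in (500, 1_000, 10_000, 100_000):
    print(f"{n:>7} people -> up to {estimate_edges(n):>9,} edges")

So an organization of 100K people could mean up to roughly 1 million edges, well beyond the 250K-edge / 32GB example above, which is where the optimization and segmentation steps below become important.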

Some steps can be taken to optimize the network and its analysis for large networks.

For example, skipping any network metrics that are not essential to the analysis can reduce the total time, CPU, and RAM needed to analyze the network.
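To illustrate why metric selection matters so much, here is a hedged sketch using networkx in Python as a stand-in (NodeXL's metrics are selected from its own ribbon, not through this library): a cheap metric like degree centrality scales with the number of nodes and edges, while betweenness centrality scales roughly with their product, so it dominates the cost on large graphs.

# Illustration only (networkx, not NodeXL): compare a cheap metric to an expensive one.
import time
import networkx as nx

G = nx.gnm_random_graph(2_000, 10_000, seed=42)  # 2K nodes, 10K edges

t0 = time.perf_counter()
nx.degree_centrality(G)        # roughly O(n + m): finishes almost instantly
t1 = time.perf_counter()
nx.betweenness_centrality(G)   # roughly O(n * m): orders of magnitude slower
t2 = time.perf_counter()

print(f"degree centrality:      {t1 - t0:.2f}s")
print(f"betweenness centrality: {t2 - t1:.2f}s")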

Similarly, removing the use of image files for vertex shapes is another way to reduce the total amount of resources needed.

Segmenting the network (removing some portion, or analyzing only a time slice or sub-graph) can also reduce total data volume and allow the analysis to fit within available resources.
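For example, here is a hedged pre-processing sketch in Python/pandas that trims an edge list to a time slice and keeps only the strongest ties before importing the smaller file into NodeXL. The file names and column names ("Vertex1", "Vertex2", "Date", "Weight") are assumptions about the source data, not a NodeXL requirement.

# Hypothetical pre-processing step, run before the NodeXL import.
import pandas as pd

edges = pd.read_csv("edges.csv", parse_dates=["Date"])

# Keep one month of activity (a time slice).
window = edges[(edges["Date"] >= "2016-06-01") & (edges["Date"] < "2016-07-01")]

# Keep each person's 10 strongest outgoing ties, mirroring the question's scenario.
top10 = (window.sort_values("Weight", ascending=False)
               .groupby("Vertex1")
               .head(10))

top10.to_csv("edges_subset.csv", index=False)  # import this smaller file into NodeXL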


Marked as answer by MarcSmith on 7/5/2016 at 7:05 PM
Jun 13 at 2:32 PM
Thank you, Marc! Your answer is very helpful.