I'm studying community detection in networks.
I'm using igraph and Python.
For the optimal number of communities in terms of the modularity measure:
Perhaps I am misunderstanding you, but if you want the number of communities output by the best_partition algorithm (from the python-louvain package, which operates on NetworkX graphs), note that best_partition(G) returns a dictionary with nodes as keys and their community number as values.
You can count the number of unique values in a dictionary like this (likely not optimal):
partition = {'a': 1, 'b': 1, 'c': 2, 'd': 1, 'e': 3, 'f': 4, 'g': 5}
communities = set(partition.values())
print(sorted(communities))
print(len(communities))
with the result:
[1, 2, 3, 4, 5]
5
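If you also want the size of each community, not just how many there are, collections.Counter from the standard library does both in one pass (a small sketch using the same example dictionary as above):

```python
from collections import Counter

# Tally how many nodes fall into each community label.
partition = {'a': 1, 'b': 1, 'c': 2, 'd': 1, 'e': 3, 'f': 4, 'g': 5}
sizes = Counter(partition.values())

print(len(sizes))            # number of communities: 5
print(sizes.most_common(1))  # largest community and its size: [(1, 3)]
```

This works unchanged on the dictionary returned by best_partition(G).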
I'm also new to networkx and igraph; I used Gephi, a data visualization tool. It has the same community detection algorithm as the one in networkx that you are now using, specifically the one at http://perso.crans.org/aynaud/communities/
It uses the Louvain method described in: Vincent D. Blondel, Jean-Loup Guillaume, Renaud Lambiotte, Renaud Lefebvre, "Fast unfolding of communities in large networks", Journal of Statistical Mechanics: Theory and Experiment 2008(10), P10008 (12pp).
As far as I know, you cannot directly request a desired number of communities, but there are two things worth trying:
resolution
This parameter changes the size of the communities you get.
partition_at_level(dendrogram, level)
Instead of best_partition(G), call this to pick a coarser or finer level of the dendrogram; I guess this might help. Check the source code here for more info.
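To see what picking a level of the dendrogram means, here is a toy illustration (not the library's code): the dendrogram is a list of mappings, where level 0 assigns each node to a fine community and each later level merges the communities of the level before it. Composing the mappings up to a given level gives the partition at that level, which is the idea behind partition_at_level. The node names and community labels below are made up for the example.

```python
# Toy sketch of a Louvain-style dendrogram. Each level maps the
# communities of the previous level into coarser communities.
def partition_at_level(dendrogram, level):
    """Compose the first level + 1 mappings of the dendrogram."""
    partition = dict(dendrogram[0])
    for coarser in dendrogram[1:level + 1]:
        for node, comm in partition.items():
            partition[node] = coarser[comm]
    return partition

# Level 0: six nodes in four communities; level 1 merges those four
# communities into two.
dendrogram = [
    {'a': 0, 'b': 0, 'c': 1, 'd': 2, 'e': 2, 'f': 3},
    {0: 0, 1: 0, 2: 1, 3: 1},
]

print(partition_at_level(dendrogram, 0))  # finest partition, 4 communities
print(partition_at_level(dendrogram, 1))  # coarsest partition, 2 communities
```

So a lower level gives you more, smaller communities and the highest level gives you the partition that best_partition(G) would return.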