set numerical separator in result-modules output file to 1.0
See the OUTPUT section of the README file: "the second value is an arbitrary numerical value and should be ignored".
- for K1 it is set to: "1.0"
- for M1 it is set to: "1.0"
- for R1 it is set to: "0.5"
I think we should keep it, to avoid changes in the scoring pipeline, but it would be nice to set it to the same value everywhere, say "0.0".
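For reference, this is the line format the README describes, as I understand it. A minimal sketch in Python, assuming tab-separated fields; the function name and gene IDs here are made up:

    # One output line per module:
    # <module_id> TAB <numerical value> TAB <gene_1> TAB <gene_2> ...
    def write_module(f, comm_id, genes, value=1.0):
        f.write("%d\t%s\t" % (comm_id, value))
        f.write("\t".join(genes) + "\n")

    with open("result-modules.txt", "w") as f:
        write_module(f, 1, ["GENE_A", "GENE_B", "GENE_C"])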
According to the challenge, that value was added to allow assigning a confidence level to each community. Using 0.0 as the default does not seem like the best idea: it does not respect the semantics (even if we have all decided to set it to a constant). I would vote for 1.0.
Changing this value for M1 is trivial, so this should not be a problem:
change line 1039
f.write("%d\t1.0\t" % comm_id)
of src/M1_code/sub-challenge1/aleph_urv_method.py to
f.write("%d\t0.0\t" % comm_id)
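If we expect to touch this again, a one-line constant would save us from hunting for line numbers next time. Just a sketch, not the actual M1 code; the names are hypothetical:

    # Keep the module-level value in one place so every write call
    # picks up whatever constant we agree on.
    MODULE_VALUE = 0.0  # or 1.0, pending the vote

    def write_module_id(f, comm_id):
        f.write("%d\t%s\t" % (comm_id, MODULE_VALUE))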
Uh, I had missed the meaning of that value completely!
@WeijiaZhang24, @sergio-gomez, @jjc2718: Let's vote!
- should all methods output 1.0?
- is it easy to derive module-level confidence scores?
I've voted for 1.0 :-)
I do not really know of any simple method to assign confidence scores to the modules, apart from their contribution to global modularity (although those contributions are size-dependent and also resolution-dependent, so I would avoid them).
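To make the objection concrete: the contribution of one module c to generalized modularity is Q_c = L_c/m - gamma * (d_c/(2m))^2, where L_c is the number of intra-module edges, d_c the sum of member degrees, and m the total number of edges. A minimal networkx sketch, with my own naming, not code from any of our methods:

    import networkx as nx

    def module_confidence(G, module, resolution=1.0):
        # Contribution of one module to generalized modularity:
        # Q_c = L_c/m - resolution * (d_c / (2m))^2
        m = G.number_of_edges()
        intra = G.subgraph(module).number_of_edges()      # L_c: intra-module edges
        degree_sum = sum(d for _, d in G.degree(module))  # d_c: sum of member degrees
        return intra / m - resolution * (degree_sum / (2 * m)) ** 2

Both terms grow with the size of the module, and the score shifts with the resolution parameter, which is exactly why I would avoid it as a confidence measure.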
Right, then that settles the discussion.
@WeijiaZhang24: could you simply set this value to 1.0 in R1 as well?
TODO: Mattia, update the README once this is done.
I would say we only need to change line 256 in the file
R1_code/sub-challenge1/mlrmcl_r4_v2.R
from
temp<-c(i,0.5,output[[i]])
to
temp<-c(i,1.0,output[[i]])
Mattia, try it and, if it works, let's close this issue.
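A quick way to check the output once the change is in, assuming the tab-separated format from the README; the filename is just an example:

    # One-off check: every line of a result-modules file should carry
    # the agreed constant 1.0 in its second column.
    with open("result-modules.txt") as f:
        for line in f:
            fields = line.rstrip("\n").split("\t")
            assert fields[1] == "1.0", "unexpected value: %s" % fields[1]
    print("all modules carry 1.0")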
I submitted a commit a few days ago that changed the value to 1.0.
You're right, it seems the merge is pending
Thank you!