ReScience/submissions

[Re] Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks

HATON-R opened this issue · 37 comments

Original article:
S. Kumar, X. Zhang and J. Leskovec. Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks. In: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. ACM, 2019.

PDF URL:
https://github.com/ComplexNetTSP/JODIE-RESCIENCE/blob/master/article.pdf
Metadata URL:
https://github.com/ComplexNetTSP/JODIE-RESCIENCE/blob/master/metadata.yaml
Code URL:
https://github.com/ComplexNetTSP/JODIE

Scientific domain:
Machine Learning
Programming language:
Python
Suggested editor:

Thanks for your submission. We'll assign an editor soon.

@gdetor @koustuvsinha Can one of you edit this submission?

@rougier I could handle this.

Great, thank you! I've assigned you as editor.

Hi @ogrisel, would you be able to review this submission?

@rougier Could I assign someone who is not on the reviewers list as a reviewer?

Yes, of course. If they accept and want to appear on the board, just tell me. You can also ask all reviewers at once using the @ ReScience / reviewers notification (written without the spaces; I don't want to broadcast it here).

Hi @ghost-nn-machine would you be able to review this submission?

Hi @benureau Could you handle this review?

Hi @koustuvsinha would you be willing to review this submission?

Hi @neuronalX could you handle the review of this submission?

Hi @gdetor, thank you for asking, but I am already too busy for the coming month.

Hey @gdetor, I can handle this.

Hi @damiendr Would you be available to review this submission?

Hi @hkashyap, could you handle this review?
@ghost-nn-machine Gentle reminder

@gdetor I can review this submission.

Thank you @hkashyap, I'll assign you as a reviewer.

Hi @hkashyap and @ghost-nn-machine Any updates?

@gdetor I will need more time, I plan to submit the review by 10/30.

Gentle reminder.

Hi @hkashyap @ghost-nn-machine Any progress?

I trust this message finds you well. I am writing to request an update on the manuscript I submitted for review over a year ago today. Might you be able to provide me with some insight into the current status of the review process?

Perhaps we could consider the revisions as a symbolic birthday present for my article?

@HATON-R Very sorry for being so late in the review. I'll try to make things move forward.

@ReScience/reviewers Help needed for reviewing a machine learning/Python paper! See #70

@HATON-R Don't hesitate to remind us here when we're late. We don't yet have an automated process for tracking submissions (but we will soon).

I can review this as well. @rougier

@HaoZeke Thank you! You can start the review then. If you can do it in less than two weeks that would be wonderful.

@ghost-nn-machine Are you still available to do the review?

Sure, I'll try to get it done this weekend.

@hkashyap Can you update us on your review? (Just tell us if you can't do it, so that we can start looking for another reviewer.)

@hkashyap @HaoZeke Any progress?

@HaoZeke Any update?

@HaoZeke @hkashyap Any updates?

@HATON-R @gdetor Maybe we need to find other reviewers. I can do a review and maybe @gdetor you can do the second one. Shall we give ourselves two weeks?

@rougier That works for me

@HATON-R @rougier Here is my review.

Overall, the work shows that the main results of the original article can be replicated. Moreover, the authors go the extra mile and show how the model's basic hyperparameters affect its performance.

Text

  • Although the authors reach slightly different conclusions than those of the original article, they do not mention them in the summary.
  • The mathematical symbol for the Hadamard product is $\odot$ (see the short definition after this list).
  • In the t-batch section, the authors mention that a batch cannot contain the same entity more than once. Please be more specific.
  • The two paragraphs "When we want to predict ... cross-entropy expressed in the following equation:" on page 7 can be merged. The authors repeat themselves.
  • One can mistake the number of classes C for the number of t-batches in Algorithm 1.
  • In the first paragraph of Section 3.5, the authors repeat themselves. Please rephrase that paragraph.
  • When presenting their results, it might be better if the authors used the same table format as the original article. They could add their own results in bold as an extra row in each table.
  • The authors claim their results are marginally worse than those reported in the original work. However, the difference is not marginal: for instance, RRN performs better on the Reddit and Wikipedia datasets.
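
For the Hadamard product, a one-line LaTeX definition that could accompany the symbol (the standard elementwise product of two equal-length vectors):

```latex
% elementwise (Hadamard) product of two vectors of the same length
(\mathbf{a} \odot \mathbf{b})_i = a_i \, b_i
```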

Source Code

Unfortunately, due to a Ray incompatibility, I couldn't run the code and verify the results. The authors have used an older version of Ray, so please either update the code or pin the exact Ray version in requirements.txt.
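
For example, a minimal sketch of what the pin could look like in requirements.txt (the version below is a placeholder, not the version the authors actually used):

```
# placeholder pin; replace 1.13.0 with the exact Ray release the code was tested with
ray[tune]==1.13.0
```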

  • Please add a LICENSE file.
  • The code is not documented; please add some essential documentation, at least for the main components of the source code.
  • The source code should follow Python's PEP 8 style guide. The autopep8 tool can help bring the scripts into compliance.
  • There are imported packages that are not used in the code (e.g., ray.tune and ray.air in train.py).
  • The code does not run on PyTorch 2.0 (the authors should fix that, since many people now use that version instead of 1.10).
  • Did the authors try to use any specialized search algorithm with Ray Tune, such as Optuna, or do they rely only on Ray's default algorithms? (A minimal sketch is given after this list.)
  • The requirement line for torch is broken. Use -i followed by the index URL (pip's --index-url) on its own line, and then the package name with its version on the next line.
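
Regarding the search-algorithm question, here is a minimal sketch (not the authors' code) of how a specialized search algorithm could be plugged into Ray Tune. It assumes the Ray 1.x API that the code currently targets, the optuna package installed, and a hypothetical train_jodie(config) function that reports a validation MRR:

```python
# Sketch only: hyperparameter search with Ray Tune + Optuna (Ray 1.x API).
from ray import tune
from ray.tune.suggest.optuna import OptunaSearch


def train_jodie(config):
    # Hypothetical trainable: build and train the model with config["lr"]
    # and config["embedding_dim"], then report the validation metric.
    val_mrr = 0.0  # placeholder: replace with the real validation MRR
    tune.report(mrr=val_mrr)


analysis = tune.run(
    train_jodie,
    config={
        "lr": tune.loguniform(1e-4, 1e-2),
        "embedding_dim": tune.choice([64, 128, 256]),
    },
    search_alg=OptunaSearch(),  # Optuna-backed search instead of random/grid search
    metric="mrr",
    mode="max",
    num_samples=20,
)
print(analysis.best_config)
```

Note that with Ray 2.x the import moves to ray.tune.search.optuna, so the exact import depends on which Ray version the authors decide to pin.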

@rougier Gentle reminder