boelukas/mariner

Make artifacts (model, dataset) available on Hugging Face


Hi @boelukas,

Niels here from the open-source team at Hugging Face. I discovered your work through ECCV and indexed your paper: https://huggingface.co/papers/index?arxivId=2407.13745. Congrats on getting it accepted! I work with AK on improving the visibility of researchers' work on the hub.

It'd be great to make the checkpoints and data available on the 🤗 hub, rather than Google Drive, to improve their discoverability/visibility. We can add tags so that people find them when filtering on https://huggingface.co/models.

Uploading models

See here for a guide: https://huggingface.co/docs/hub/models-uploading.

In this case, we could leverage the PyTorchModelHubMixin class, which adds from_pretrained and push_to_hub to any custom nn.Module. Alternatively, one can leverage the hf_hub_download one-liner to download a checkpoint from the hub.

We encourage researchers to push each model checkpoint to a separate model repository, so that things like download stats also work. We can then also link the checkpoints to the paper page.

Uploading dataset

It would be awesome to make the training dataset available on the 🤗 hub, so that people can do:

```python
from datasets import load_dataset

dataset = load_dataset("your-hf-org/your-dataset")
```

See here for a guide: https://huggingface.co/docs/datasets/image_dataset

Besides that, there's the dataset viewer which allows people to quickly explore the first few rows of the data in the browser.

Let me know if you're interested/need any help regarding this!

Cheers,

Niels
ML Engineer @ HF 🤗