Issues
mlm pretrain error
#65 opened by charliedream1 - 1
Cannot train new tokens while finetuning
#31 opened by sandeep-krutrim - 1
Is there a reranker framework?
#63 opened by sigridjineth - 1
Generating hard negatives for BEIR datasets
#62 opened by dipamgoswami - 3
Dataset issues (EOF error)
#60 opened by dipamgoswami - 1
How can I get tokenized_dataset: "nomic-ai/bert-pretokenized-2048-wiki-2023" for MLM training?
#57 opened by BeHappyForMe - 1
How to use different encoding functions for query and documents during evaluation?
#54 opened by dipamgoswami - 1
Install faiss
#52 opened by DanielMitiku - 2
Can we continue finetuning a pretrained BERT model on the MLM task instead of training from scratch?
#49 opened by AndrewNgo-ini - 6
Use of negatives during training
#48 opened by gangiswag - 6
How to use Huggingface Datasets?
#37 opened by ms337 - 1
How to implement "fill the entire batch with samples from that single source"
#39 opened by allhailzzq - 1
Filtering Data For Contrastive Pretraining
#43 opened by daegonYu - 2
Questions about learning rate settings
#38 opened by daegonYu - 0
A question: Is it possible to use a supervised learning task to finetune the embedding model?
#35 opened by fengsxy - 1
Questions about Training Specifics
#33 opened by sukjunhwang - 1
Unable to save models in mlm pretraining
#15 opened by sandeep-krutrim - 2
protobuf version problem
#18 opened by cao-hy23 - 3
AWS dataset issues
#19 opened by cao-hy23 - 3
Unable to get config in mlm pretraining
#16 opened by Linhvjc - 1
Regarding Further Training nomic-bert-2048
#13 opened by pjchungmd - 1
RuntimeError with Mixed Devices on GPU and CPU when Using Nomic Embed with SentEval
#14 opened by ZBWpro - 2
Adaptation for Computer Vision
#8 opened by vtrivedy - 1
Error in the `_grad_cache_forward_step` function when running suggested contrastive pretraining command
#9 opened by kuanhsieh - 0
Amazin2 know
#5 opened by Vin757 - 2
Problem related to nomic login
#1 opened by nioxinjiang3