ashleve/lightning-hydra-template
PyTorch Lightning + Hydra. A very user-friendly template for ML experimentation. ⚡🔥⚡
Python
Issues
Error loading path
#640 opened by urafrik - 0
Problems with DDP + Optuna
#639 opened by Phimos - 0
please support dynamically selecting config_name
#632 opened by YChienHung - 0
RuntimeError when using profiler.
#636 opened by beatwad - 2
Override hydra config group with none/null
#629 opened by Tomakko - 0
How to save images in test_step?
#631 opened by YChienHung - 1
Adjust pre-commit mdformat for inline math
#625 opened by alexkrz - 0
Issue with "best ckpt" absolute paths during fine-tuning and continue training
#621 opened by Mai0313 - 0
Locate saved checkpoints from mlflow run id
#620 opened by AzharSindhi - 1
Couldn't log learning rate
#618 opened by amirshamaei - 1
Logging step being called twice?
#617 opened by cmvcordova - 0
Omegaconf dict that gets logged is not resolved
#615 opened by iprapas - 0
Fixed seeding before instantiation of mlflow logger causes identical run names
#613 opened by ChristianHinge - 1
[Report and Request] I created the LLMFlowOptimizer repo, strongly inspired by lightning-hydra-template.
#610 opened by Yongtae723 - 1
Adding several losses (criteria)
#609 opened by amirshamaei - 1
Couldn't log with wandb
#607 opened by amirshamaei - 1
Suggestion with Solution about loss function
#605 opened by Mai0313 - 3
How to log the results of any run
#580 opened by sgalpha01 - 1
Exporting to ONNX adds an Identity operator
#602 opened by zhuyuanxiang - 1
`trainer.callback_metrics` included in `metric_dict` after training doesn't make sense
#597 opened by libokj - 1
Error when using a logger
#578 opened by pemeunier - 0
Integrate with Lightning Fabric?
#593 opened by shuowang-ai - 1
WandB configs not logged properly
#582 opened by dreaquil - 4
LR Schedulers compatibility
#562 opened by coenvdgrinten - 3
Is it possible to use the wandb runtime generated run id or name in hydra config?
#574 opened by RuishengSu - 1
Using OneCycleLR scheduler
#566 opened by phum1901 - 0
Capturing the available device output
#567 opened by matthewcarbone - 2
wandb watch parameters and gradients
#560 opened by astrocyted - 0
cannot use "trainer=ddp"
#563 opened by a4152684 - 0
Possibility to use Lightning Flash
#559 opened by hjander - 4
Scale invariance test
#555 opened by reubenwenisch - 1
Update to PyTorch Lightning 2.0
#547 opened by 5c4lar - 3
Resume multi-run
#535 opened by nurlanov-zh - 2
Question: `_self_` at the end of `defaults` list
#544 opened by shenoynikhil - 1
Does it support wandb sweep for hyperparameter search and model optimization?
#531 opened by zf223669