dandelionsllm/pandallm
Panda is an open-source overseas Chinese large language model project launched in May 2023. It is dedicated to exploring the full technology stack in the era of large models and aims to promote innovation and collaboration in Chinese natural language processing.
Python · Apache-2.0
Issues
The online demo at pandallm.ai/ cannot be opened
#35 opened by Etttz - 0
Does the online demo require an access password?
#34 opened by wanghx629 - 3
Is the model link for Panda-LLaMA2-13B wrong?
#33 opened by sun1092469590 - 11
Abnormal inference results after merging the model weights
#5 opened by xxxxuee - 0
RuntimeError: The size of tensor a (32000) must match the size of tensor b (32001) at non-singleton dimension 0
#31 opened by jeinlee1991 - 5
Question about inference
#30 opened by Jiangchenglin521 - 2
The preprocessing of the training data is unclear; could you share the processing code?
#29 opened by Cascol-Chen - 5
To retrain Panda, in what structure should the dataset be stored?
#28 opened by Cascol-Chen - 1
Could you provide a method for model quantization, so inference can be tested on a Mac?
#27 opened by doctor1984 - 4
The commercial-use 7B model produces incoherent output when used directly for inference; isn't it supposed to work out of the box?
#26 opened by Goo-goo-goo - 12
OOM when training the 7B model on a single A100
#25 opened by HuihuiChyan - 8
How to continue training on the COIG dataset
#23 opened by scarydemon2 - 1
How to convert the COIG dataset into instruction-tuning format
#24 opened by HuihuiChyan - 5
A very promising new project, keep it up.
#7 opened by imgingroot - 2
The weight-conversion script does not support the original weights from Facebook
#22 opened by ShadowPower - 1
Are there any usage examples?
#21 opened by Mr-IT007 - 1
The vocabulary was not expanded, right? Still 32,000?
#19 opened by mellivoraPKU - 1
Do the pre-training and fine-tuning stages use full fine-tuning or LoRA?
#18 opened by mellivoraPKU - 1
The model seems much larger than the original LLaMA model; what is the reason?
#16 opened by uloveqian2021 - 8
The 13B results look very strange
#15 opened by frog-game - 1
MD5 of the merged model?
#13 opened by hejujie - 1
response prefixes and renewable energy
#14 opened by stakodiak - 4
Can a single 4090 complete training of the 7B model?
#11 opened by mircop1t - 10
The inference results are strange; could you help look into the cause?
#9 opened by ccyhxg - 3
Is there an evaluation comparing against ChatGLM?
#6 opened by lanyuer - 7
Question about the BELLE baseline chosen for evaluation
#4 opened by hejujie - 1
llama.cpp support
#3 opened by lucasjinreal - 1
RuntimeError: The size of tensor a (32000) must match the size of tensor b (32001) at non-singleton dimension 0
#1 opened by gqjia
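The size-mismatch error reported in #1 and #31 ("The size of tensor a (32000) must match the size of tensor b (32001)") typically arises when a fine-tuned checkpoint has one extra special token (e.g. a pad token) in its embedding matrix, while the base LLaMA vocabulary has 32,000 entries. The following is a minimal NumPy sketch of that failure mode and a common fix; the shapes and the mean-initialization strategy are illustrative assumptions, not the project's actual merge code (the hidden size is shrunk for brevity, and PyTorch reports the same mismatch as the RuntimeError quoted in the issues).

```python
import numpy as np

# Illustrative sizes: base LLaMA vocab vs. a checkpoint with one added token.
BASE_VOCAB, TUNED_VOCAB, HIDDEN = 32000, 32001, 8

base_embed = np.zeros((BASE_VOCAB, HIDDEN))    # base model embedding matrix
delta_embed = np.zeros((TUNED_VOCAB, HIDDEN))  # fine-tuned delta weights

# Merging row-wise fails on the shape mismatch (NumPy raises ValueError;
# PyTorch raises the RuntimeError quoted in issues #1 and #31).
try:
    merged = base_embed + delta_embed
except ValueError as e:
    print("shape mismatch:", e)

# Common fix: grow the base embedding to the tuned vocabulary size first,
# initializing new rows with the mean of the existing embeddings (similar in
# spirit to transformers' resize_token_embeddings).
extra_rows = TUNED_VOCAB - BASE_VOCAB
mean_row = base_embed.mean(axis=0, keepdims=True)
resized = np.concatenate(
    [base_embed, np.repeat(mean_row, extra_rows, axis=0)], axis=0
)

merged = resized + delta_embed
print(merged.shape)  # (32001, 8)
```

Resizing must happen on the same side as the tokenizer: if the merged model keeps the 32,001-token tokenizer, the base embedding grows; if the extra token is dropped, the delta matrix should be truncated instead.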