Token usage missing
Nintorac opened this issue · 1 comment
Nintorac commented
https://platform.openai.com/docs/api-reference/chat/create
"usage": {
"prompt_tokens": 9,
"completion_tokens": 12,
"total_tokens": 21
}
This blocks LangChain from working seamlessly.

Maybe just return a dummy value for now? I didn't run into any issues when I disabled the dictionary access in LangChain, so it probably doesn't hurt.
lhenault commented
Yes, I haven't included this so far because it seemed pointless: we shouldn't have to monitor token consumption. But if it's breaking pipelines in tools such as LangChain, we'd better have a dummy one.
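For reference, a minimal sketch of what a dummy `usage` block could look like. The function name `with_dummy_usage` and the response shape are illustrative assumptions, not the project's actual code; real counts would need a tokenizer, so this just stubs zeros to keep clients that read `response["usage"]` from crashing:

```python
def with_dummy_usage(response: dict) -> dict:
    """Return a copy of an OpenAI-style response with a stub usage block.

    Hypothetical helper: only adds "usage" if it isn't already present,
    mirroring the shape documented in the OpenAI chat completion API.
    """
    patched = dict(response)
    patched.setdefault("usage", {
        "prompt_tokens": 0,
        "completion_tokens": 0,
        "total_tokens": 0,
    })
    return patched


# Example: a response missing the usage field, as in this issue.
resp = {"id": "chatcmpl-123", "object": "chat.completion", "choices": []}
patched = with_dummy_usage(resp)
print(patched["usage"]["total_tokens"])  # dummy value, not a real count
```

With this in place, code that unconditionally accesses `usage` keys (as LangChain does) gets zeros instead of a `KeyError`.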