truera/trulens

Runtime Error on Langchain Quickstart

Closed this issue · 2 comments

I've been following the LangChain Quickstart to set it up with my ChromaDB, but I get a RuntimeError.
Why does this happen?

Feedback Function exception caught: Traceback (most recent call last):
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/utils/asynchro.py", line 120, in sync
    loop = asyncio.get_running_loop()
RuntimeError: no running event loop

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/feedback/feedback.py", line 511, in run
    result_and_meta, part_cost = sync(
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/utils/asynchro.py", line 125, in sync
    return loop.run_until_complete(awaitable)
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/nest_asyncio.py", line 98, in run_until_complete
    return f.result()
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/asyncio/futures.py", line 201, in result
    raise self._exception
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/asyncio/tasks.py", line 256, in __step
    result = coro.send(None)
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/feedback/provider/endpoint/base.py", line 466, in atrack_all_costs_tally
    result, cbs = await Endpoint.atrack_all_costs(
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/feedback/provider/endpoint/base.py", line 447, in atrack_all_costs
    return await Endpoint._atrack_costs(
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/feedback/provider/endpoint/base.py", line 544, in _atrack_costs
    result: T = await desync(__func, *args, **kwargs)
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/utils/asynchro.py", line 96, in desync
    res = await asyncio.to_thread(func, *args, **kwargs)
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/asyncio/threads.py", line 25, in to_thread
    return await loop.run_in_executor(None, func_call)
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/asyncio/base_events.py", line 819, in run_in_executor
    executor.submit(func, *args), loop=self)
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/utils/threading.py", line 78, in submit
    return super().submit(
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/concurrent/futures/thread.py", line 169, in submit
    raise RuntimeError('cannot schedule new futures after '
RuntimeError: cannot schedule new futures after interpreter shutdown

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/luka/miniconda3/envs/sample_project/lib/python3.9/site-packages/trulens_eval/feedback/feedback.py", line 516, in run
    raise RuntimeError(
RuntimeError: Evaluation of groundedness_measure_with_cot_reasons failed on inputs:
{'source': [[]],
 'statement': 'Unfortunatley from the context above I cannot'
              'provide additional infor'
cannot schedule new futures after interpreter shutdown.

Versions I'm using:
trulens-eval==0.23.0
langchain==0.1.4

Hi @pavluka6. Are you running this from a Python file that invokes the example LangChain app and then exits immediately afterward? If so, you can wait until the feedback is done computing by adding this:

record.wait_for_feedback_results()

where record is the record produced by TruLens tracking your app's invocation.

Alternatively, you can run the notebook version in the .ipynb file.
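To see why exiting early triggers "cannot schedule new futures after interpreter shutdown", here is a minimal stdlib-only analogy (this is not trulens_eval code; the function name slow_feedback is made up for illustration). Feedback functions run in background threads, and if the main script returns before they finish, the thread pool is torn down mid-computation. Blocking on the pending result before exiting, which is the same idea as record.wait_for_feedback_results(), avoids that:

```python
# Stdlib analogy for the trulens_eval behavior: feedback computations are
# submitted to a background executor. If the interpreter starts shutting
# down while work is still being scheduled, submit() raises
# "cannot schedule new futures after interpreter shutdown".
import time
from concurrent.futures import ThreadPoolExecutor

def slow_feedback(answer: str) -> float:
    """Stand-in for an LLM-based feedback function (hypothetical)."""
    time.sleep(0.1)  # simulate a slow evaluation call
    return 1.0 if answer else 0.0

executor = ThreadPoolExecutor(max_workers=2)
future = executor.submit(slow_feedback, "some model output")

# The fix is the same idea as record.wait_for_feedback_results():
# block until the background result is ready instead of exiting early.
result = future.result()  # waits for the feedback computation to finish
print(result)
executor.shutdown()
```

In the real script, the blocking call goes right before the program ends, so all pending feedback results are computed and persisted before the interpreter shuts down.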

Yes, I'm running this from a Python file that invokes the example LangChain app, and using it through a notebook doesn't apply to my use case.

However, I've followed the updated version of the LangChain quickstart from the docs, and it seems to be working now.