NousResearch/Hermes-Function-Calling

Can't start function calling script on a Linux machine

Opened this issue · 1 comment

Here's the error:

python functioncall.py --query "I need the current stock price of Tesla (TSLA)"
                                                                                                         dP       
                                                                                                         88       
      88d888b. .d8888b. dP    dP .d8888b. 88d888b. .d8888b. .d8888b. .d8888b. .d8888b. 88d888b. .d8888b. 88d888b. 
      88'  `88 88'  `88 88    88 Y8ooooo. 88'  `88 88ooood8 Y8ooooo. 88ooood8 88'  `88 88'  `88 88'  `"" 88'  `88 
      88    88 88.  .88 88.  .88       88 88       88.  ...       88 88.  ... 88.  .88 88       88.  ... 88    88 
      dP    dP `88888P' `88888P' `88888P' dP       `88888P' `88888P' `88888P' `88888P8 dP       `88888P' dP    dP 
                                                                                                                  
                                                                                                                  

2024-06-25:17:03:54,055 INFO     [functioncall.py:25] None
Traceback (most recent call last):
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1535, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/models/llama/modeling_llama.py", line 54, in <module>
    from flash_attn import flash_attn_func, flash_attn_varlen_func
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/flash_attn/__init__.py", line 3, in <module>
    from flash_attn.flash_attn_interface import (
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/flash_attn/flash_attn_interface.py", line 10, in <module>
    import flash_attn_2_cuda as flash_attn_cuda
ImportError: /home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/acorn/rubra-benchmark/Hermes-Function-Calling/functioncall.py", line 178, in <module>
    inference = ModelInference(model_path, args.chat_template, args.load_in_4bit)
  File "/home/acorn/rubra-benchmark/Hermes-Function-Calling/functioncall.py", line 35, in __init__
    self.model = AutoModelForCausalLM.from_pretrained(
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 562, in from_pretrained
    model_class = _get_model_class(config, cls._model_mapping)
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 383, in _get_model_class
    supported_models = model_mapping[type(config)]
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 734, in __getitem__
    return self._load_attr_from_module(model_type, model_name)
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 748, in _load_attr_from_module
    return getattribute_from_module(self._modules[module_name], attr)
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 692, in getattribute_from_module
    if hasattr(module, attr):
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1525, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1537, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.llama.modeling_llama because of the following error (look up to see its traceback):
/home/acorn/anaconda3/envs/hermes/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN3c104impl3cow11cow_deleterEPv

My machine configuration:

  • Ubuntu 22.04
  • NVIDIA RTX 4090 with CUDA 12.2
  • NVIDIA driver 535
  • Python 3.10
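
For reference, an undefined symbol like this usually means the flash-attn binary was compiled against a different PyTorch build than the one installed in the environment. A quick way to see what is actually installed (a minimal sketch, assuming torch and flash-attn are both present in the hermes env used to run functioncall.py):

import torch

# Report the installed PyTorch build and the CUDA version it was compiled for.
print("torch:", torch.__version__, "built for CUDA", torch.version.cuda)

try:
    # This import fails with the same undefined-symbol error when the
    # flash-attn binary does not match the installed torch build.
    import flash_attn
    print("flash-attn:", flash_attn.__version__)
except ImportError as err:
    print("flash-attn import failed:", err)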

Looks like a flash-attention issue. Try uninstalling it and reinstalling with pip install --no-build-isolation flash-attn, and see if that works.
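
For reference, the reinstall would look roughly like this (a sketch, assuming the hermes conda environment is active and a CUDA toolkit is available to compile the extension):

pip uninstall -y flash-attn
pip install --no-build-isolation flash-attn

The --no-build-isolation flag makes pip build the extension against the torch already installed in the environment rather than in a temporary isolated build environment, which is what avoids the undefined-symbol mismatch at import time.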