netdur/llama_cpp_dart

Help. Building llama.cpp Library

pavelprosto94 opened this issue · 11 comments

Could you explain in more detail how to compile llama.cpp?
Which files should I copy, and where should I copy them? (I didn't find any *.so files.)

I tried this:

cd ~/.pub-cache/hosted/pub.dev/llama_cpp_dart-0.0.4
git clone https://github.com/ggerganov/llama.cpp.git
mkdir build
cd build
cmake .. -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
cmake --build . --config Release

What platform? Generally, add -DBUILD_SHARED_LIBS=ON to the cmake .. invocation.
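For reference, the full sequence on Linux would look roughly like the sketch below. The flags are the ones already mentioned in this thread (the BLAS flags are optional); the extra cd llama.cpp step matters, since the build must be configured inside the llama.cpp checkout, not the package directory.

```shell
# Illustrative build sketch; flags taken from this thread, paths are examples.
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp                  # configure inside the llama.cpp checkout
mkdir build && cd build
# BUILD_SHARED_LIBS makes CMake emit shared libraries (libllama.so)
# instead of static archives; the BLAS flags are optional acceleration.
cmake .. -DBUILD_SHARED_LIBS=ON -DLLAMA_BLAS=ON -DLLAMA_BLAS_VENDOR=OpenBLAS
cmake --build . --config Release
# The shared libraries end up in the build tree, e.g. build/libllama.so
```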

platform Linux, Flutter project

The compilation was successful and produced two files: libggml_shared.so and libllama.so.
Which directory should they be copied to?

Place libllama.so in the root folder of your project.

I get an error when clicking "load model":

Launching lib/main.dart on Linux in debug mode...

(boris:88419): Gdk-CRITICAL **: 22:21:13.306: gdk_window_get_state: assertion 'GDK_IS_WINDOW (window)' failed
Connecting to VM Service at ws://127.0.0.1:42281/GRcPMlo5nE8=/ws
[ERROR:flutter/runtime/dart_isolate.cc(1097)] Unhandled exception:
Invalid argument(s): Failed to lookup symbol 'llama_backend_init': /home/prosto/Sync/Projects/Flutter/boris/build/linux/x64/debug/bundle/lib/libflutter_linux_gtk.so: undefined symbol: llama_backend_init
#0      DynamicLibrary.lookup (dart:ffi-patch/ffi_dynamic_library_patch.dart:33:70)
#1      llama_cpp._llama_backend_initPtr (package:llama_cpp_dart/src/llama_cpp.dart:10187:63)
#2      llama_cpp._llama_backend_initPtr (package:llama_cpp_dart/src/llama_cpp.dart)
#3      llama_cpp._llama_backend_init (package:llama_cpp_dart/src/llama_cpp.dart:10190:7)
#4      llama_cpp._llama_backend_init (package:llama_cpp_dart/src/llama_cpp.dart)
#5      llama_cpp.llama_backend_init (package:llama_cpp_dart/src/llama_cpp.dart:10181:12)
#6      new Llama (package:llama_cpp_dart/src/llama.dart:63:9)
#7      LlamaProcessor._modelIsolateEntryPoint.<anonymous closure> (package:llama_cpp_dart/src/llama_processor.dart:89:21)
#8      _RootZone.runUnaryGuarded (dart:async/zone.dart:1594:10)
#9      _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:339:11)
#10     _BufferingStreamSubscription._add (dart:async/stream_impl.dart:271:7)
#11     _SyncStreamControllerDispatch._sendData (dart:async/stream_controller.dart:784:19)
#12     _StreamController._add (dart:async/stream_controller.dart:658:7)
#13     _StreamController.add (dart:async/stream_controller.dart:606:5)
#14     _RawReceivePort._handleMessage (dart:isolate-patch/isolate_patch.dart:184:12)

@pavelprosto94, instead of spending time figuring out where Dart searches for shared libraries, I've introduced a libraryPath property to the Llama class. This allows for setting the full path of the library directly.
Llama.libraryPath = "...";

@pavelprosto94, please feel free to reopen the issue if the proposed solution doesn't resolve your problem.

Hi @netdur,
You can build llama.cpp as a shared library as part of the Dart build process, as I have done in maid:

https://github.com/Mobile-Artificial-Intelligence/maid

This works for Linux and Windows, but not macOS and iOS. I can help you implement this logic if you'd like. Your package seems like it may be able to replace my own implementation.

@pavelprosto94 v0.0.6 may resolve your issue.

@pavelprosto94 Did you successfully build this project? Were you able to get past #33?