Does `llama-3.4.1-cuda12-linux-x86-64.jar` handle both CPU and GPU, or only GPU?
siddhsql commented
Does `llama-3.4.1-cuda12-linux-x86-64.jar` handle both CPU and GPU, or only GPU? I want to use the GPU if one is available and fall back to the CPU otherwise. Could you share any code illustrating how to do this, and which dependencies I need in order to do it?
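
To make the question concrete, this is roughly the fallback pattern I have in mind. It is only a sketch: the `ModelParameters`/`LlamaModel` calls are my reading of the 3.x API, and the try/catch fallback is a guess on my part, not something I know the library guarantees to work with the CUDA jar.

```java
import de.kherud.llama.LlamaModel;
import de.kherud.llama.ModelParameters;

public class GpuFallbackSketch {

    // Sketch of the desired behavior: try to load the model with layers
    // offloaded to the GPU; if that fails (no CUDA device, missing driver,
    // etc.), retry with zero GPU layers so everything runs on the CPU.
    static LlamaModel loadModel(String modelPath) {
        try {
            ModelParameters gpuParams = new ModelParameters()
                    .setModelFilePath(modelPath)
                    .setNGpuLayers(43); // offload layers to the GPU
            return new LlamaModel(gpuParams);
        } catch (Throwable t) {
            ModelParameters cpuParams = new ModelParameters()
                    .setModelFilePath(modelPath)
                    .setNGpuLayers(0);  // CPU-only fallback
            return new LlamaModel(cpuParams);
        }
    }

    public static void main(String[] args) {
        LlamaModel model = loadModel("/path/to/model.gguf");
        // ... run inference with the model ...
        model.close();
    }
}
```

Is something like this the intended way to do it, or does the CUDA jar already fall back to CPU on its own when no GPU is present?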