BatchLama lets you talk to LLMs locally via Ollama straight from Batch! Example:

```
BatchLama.exe "deepseek-r1:7b, Your answer mustn't be more than 50 words, When was GitHub created"
```

Pass the model name first, then the system prompt, and then your user prompt.
Important

Never use "," inside the model name or either prompt, because the C# source splits the whole argument on commas:
```csharp
var input = args[0];
// The single argument string is split on commas, so any extra comma breaks parsing
var words = input.Split(',');
try
{
    _modelName = words[0];
    _systemPrompt = words[1];
    _userPrompt = words[2];
}
catch
{
    Console.WriteLine("Incorrect syntax or amount of arguments!");
    return;
}
```

This project is licensed under the MIT License. See LICENSE for full terms.
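To see why a stray comma breaks parsing, here is a minimal Python sketch that mirrors the comma-split logic from the C# source above (the `parse` helper is hypothetical, for illustration only; it is not part of BatchLama):

```python
def parse(arg: str):
    """Python sketch of BatchLama's comma-split argument parsing."""
    words = arg.split(',')
    if len(words) < 3:
        raise ValueError("Incorrect syntax or amount of arguments!")
    # Only the first three fields are used; anything after an extra comma is lost
    return words[0], words[1], words[2]

model, system_prompt, user_prompt = parse(
    "deepseek-r1:7b, Answer briefly, When was GitHub created")
print(user_prompt)  # prints " When was GitHub created"

# A comma inside the prompt silently truncates it:
_, _, truncated = parse(
    "deepseek-r1:7b, Answer briefly, First, when was GitHub created")
print(truncated)  # prints " First"; the rest of the question is dropped
```

Note that the fields keep their leading spaces, and a comma inside the prompt shifts everything after it out of `words[2]`, so the model only sees a fragment of your question.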
HardCodeDev
💬 Got feedback, found a bug, or want to contribute? Open an issue or fork the repo on GitHub!