Krazal/nppopenai

Does it support the Ollama service?

Leroy-X opened this issue · 3 comments

Does this plugin support Ollama?

I tried changing the URL to http://localhost:11434/v1/chat/completions, but it didn't work.

What should I do? Thank you very much.

Hi!

Thank you for your question and your patience!

Unfortunately, the NppOpenAI plugin doesn't support Ollama directly, but you can use a PHP proxy that accepts NppOpenAI's OpenAI-style requests and translates them to and from Ollama's native API. For example, you can create an ollama_proxy.php:

<?php
header('Content-Type: application/json; charset=utf-8');

// Set up some variables
$ollama_url = 'http://localhost:11434/api/generate'; # Ollama's native endpoint -- the payload below uses its 'prompt'/'system' format, not chat/completions
$postfields = [ # 'prompt' and (optional) 'system' will be added later
	'model'   => 'latest', # PLEASE SET YOUR MODEL HERE! (e.g. 'llama2', 'orca-mini:3b-q4_1', 'llama2:13b')
	'stream'  => false,
	'options' => [
		'temperature' => 0.7, # Default -- will be overwritten by NppOpenAI.ini
		'top_k'       => 40,  # Default value
		'top_p'       => 0.9, # Default -- will be overwritten by NppOpenAI.ini
		// For additional options see: https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values
	],
];

// Let's get started
try {

	// Check PHP Input
	$input = json_decode(file_get_contents('php://input'), true);
	if (!$input)
	{
		throw new \Exception("Non-JSON input received");
	}

	// Check system/prompt related elements
	$is_system_message = (($input['messages'][0]['role'] ?? '') === 'system');
	if (!isset($input['messages'][0]['content']))
	{
		throw new \Exception("No message received");
	}
	if ($is_system_message && !isset($input['messages'][1]['content']))
	{
		throw new \Exception("No message received, only instructions (system message)");
	}

	// Add system message
	if ($is_system_message)
	{
		$postfields['system'] = $input['messages'][0]['content'];
	}

	// Add prompt
	$postfields['prompt'] = !$is_system_message
		? $input['messages'][0]['content']
		: $input['messages'][1]['content'];

	// Update some options, if possible
	// $postfields['model']               = $input['model']       ?? $postfields['model']; # Use the model above to support system messages
	$postfields['options']['temperature'] = $input['temperature'] ?? $postfields['options']['temperature'];
	$postfields['options']['top_p']       = $input['top_p']       ?? $postfields['options']['top_p'];

	// Call Ollama
	$ch = curl_init();
	curl_setopt($ch, CURLOPT_URL, $ollama_url);
	curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 0); # OK on localhost
	curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0); # OK on localhost
	curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1); # It may come in handy
	curl_setopt($ch, CURLOPT_MAXREDIRS, 10);     # It may come in handy
	curl_setopt($ch, CURLOPT_TIMEOUT, 60);       # Increase if necessary
	curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); # Required for output
	curl_setopt($ch, CURLOPT_POST, 1);
	curl_setopt($ch, CURLOPT_HTTPHEADER, ['Content-Type: application/json']); # We send JSON
	curl_setopt($ch, CURLOPT_POSTFIELDS, json_encode($postfields));
	$curl_data  = curl_exec($ch);
	$curl_errno = curl_errno($ch);
	if ($curl_errno)
	{
		$curl_err = curl_error($ch) ?: $curl_errno;
		curl_close($ch);
		throw new \Exception("CURL error: {$curl_err}");
	}
	curl_close($ch); # Close the handle on the success path, too

	// Handle Ollama's response
	$response = json_decode($curl_data, true);
	if (!$response)
	{
		throw new \Exception("Non-JSON response received: {$curl_data}");
	}
	if (!isset($response['response']))
	{
		throw new \Exception("Missing response; Ollama's answer: " . print_r($response, true));
	}

	// Convert/Print output
	$output = [
		'usage'   => [
			'total_tokens' => (int)($response['prompt_eval_count'] ?? 0) + (int)($response['eval_count'] ?? 0), # Total token usage
		],
		'choices' => [ [
			'message' => [
				'role'    => 'assistant',
				'content' => $response['response'],
			],
			'finish_reason' => 'stop', # Belongs on the choice, not inside 'message', in the OpenAI format
		] ],
	];
	echo json_encode($output);

// Handle errors
} catch (Exception $e) {
	echo json_encode([
		'error' => [
			'message' => $e->getMessage(),
		],
	]);
}

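Before pointing NppOpenAI at the proxy, you can sanity-check it on its own. Below is a minimal test sketch; the file name test_proxy.php and the localhost URL are just assumptions for this example, so adjust them to your setup. It posts the same OpenAI-style payload the plugin would send:

<?php
// test_proxy.php -- hypothetical helper for testing ollama_proxy.php manually
$payload = json_encode([
	'model'       => 'gpt-4', # Ignored by the proxy; the real model is set in ollama_proxy.php
	'temperature' => 0.7,
	'top_p'       => 0.9,
	'messages'    => [
		['role' => 'system', 'content' => 'You are a helpful assistant.'],
		['role' => 'user',   'content' => 'Say hello in five words.'],
	],
]);

$ch = curl_init('http://localhost/ollama_proxy.php?endpoint=chat/completions');
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
echo curl_exec($ch), PHP_EOL; # Expect JSON with choices[0]['message']['content'] on success
curl_close($ch);

Run it with php test_proxy.php: a JSON object with an 'error' key means either the proxy or Ollama rejected the request, and the message should say which.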

You may install WampServer and save the script above as C:\wamp64\www\ollama_proxy.php, or to a similar directory.
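
If you'd rather not install a full server stack, PHP's built-in development server can also serve the script, assuming the PHP CLI with the curl extension is available (the directory below is just a placeholder):

cd C:\path\to\script
php -S localhost:80

If port 80 is already taken, pick another one (e.g. php -S localhost:8080) and adjust the api_url below accordingly.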

Then update the API URL in NppOpenAI.ini (Plugins » NppOpenAI » Edit Config). The trailing ?endpoint= is intentional: the endpoint path the plugin appends ends up in a harmless query string instead of breaking the path. For example:

api_url=http://localhost/ollama_proxy.php?endpoint=
temperature=0.7
top_p=0.8

Please leave the model unchanged in the INI file (e.g. model=gpt-4), and set the actual Ollama model in ollama_proxy.php instead (look for the 'latest' placeholder).
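
For example, if you have pulled a model with ollama pull llama2, the model line in ollama_proxy.php would become:

	'model'   => 'llama2', # instead of the 'latest' placeholder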

After saving NppOpenAI.ini, please don't forget to click Plugins » NppOpenAI » Load Config.

The script above should work, but it has been tested in a simulated environment only, without Ollama. If you have any questions, feel free to write a new comment.

Thank you very much for your reply!