Ollama support in AI Positron Enterprise
Posted: Fri Sep 06, 2024 3:13 pm
Hi,
My company has set up an Ollama server that hosts several LLMs, e.g. Mistral. Having learned that Ollama is partially compatible with the OpenAI API, I tried to connect to the Ollama server using the OpenAI connector in the AI Positron Enterprise add-on. Although the authorization succeeds, every action I try to perform with the add-on returns HTTP 404.
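
For reference, below is a rough sketch of the kind of request I would expect the connector to send. The host name and model name are placeholders for our internal server and one of the models it serves, and I am assuming Ollama's OpenAI-compatible routes are exposed under /v1, which makes me wonder whether the 404 comes from the base URL the add-on is using.

# Sketch only: "ollama.example.com" and "mistral" are placeholders for our
# internal host and one of the models served by Ollama.
from openai import OpenAI

client = OpenAI(
    base_url="http://ollama.example.com:11434/v1",  # Ollama's OpenAI-compatible routes live under /v1
    api_key="ollama",  # Ollama ignores the key, but the client needs a non-empty value
)

response = client.chat.completions.create(
    model="mistral",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(response.choices[0].message.content)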
However, in his response to this post, feature-request/topic26840.html, Radu mentioned that it is possible to use the AI Positron Enterprise add-on with Ollama/Mistral.
So my questions are:
Is it possible to connect to an Ollama server (not running locally on localhost) with the OpenAI connector in AI Positron Enterprise? If so, what is the required configuration, and is it possible to change the LLM used by the add-on (as I mentioned, the Ollama server hosts several LLMs)?
And if it is not possible to use Ollama with the OpenAI connector, are you planning to support Ollama or other open-source AI solutions in the future?
Thank you in advance.
Magda