Mistral support in AI Positron

Posted: Fri Mar 20, 2026 4:55 pm
by pb
Hi,
I'm experimenting with AI coding assistance. I notice all the models that AI positron natively supports seem to be models from the usual US Big Tech companies. But I really don't like bringing my money to companies that are far too powerful already. I'd very much appreciate if you also made available other (European) options, such as Mistral.
Thank you for considering this.
Peter

Re: Mistral support in AI Positron

Posted: Fri Mar 20, 2026 7:01 pm
by Radu
Hello Peter,

AI Positron can connect to an LLM in two ways:

1) Using our AI Positron Service, it can connect to the Claude, GPT, and Google Gemini engines:
https://www.oxygenxml.com/doc/ug-positr ... b3_kmk_cxb
2) You can define a connector in the Preferences -> "Plugins / Oxygen AI Positron / AI Service Configuration" page. Using a connector type like "OpenAI", you can connect to an engine such as Ollama, which can run Mistral, Qwen, or some other free LLM:
https://www.oxygenxml.com/doc/ug-positr ... br_ch3_fhc
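As a rough illustration of option 2, an "OpenAI"-type connector simply talks to Ollama's OpenAI-compatible HTTP API. The sketch below builds the kind of chat-completion payload such a connector would send; the base URL, port, and model name are assumptions for illustration and should be adjusted to your own Ollama setup.

```python
import json

# Ollama exposes an OpenAI-compatible API, by default on port 11434.
# This base URL is an assumption; adjust it to your own installation.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for the connector."""
    return {
        "model": model,  # e.g. a Mistral model pulled into Ollama
        "messages": [
            {"role": "user", "content": prompt},
        ],
        "stream": False,
    }

# Hypothetical example: ask a locally running Mistral model a question.
payload = build_chat_request("mistral", "Summarize this DITA topic.")
print(json.dumps(payload, indent=2))
```

The connector would POST this JSON to `OLLAMA_BASE_URL + "/chat/completions"`; the point is that any server speaking the OpenAI chat API shape can sit behind the "OpenAI" connector type.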

When I tested AI Positron with Mistral Large 2.1 some time ago, I did so by connecting AI Positron to an Ollama server running locally on my machine. From what I noted back then, Mistral Large 2.1 did not have vision support (so it could not interpret attached images), it had a smaller 128k-token context window, and it was not that great for vibe writing, that is, for expressing an intention in the chat window and then having the LLM use tools to take it to completion. Still, Mistral 2.1 was a big step forward compared to an older Mistral LLM I had tested before it. I see that a Mistral 3 is now available; I have not tested it, but I assume it will be better.